Mar 09 12:58:44 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 09 12:58:44 crc restorecon[4692]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 
12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 12:58:45 crc 
restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 
12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 
12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc 
restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 12:58:45 crc restorecon[4692]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 12:58:45 crc restorecon[4692]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 09 12:58:46 crc kubenswrapper[4723]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 12:58:46 crc kubenswrapper[4723]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 09 12:58:46 crc kubenswrapper[4723]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 12:58:46 crc kubenswrapper[4723]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 09 12:58:46 crc kubenswrapper[4723]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 09 12:58:46 crc kubenswrapper[4723]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.604643 4723 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620012 4723 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620072 4723 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620084 4723 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620095 4723 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620105 4723 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620115 4723 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620126 4723 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620136 4723 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620152 4723 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620164 4723 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620175 4723 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620187 4723 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620197 4723 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620209 4723 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620220 4723 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620233 4723 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620247 4723 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620261 4723 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620273 4723 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620286 4723 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620298 4723 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620309 4723 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620319 4723 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620330 4723 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620339 4723 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620349 4723 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620359 4723 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620369 4723 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620379 4723 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620389 4723 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620401 4723 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620411 4723 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620421 4723 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620430 4723 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620442 4723 feature_gate.go:330] unrecognized feature gate: Example Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620451 4723 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620461 4723 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620471 4723 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620481 4723 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620491 4723 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620501 4723 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620511 4723 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 12:58:46 crc 
kubenswrapper[4723]: W0309 12:58:46.620520 4723 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620531 4723 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620541 4723 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620551 4723 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620561 4723 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620572 4723 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620583 4723 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620593 4723 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620603 4723 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620617 4723 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620631 4723 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620642 4723 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620651 4723 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620661 4723 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620671 4723 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620680 4723 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620690 4723 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620700 4723 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620712 4723 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620721 4723 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620731 4723 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620740 4723 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620749 4723 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620758 4723 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620768 4723 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 
12:58:46.620779 4723 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620788 4723 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620798 4723 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.620807 4723 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621104 4723 flags.go:64] FLAG: --address="0.0.0.0"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621137 4723 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621160 4723 flags.go:64] FLAG: --anonymous-auth="true"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621176 4723 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621192 4723 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621203 4723 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621217 4723 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621230 4723 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621240 4723 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621251 4723 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621263 4723 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621276 4723 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621287 4723 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621300 4723 flags.go:64] FLAG: --cgroup-root=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621310 4723 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621320 4723 flags.go:64] FLAG: --client-ca-file=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621331 4723 flags.go:64] FLAG: --cloud-config=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621341 4723 flags.go:64] FLAG: --cloud-provider=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621353 4723 flags.go:64] FLAG: --cluster-dns="[]"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621366 4723 flags.go:64] FLAG: --cluster-domain=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621376 4723 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621386 4723 flags.go:64] FLAG: --config-dir=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621397 4723 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621407 4723 flags.go:64] FLAG: --container-log-max-files="5"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621421 4723 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621431 4723 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621441 4723 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621451 4723 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621460 4723 flags.go:64] FLAG: --contention-profiling="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621470 4723 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621480 4723 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621491 4723 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621500 4723 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621513 4723 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621523 4723 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621533 4723 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621543 4723 flags.go:64] FLAG: --enable-load-reader="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621552 4723 flags.go:64] FLAG: --enable-server="true"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621562 4723 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621576 4723 flags.go:64] FLAG: --event-burst="100"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621588 4723 flags.go:64] FLAG: --event-qps="50"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621599 4723 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621609 4723 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621618 4723 flags.go:64] FLAG: --eviction-hard=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621631 4723 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621646 4723 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621657 4723 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621670 4723 flags.go:64] FLAG: --eviction-soft=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621682 4723 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621692 4723 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621704 4723 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621715 4723 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621726 4723 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621735 4723 flags.go:64] FLAG: --fail-swap-on="true"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621745 4723 flags.go:64] FLAG: --feature-gates=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621757 4723 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621767 4723 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621777 4723 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621787 4723 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621797 4723 flags.go:64] FLAG: --healthz-port="10248"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621807 4723 flags.go:64] FLAG: --help="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621817 4723 flags.go:64] FLAG: --hostname-override=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621834 4723 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621844 4723 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621854 4723 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621900 4723 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621912 4723 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621923 4723 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621933 4723 flags.go:64] FLAG: --image-service-endpoint=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621943 4723 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621953 4723 flags.go:64] FLAG: --kube-api-burst="100"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621963 4723 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621973 4723 flags.go:64] FLAG: --kube-api-qps="50"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621983 4723 flags.go:64] FLAG: --kube-reserved=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.621993 4723 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622002 4723 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622013 4723 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622023 4723 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622033 4723 flags.go:64] FLAG: --lock-file=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622043 4723 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622053 4723 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622065 4723 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622093 4723 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622105 4723 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622115 4723 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622124 4723 flags.go:64] FLAG: --logging-format="text"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622135 4723 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622146 4723 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622156 4723 flags.go:64] FLAG: --manifest-url=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622167 4723 flags.go:64] FLAG: --manifest-url-header=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622179 4723 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622189 4723 flags.go:64] FLAG: --max-open-files="1000000"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622202 4723 flags.go:64] FLAG: --max-pods="110"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622211 4723 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622225 4723 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622236 4723 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622246 4723 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622258 4723 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622268 4723 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622279 4723 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622318 4723 flags.go:64] FLAG: --node-status-max-images="50"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622329 4723 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622341 4723 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622351 4723 flags.go:64] FLAG: --pod-cidr=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622362 4723 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622381 4723 flags.go:64] FLAG: --pod-manifest-path=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622391 4723 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622403 4723 flags.go:64] FLAG: --pods-per-core="0"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622414 4723 flags.go:64] FLAG: --port="10250"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622425 4723 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622436 4723 flags.go:64] FLAG: --provider-id=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622447 4723 flags.go:64] FLAG: --qos-reserved=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622458 4723 flags.go:64] FLAG: --read-only-port="10255"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622470 4723 flags.go:64] FLAG: --register-node="true"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622481 4723 flags.go:64] FLAG: --register-schedulable="true"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622492 4723 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622511 4723 flags.go:64] FLAG: --registry-burst="10"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622522 4723 flags.go:64] FLAG: --registry-qps="5"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622532 4723 flags.go:64] FLAG: --reserved-cpus=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622544 4723 flags.go:64] FLAG: --reserved-memory=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622558 4723 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622570 4723 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622582 4723 flags.go:64] FLAG: --rotate-certificates="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622593 4723 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622604 4723 flags.go:64] FLAG: --runonce="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622615 4723 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622634 4723 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622646 4723 flags.go:64] FLAG: --seccomp-default="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622658 4723 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622669 4723 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622680 4723 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622691 4723 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622702 4723 flags.go:64] FLAG: --storage-driver-password="root"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622714 4723 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622725 4723 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622736 4723 flags.go:64] FLAG: --storage-driver-user="root"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622747 4723 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622757 4723 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622768 4723 flags.go:64] FLAG: --system-cgroups=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622779 4723 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622798 4723 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622808 4723 flags.go:64] FLAG: --tls-cert-file=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622820 4723 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622834 4723 flags.go:64] FLAG: --tls-min-version=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622845 4723 flags.go:64] FLAG: --tls-private-key-file=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622855 4723 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622907 4723 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622918 4723 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622928 4723 flags.go:64] FLAG: --v="2"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622943 4723 flags.go:64] FLAG: --version="false"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622957 4723 flags.go:64] FLAG: --vmodule=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622969 4723 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.622981 4723 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623242 4723 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623256 4723 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623266 4723 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623275 4723 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623283 4723 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623296 4723 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623304 4723 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623313 4723 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623324 4723 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
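Every flags.go:64 record in the dump above has the same FLAG: --name="value" shape, so the whole effective command line parses mechanically into a dictionary; that is handy for diffing what actually changed between two boots. A sketch under the same assumptions as before (hypothetical file names; one record per line):

    import re

    FLAG_RE = re.compile(r'FLAG: (--[\w-]+)="(.*)"')

    def parse_flags(path):
        # Map each 'FLAG: --name="value"' record to {"--name": "value"}.
        flags = {}
        with open(path) as f:
            for line in f:
                m = FLAG_RE.search(line)
                if m:
                    flags[m.group(1)] = m.group(2)
        return flags

    # Hypothetical usage: show flags whose values differ between two boots.
    prev, curr = parse_flags("boot-prev.log"), parse_flags("boot-curr.log")
    for name in sorted(prev.keys() | curr.keys()):
        if prev.get(name) != curr.get(name):
            print(name, prev.get(name), "->", curr.get(name))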
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623335 4723 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623345 4723 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623354 4723 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623362 4723 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623370 4723 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623379 4723 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623386 4723 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623395 4723 feature_gate.go:330] unrecognized feature gate: Example Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623403 4723 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623410 4723 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623418 4723 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623426 4723 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623434 4723 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623443 4723 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623450 4723 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623458 4723 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623466 4723 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623474 4723 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623482 4723 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623489 4723 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623497 4723 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623505 4723 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623512 4723 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623520 4723 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623528 4723 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623536 4723 feature_gate.go:330] 
unrecognized feature gate: VSphereMultiNetworks Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623543 4723 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623551 4723 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623564 4723 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623574 4723 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623582 4723 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623591 4723 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623600 4723 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623607 4723 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623615 4723 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623623 4723 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623630 4723 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623638 4723 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623646 4723 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623654 4723 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623662 4723 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623671 4723 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623679 4723 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623689 4723 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623704 4723 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623714 4723 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623723 4723 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623731 4723 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623739 4723 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623747 4723 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623755 4723 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623763 4723 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623771 4723 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623779 4723 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623786 4723 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623794 4723 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623801 4723 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623809 4723 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623817 4723 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623825 4723 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623835 4723 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.623843 4723 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.623893 4723 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.636433 4723 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.636476 4723 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636609 4723 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636624 4723 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636635 4723 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 09 12:58:46 crc
kubenswrapper[4723]: W0309 12:58:46.636645 4723 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636654 4723 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636663 4723 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636672 4723 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636680 4723 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636688 4723 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636698 4723 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636707 4723 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636715 4723 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636723 4723 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636733 4723 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636741 4723 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636750 4723 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636760 4723 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636768 4723 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636777 4723 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636786 4723 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636797 4723 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636806 4723 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636815 4723 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636824 4723 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636834 4723 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636843 4723 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636853 4723 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636895 4723 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636907 4723 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636918 4723 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636929 4723 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636940 4723 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636950 4723 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636959 4723 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636968 4723 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636976 4723 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636985 4723 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.636994 4723 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637002 4723 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637010 4723 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637022 4723 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637033 4723 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637043 4723 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637051 4723 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637061 4723 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637070 4723 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637079 4723 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637087 4723 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637096 4723 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637107 4723 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637117 4723 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637126 4723 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637135 4723 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637143 4723 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637152 4723 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637160 4723 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637169 4723 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637178 4723 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637186 4723 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637195 4723 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637203 4723 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637211 4723 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637220 4723 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637228 4723 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637240 4723 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637251 4723 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637261 4723 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637271 4723 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637279 4723 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637288 4723 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637297 4723 feature_gate.go:330] unrecognized feature gate: Example
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.637311 4723 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637540 4723 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637554 4723 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637565 4723 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637574 4723 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637583 4723 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637593 4723 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637601 4723 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637610 4723 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637618 4723 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637626 4723 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637635 4723 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637643 4723 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637652 4723 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637661 4723 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637669 4723 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637678 4723 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
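The feature_gate.go:386 summary in the pass above is the net result of all the warnings: unrecognized (OpenShift-side) gates are dropped, and only the Kubernetes-known gates survive into the printed Go map. That map literal converts to a dict with two splits; a sketch, with the map shortened for the example:

    import re

    def parse_gate_summary(record):
        # feature_gate.go:386 prints 'feature gates: {map[Name:bool ...]}'.
        m = re.search(r"feature gates: \{map\[(.*)\]\}", record)
        if not m:
            return {}
        return {name: value == "true"
                for name, value in (pair.split(":") for pair in m.group(1).split())}

    record = "feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]}"
    print(parse_gate_summary(record))
    # {'CloudDualStackNodeIPs': True, 'KMSv1': True, 'NodeSwap': False}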
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637687 4723 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637727 4723 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637739 4723 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637748 4723 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637758 4723 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637767 4723 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637775 4723 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637783 4723 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637793 4723 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637801 4723 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637812 4723 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637820 4723 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637829 4723 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637841 4723 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637854 4723 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637889 4723 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637899 4723 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637908 4723 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637918 4723 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637927 4723 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637936 4723 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637946 4723 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637955 4723 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637966 4723 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
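Note that the warning set itself is not growing: the same gates recur with only the microsecond stamps advancing, because the kubelet evidently re-parses its gate configuration several times during startup (each pass ends in an identical feature_gate.go:386 summary). Counting warnings per gate makes the repetition obvious; a sketch against the same hypothetical kubelet.log:

    import re
    from collections import Counter

    # Tally "unrecognized feature gate" warnings; near-uniform counts across
    # gates indicate the identical set is simply logged once per parsing pass.
    counts = Counter()
    with open("kubelet.log") as f:
        for line in f:
            counts.update(re.findall(r"unrecognized feature gate: (\w+)", line))
    for gate, n in counts.most_common():
        print(f"{n}x {gate}")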
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637977 4723 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637987 4723 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.637996 4723 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638006 4723 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638014 4723 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638023 4723 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638032 4723 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638042 4723 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638053 4723 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638061 4723 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638070 4723 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638079 4723 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638089 4723 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638097 4723 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638106 4723 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638114 4723 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638126 4723 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638135 4723 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638143 4723 feature_gate.go:330] unrecognized feature gate: Example Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638151 4723 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638160 4723 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638169 4723 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638177 4723 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638186 4723 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638194 4723 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638203 4723 
feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638212 4723 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638224 4723 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638234 4723 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638243 4723 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.638252 4723 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.638264 4723 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.638535 4723 server.go:940] "Client rotation is on, will bootstrap in background" Mar 09 12:58:46 crc kubenswrapper[4723]: E0309 12:58:46.643116 4723 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.647927 4723 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.648069 4723 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.649932 4723 server.go:997] "Starting client certificate rotation"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.649979 4723 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.651079 4723 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.677700 4723 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 09 12:58:46 crc kubenswrapper[4723]: E0309 12:58:46.680991 4723 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.681156 4723 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.699313 4723 log.go:25] "Validated CRI v1 runtime API"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.735452 4723 log.go:25] "Validated CRI v1 image API"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.737831 4723 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.743645 4723 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-09-12-54-18-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.743700 4723 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.772705 4723 manager.go:217] Machine: {Timestamp:2026-03-09 12:58:46.768263027 +0000 UTC m=+0.782730617 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:548816a4-d164-4614-b3ad-c3e4f48e94b9 BootID:7a1a0b61-b39c-449b-8223-316b94b8c26c Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:52:a6:e2 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:52:a6:e2 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ef:25:59 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:11:29:57 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:39:f8:8e Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:34:01:e8 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ce:5a:74:73:96:21 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:56:62:34:02:ec:1c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.773262 4723 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.773784 4723 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.774694 4723 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.775081 4723 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.775159 4723 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.775652 4723 topology_manager.go:138] "Creating topology manager with none policy"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.775677 4723 container_manager_linux.go:303] "Creating device plugin manager"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.776283 4723 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.776354 4723 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.776695 4723 state_mem.go:36] "Initialized new in-memory state store"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.776888 4723 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.780551 4723 kubelet.go:418] "Attempting to sync node with API server"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.780792 4723 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.780850 4723 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.780906 4723 kubelet.go:324] "Adding apiserver pod source"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.780927 4723 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.786990 4723 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.788270 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Mar 09 12:58:46 crc kubenswrapper[4723]: E0309 12:58:46.788401 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.788260 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.788496 4723 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
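The container_manager_linux.go:272 entry above embeds the kubelet's effective node configuration as one JSON blob: system reservations (cpu 200m, memory 350Mi, ephemeral-storage 350Mi), five hard-eviction thresholds, a pod PID limit of 4096, and cgroup v2 driven through systemd. A small sketch for making that blob readable when working from a saved journal dump; the kubelet.log file name (e.g. from journalctl -u kubelet > kubelet.log) is an assumption about how the log was captured:

    # pretty_nodeconfig.py -- illustrative log-analysis helper
    import json

    with open("kubelet.log", encoding="utf-8") as f:
        entry = next(line for line in f if "nodeConfig=" in line)

    # raw_decode parses the leading JSON object and tolerates trailing text
    cfg, _ = json.JSONDecoder().raw_decode(entry.split("nodeConfig=", 1)[1])
    print(json.dumps(
        {k: cfg[k] for k in ("SystemReserved", "HardEvictionThresholds")},
        indent=2,
    ))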
Mar 09 12:58:46 crc kubenswrapper[4723]: E0309 12:58:46.788496 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.790853 4723 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.792915 4723 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.792975 4723 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.792999 4723 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.793019 4723 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.793053 4723 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.793074 4723 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.793100 4723 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.793134 4723 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.793156 4723 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.793176 4723 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.793235 4723 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.793256 4723 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.794692 4723 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.795812 4723 server.go:1280] "Started kubelet"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.800027 4723 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.800253 4723 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.801191 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Mar 09 12:58:46 crc systemd[1]: Started Kubernetes Kubelet.
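By this point the kubelet itself is up (server.go:1280 "Started kubelet", listening on 0.0.0.0:10250), but every call to the control plane, CSR creation, CSINode lookup, informer lists, fails with the same dial error: api-int.crc.testing resolves to 38.102.83.129 and nothing answers on 6443. On a single-node CRC cluster that is expected during early boot, since the kube-apiserver is itself one of the static pods this kubelet has just been told to watch under /etc/kubernetes/manifests. A stdlib-only sketch of the same reachability probe, with host and port taken from the log:

    # probe_apiserver.py -- illustrative connectivity check
    import socket

    HOST, PORT = "api-int.crc.testing", 6443
    try:
        with socket.create_connection((HOST, PORT), timeout=3):
            print(f"{HOST}:{PORT} accepting connections")
    except OSError as exc:  # covers ConnectionRefusedError, timeouts, DNS errors
        print(f"{HOST}:{PORT} unreachable: {exc}")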
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.803717 4723 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.805462 4723 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.805527 4723 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.805974 4723 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.806008 4723 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 09 12:58:46 crc kubenswrapper[4723]: E0309 12:58:46.813466 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.813765 4723 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 09 12:58:46 crc kubenswrapper[4723]: E0309 12:58:46.814199 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="200ms"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.814260 4723 factory.go:55] Registering systemd factory
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.814301 4723 factory.go:221] Registration of the systemd container factory successfully
Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.814477 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Mar 09 12:58:46 crc kubenswrapper[4723]: E0309 12:58:46.814591 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.815256 4723 factory.go:153] Registering CRI-O factory
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.815354 4723 factory.go:221] Registration of the crio container factory successfully
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.815671 4723 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.815770 4723 factory.go:103] Registering Raw factory
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.815807 4723 manager.go:1196] Started watching for new ooms in manager
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.816117 4723 server.go:460] "Adding debug handlers to kubelet server"
Mar 09 12:58:46 crc kubenswrapper[4723]: E0309 12:58:46.815099 4723 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.129:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b2db363b6c60e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,LastTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.818037 4723 manager.go:319] Starting recovery of all containers
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.829943 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830008 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830018 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830030 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830042 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830053 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830065 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830074 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830086 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830096 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830106 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830118 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830134 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830144 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830152 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830161 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830171 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830179 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830187 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830195 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830207 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830214 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830223 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830231 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830239 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830248 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830258 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830268 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830293 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830322 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830333 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830343 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830355 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830364 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830373 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830383 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830393 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830402 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830410 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830419 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830427 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830436 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830444 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830452 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830460 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830468 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830477 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830485 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830494 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830502 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830511 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830520 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830534 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830544 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830554 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.830565 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832449 4723 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832470 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832481 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832492 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832503 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832512 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832521 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832530 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832539 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832548 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832558 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832567 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832577 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832587 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832596 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832605 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832613 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832621 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832630 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832638 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832647 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832656 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832666 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832675 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832685 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832693 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832701 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832711 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832719 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832728 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832736 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832744 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832752 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832765 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832773 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832780 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832788 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832797 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832807 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832815 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832824 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832833 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832842 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832851 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832876 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832884 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832893 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832902 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832911 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832951 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832961 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832972 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832981 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.832991 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833001 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833010 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833031 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833040 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833049 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833058 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833069 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833078 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833086 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833094 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833103 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833111 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833121 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833129 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833137 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833145 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833155 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833164 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833178 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833186 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833194 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833202 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833212 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833220 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833229 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833237 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833245 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833254 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833262 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833272 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833280 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a"
volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833288 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833297 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833306 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833317 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833327 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833337 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833346 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833356 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833364 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833373 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833382 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833391 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833400 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833409 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833418 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833427 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833435 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833444 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833456 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833465 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833474 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833484 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833493 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833501 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833510 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833518 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833526 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833535 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833544 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833552 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833562 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833570 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833578 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833587 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833595 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833604 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833612 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833622 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833630 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833641 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833654 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833677 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833693 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833707 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833718 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833728 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833739 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833748 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833758 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833767 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833777 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833787 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833798 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833806 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833815 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833825 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833835 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833844 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833853 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833879 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833888 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833897 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833912 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833922 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833932 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833944 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833953 4723 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833962 4723 reconstruct.go:97] "Volume reconstruction finished" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.833969 4723 reconciler.go:26] "Reconciler: start to sync state" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.852681 4723 manager.go:324] Recovery completed Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.871612 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.873493 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.873541 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.873554 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.873485 4723 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.875014 4723 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.875043 4723 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.875062 4723 state_mem.go:36] "Initialized new in-memory state store" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.876618 4723 policy_none.go:49] "None policy: Start" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.879023 4723 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.879578 4723 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.879627 4723 kubelet.go:2335] "Starting kubelet main sync loop" Mar 09 12:58:46 crc kubenswrapper[4723]: E0309 12:58:46.879712 4723 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 09 12:58:46 crc kubenswrapper[4723]: W0309 12:58:46.881293 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Mar 09 12:58:46 crc kubenswrapper[4723]: E0309 12:58:46.881396 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.881912 4723 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.881954 4723 state_mem.go:35] "Initializing new in-memory state store" Mar 09 12:58:46 crc kubenswrapper[4723]: E0309 12:58:46.914207 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.944799 4723 manager.go:334] "Starting Device Plugin manager" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.945076 4723 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.945092 4723 server.go:79] "Starting device plugin registration server" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.945452 4723 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.945475 4723 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.945876 4723 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.945943 4723 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.945951 4723 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 09 12:58:46 crc kubenswrapper[4723]: E0309 12:58:46.953756 4723 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.980282 4723 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.980469 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.981989 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.982031 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.982045 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.982194 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.982352 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.982400 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.982822 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.982852 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.982888 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.983044 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.983131 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.983156 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.983324 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.983345 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.983359 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.983709 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.983730 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.983740 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.983840 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.983939 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.983960 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.983968 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.984197 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.984227 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.984626 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.984649 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.984658 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.984735 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.984925 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.984981 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.984996 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.984979 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.985033 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.985528 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.985546 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.985556 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.985660 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.985674 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.985682 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.985783 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.985800 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.986296 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.986312 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:46 crc kubenswrapper[4723]: I0309 12:58:46.986321 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:47 crc kubenswrapper[4723]: E0309 12:58:47.015378 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="400ms" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.036974 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.037131 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.037247 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.037337 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.037437 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.037525 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.037632 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.037726 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.037756 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.037773 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.037789 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.037808 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.037825 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.037842 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.037872 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.046245 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.047244 4723 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.047281 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.047291 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.047313 4723 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 12:58:47 crc kubenswrapper[4723]: E0309 12:58:47.047656 4723 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.139248 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.139541 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.139673 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.139824 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.139998 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140108 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140027 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.139605 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140059 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140123 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140444 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140485 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140507 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.139546 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140563 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140547 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140510 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140605 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140627 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140649 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140667 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140683 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140704 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140736 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140766 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140763 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140783 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: 
I0309 12:58:47.140795 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.140908 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.141425 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.248832 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.250764 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.250819 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.250832 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.250901 4723 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 12:58:47 crc kubenswrapper[4723]: E0309 12:58:47.251344 4723 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.325461 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.332637 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.351769 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.370811 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: W0309 12:58:47.375846 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3e90e25c192fed707ea689513dc4c8e6d94f00a8d69547f3a78033131595759c WatchSource:0}: Error finding container 3e90e25c192fed707ea689513dc4c8e6d94f00a8d69547f3a78033131595759c: Status 404 returned error can't find the container with id 3e90e25c192fed707ea689513dc4c8e6d94f00a8d69547f3a78033131595759c Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.376314 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 12:58:47 crc kubenswrapper[4723]: W0309 12:58:47.379808 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d26a907efa72b897f0652672acc0db292017e2680574fa9c37c7d8d19883a68d WatchSource:0}: Error finding container d26a907efa72b897f0652672acc0db292017e2680574fa9c37c7d8d19883a68d: Status 404 returned error can't find the container with id d26a907efa72b897f0652672acc0db292017e2680574fa9c37c7d8d19883a68d Mar 09 12:58:47 crc kubenswrapper[4723]: W0309 12:58:47.385453 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-5db6bc16e25f1e1c275b94713b12526862b1c7d11bd6b854b3a12df159e14059 WatchSource:0}: Error finding container 5db6bc16e25f1e1c275b94713b12526862b1c7d11bd6b854b3a12df159e14059: Status 404 returned error can't find the container with id 5db6bc16e25f1e1c275b94713b12526862b1c7d11bd6b854b3a12df159e14059 Mar 09 12:58:47 crc kubenswrapper[4723]: W0309 12:58:47.391089 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-361bdf007fbd5ce0ad60795cd27dfa5818c2ca723e046bc1e418982c938e4531 WatchSource:0}: Error finding container 361bdf007fbd5ce0ad60795cd27dfa5818c2ca723e046bc1e418982c938e4531: Status 404 returned error can't find the container with id 361bdf007fbd5ce0ad60795cd27dfa5818c2ca723e046bc1e418982c938e4531 Mar 09 12:58:47 crc kubenswrapper[4723]: E0309 12:58:47.416647 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="800ms" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.652213 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.653421 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.653483 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.653498 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.653527 4723 kubelet_node_status.go:76] "Attempting to 
register node" node="crc" Mar 09 12:58:47 crc kubenswrapper[4723]: E0309 12:58:47.654022 4723 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc" Mar 09 12:58:47 crc kubenswrapper[4723]: W0309 12:58:47.778246 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Mar 09 12:58:47 crc kubenswrapper[4723]: E0309 12:58:47.778350 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.802472 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.887009 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5db6bc16e25f1e1c275b94713b12526862b1c7d11bd6b854b3a12df159e14059"} Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.888579 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d26a907efa72b897f0652672acc0db292017e2680574fa9c37c7d8d19883a68d"} Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.889486 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3e90e25c192fed707ea689513dc4c8e6d94f00a8d69547f3a78033131595759c"} Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.890163 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"10a48c3366a489633374ef60dbe380bb7397b395c2a6c0624499d1dbb7459c24"} Mar 09 12:58:47 crc kubenswrapper[4723]: I0309 12:58:47.891092 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"361bdf007fbd5ce0ad60795cd27dfa5818c2ca723e046bc1e418982c938e4531"} Mar 09 12:58:47 crc kubenswrapper[4723]: W0309 12:58:47.975269 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Mar 09 12:58:47 crc kubenswrapper[4723]: E0309 12:58:47.975359 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Mar 09 12:58:48 crc kubenswrapper[4723]: E0309 12:58:48.180791 4723 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.129:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b2db363b6c60e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,LastTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:58:48 crc kubenswrapper[4723]: E0309 12:58:48.218299 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="1.6s" Mar 09 12:58:48 crc kubenswrapper[4723]: W0309 12:58:48.221183 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Mar 09 12:58:48 crc kubenswrapper[4723]: E0309 12:58:48.221301 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Mar 09 12:58:48 crc kubenswrapper[4723]: W0309 12:58:48.270536 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Mar 09 12:58:48 crc kubenswrapper[4723]: E0309 12:58:48.270658 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.454767 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.456208 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.456250 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.456266 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:48 
crc kubenswrapper[4723]: I0309 12:58:48.456297 4723 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 12:58:48 crc kubenswrapper[4723]: E0309 12:58:48.456805 4723 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.802935 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.853670 4723 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 12:58:48 crc kubenswrapper[4723]: E0309 12:58:48.854999 4723 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.897349 4723 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ae9a5c84863b11797de92c148fc1e17d198adb15c82153b389e844a0be767528" exitCode=0 Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.897448 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ae9a5c84863b11797de92c148fc1e17d198adb15c82153b389e844a0be767528"} Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.897505 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.898720 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.898749 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.898760 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.901168 4723 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165" exitCode=0 Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.901226 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165"} Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.901288 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.902538 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.902561 4723 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.902573 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.903305 4723 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f" exitCode=0 Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.903379 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.903377 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f"} Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.904137 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.904163 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.904173 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.907263 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.907259 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d5b7dd23119bfc1f8a7b5e83f86c9b2cf5251fa2034f279c0fbf10ace885c1fa"} Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.907324 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"53e3a9b881b10c9e76abe1296bbd7e75c6bbf41a7b47a319177614567c22de0a"} Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.907346 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8a85b6fe63998440f0ad1d952a2eaaa3fb53449997bae0986b4fc1c776b90063"} Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.907366 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c9d14cd0710057eb4ca03d616f762c6dbd472b8bf45c04180d00387292000f19"} Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.910641 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.911183 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.911230 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:48 crc 
kubenswrapper[4723]: I0309 12:58:48.912255 4723 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9" exitCode=0 Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.912302 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9"} Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.912403 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.913705 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.913802 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.913818 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.924197 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.925657 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.925701 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:48 crc kubenswrapper[4723]: I0309 12:58:48.925713 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.110165 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.173971 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.513303 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.524190 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:58:49 crc kubenswrapper[4723]: W0309 12:58:49.577936 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Mar 09 12:58:49 crc kubenswrapper[4723]: E0309 12:58:49.578011 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.802287 4723 csi_plugin.go:884] Failed to contact API server 
when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Mar 09 12:58:49 crc kubenswrapper[4723]: E0309 12:58:49.819284 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="3.2s" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.917039 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0777edbfbe04e20d91d97fb2daa6bc7ef67cbf56cdd1fdb20849350a5f0d4f23"} Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.917085 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c342ce2d9033896e1d8f3c5428e8b03220c003d3c746103ac17b5cd564663091"} Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.917104 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7b5ea090dbcca9cb0683626e5137a3e9aea8749a1deb31339e45dc6db3336f1b"} Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.917117 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.917753 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.917775 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.917784 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.919894 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7"} Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.919924 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1"} Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.919940 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140"} Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.919954 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f"} Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.922360 4723 generic.go:334] "Generic (PLEG): container 
finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6e42c2eda289aaca193f96c781b61ac93dfde285fe70db0c26928e4d64f9287e" exitCode=0 Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.922436 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6e42c2eda289aaca193f96c781b61ac93dfde285fe70db0c26928e4d64f9287e"} Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.922630 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.926576 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.926630 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.926653 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.927990 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.928448 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.928722 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849"} Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.928774 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.928813 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.929003 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.929297 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.929327 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:49 crc kubenswrapper[4723]: I0309 12:58:49.929546 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.057922 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.065305 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.065447 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.065514 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.065594 
4723 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 12:58:50 crc kubenswrapper[4723]: E0309 12:58:50.066061 4723 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.937421 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"09375dccb7fa5e4b307e0511c381a2122353e190297f1f3164a8a11bfbe075e8"} Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.937531 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.939228 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.939285 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.939302 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.943505 4723 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c9dc6bb6ef0374b09c8942b16ee6cca8589818d45cb73e34ac23893012e6f51b" exitCode=0 Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.943618 4723 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.943691 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.943722 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.943728 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c9dc6bb6ef0374b09c8942b16ee6cca8589818d45cb73e34ac23893012e6f51b"} Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.943764 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.943839 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.945628 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.945691 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.945714 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.945647 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.945785 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 
12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.945803 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.945845 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.945817 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.945903 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.945913 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.945818 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:50 crc kubenswrapper[4723]: I0309 12:58:50.946114 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:51 crc kubenswrapper[4723]: I0309 12:58:51.683472 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:58:51 crc kubenswrapper[4723]: I0309 12:58:51.950608 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:51 crc kubenswrapper[4723]: I0309 12:58:51.950832 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9c782f280333f2744f1613dba5865a07bc62621bbfb16afbc07b7a2773158158"} Mar 09 12:58:51 crc kubenswrapper[4723]: I0309 12:58:51.950901 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0deb75ea2912aa03cc13add72de571ae022399aa2d93476b6c033743bce24064"} Mar 09 12:58:51 crc kubenswrapper[4723]: I0309 12:58:51.950916 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1a5d6dc0927a17b46df81771efe98ee1d01b5642b9c4479a1178c9e19b51a498"} Mar 09 12:58:51 crc kubenswrapper[4723]: I0309 12:58:51.950934 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"05eaa9e188d05222aa0d112f75c9cdec6deb2da334d9d113fceae9f64c8608a5"} Mar 09 12:58:51 crc kubenswrapper[4723]: I0309 12:58:51.951024 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:58:51 crc kubenswrapper[4723]: I0309 12:58:51.951062 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:51 crc kubenswrapper[4723]: I0309 12:58:51.952090 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:51 crc kubenswrapper[4723]: I0309 12:58:51.952145 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:51 crc kubenswrapper[4723]: I0309 12:58:51.952165 4723 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:51 crc kubenswrapper[4723]: I0309 12:58:51.952177 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:51 crc kubenswrapper[4723]: I0309 12:58:51.952235 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:51 crc kubenswrapper[4723]: I0309 12:58:51.952261 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:52 crc kubenswrapper[4723]: I0309 12:58:52.296097 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:58:52 crc kubenswrapper[4723]: I0309 12:58:52.960471 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:52 crc kubenswrapper[4723]: I0309 12:58:52.961112 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6a819c722cd255915dc8ed291ef3fb85d4c96db713fa891965ea422c77067ac3"} Mar 09 12:58:52 crc kubenswrapper[4723]: I0309 12:58:52.961317 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:52 crc kubenswrapper[4723]: I0309 12:58:52.962625 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:52 crc kubenswrapper[4723]: I0309 12:58:52.962688 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:52 crc kubenswrapper[4723]: I0309 12:58:52.962709 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:52 crc kubenswrapper[4723]: I0309 12:58:52.963043 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:52 crc kubenswrapper[4723]: I0309 12:58:52.963114 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:52 crc kubenswrapper[4723]: I0309 12:58:52.963129 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:53 crc kubenswrapper[4723]: I0309 12:58:53.146329 4723 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 12:58:53 crc kubenswrapper[4723]: I0309 12:58:53.267135 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:53 crc kubenswrapper[4723]: I0309 12:58:53.269223 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:53 crc kubenswrapper[4723]: I0309 12:58:53.269289 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:53 crc kubenswrapper[4723]: I0309 12:58:53.269309 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:53 crc kubenswrapper[4723]: I0309 12:58:53.269352 4723 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 12:58:53 crc kubenswrapper[4723]: I0309 12:58:53.963226 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 09 12:58:53 crc kubenswrapper[4723]: I0309 12:58:53.963263 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:53 crc kubenswrapper[4723]: I0309 12:58:53.964895 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:53 crc kubenswrapper[4723]: I0309 12:58:53.964950 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:53 crc kubenswrapper[4723]: I0309 12:58:53.964973 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:53 crc kubenswrapper[4723]: I0309 12:58:53.965045 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:53 crc kubenswrapper[4723]: I0309 12:58:53.965085 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:53 crc kubenswrapper[4723]: I0309 12:58:53.965096 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:54 crc kubenswrapper[4723]: I0309 12:58:54.684510 4723 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 12:58:54 crc kubenswrapper[4723]: I0309 12:58:54.684605 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 12:58:54 crc kubenswrapper[4723]: I0309 12:58:54.909504 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:58:54 crc kubenswrapper[4723]: I0309 12:58:54.966651 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:54 crc kubenswrapper[4723]: I0309 12:58:54.968328 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:54 crc kubenswrapper[4723]: I0309 12:58:54.968400 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:54 crc kubenswrapper[4723]: I0309 12:58:54.968426 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:55 crc kubenswrapper[4723]: I0309 12:58:55.088667 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 12:58:55 crc kubenswrapper[4723]: I0309 12:58:55.088963 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:55 crc kubenswrapper[4723]: I0309 12:58:55.090475 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:55 crc kubenswrapper[4723]: I0309 12:58:55.090541 4723 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:55 crc kubenswrapper[4723]: I0309 12:58:55.090555 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:55 crc kubenswrapper[4723]: I0309 12:58:55.651719 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 09 12:58:55 crc kubenswrapper[4723]: I0309 12:58:55.652010 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:55 crc kubenswrapper[4723]: I0309 12:58:55.654386 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:55 crc kubenswrapper[4723]: I0309 12:58:55.654441 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:55 crc kubenswrapper[4723]: I0309 12:58:55.654458 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:56 crc kubenswrapper[4723]: I0309 12:58:56.185845 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 09 12:58:56 crc kubenswrapper[4723]: I0309 12:58:56.186155 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:56 crc kubenswrapper[4723]: I0309 12:58:56.187613 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:56 crc kubenswrapper[4723]: I0309 12:58:56.187668 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:56 crc kubenswrapper[4723]: I0309 12:58:56.187689 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:58:56 crc kubenswrapper[4723]: E0309 12:58:56.953914 4723 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 12:58:59 crc kubenswrapper[4723]: I0309 12:58:59.181753 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:58:59 crc kubenswrapper[4723]: I0309 12:58:59.182058 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:58:59 crc kubenswrapper[4723]: I0309 12:58:59.183530 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:58:59 crc kubenswrapper[4723]: I0309 12:58:59.183626 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:58:59 crc kubenswrapper[4723]: I0309 12:58:59.183656 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:00 crc kubenswrapper[4723]: W0309 12:59:00.514121 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 09 12:59:00 crc kubenswrapper[4723]: I0309 12:59:00.514219 4723 trace.go:236] Trace[1382227614]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 
12:58:50.512) (total time: 10001ms): Mar 09 12:59:00 crc kubenswrapper[4723]: Trace[1382227614]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:59:00.514) Mar 09 12:59:00 crc kubenswrapper[4723]: Trace[1382227614]: [10.001357073s] [10.001357073s] END Mar 09 12:59:00 crc kubenswrapper[4723]: E0309 12:59:00.514244 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 09 12:59:00 crc kubenswrapper[4723]: I0309 12:59:00.804195 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 09 12:59:00 crc kubenswrapper[4723]: W0309 12:59:00.830015 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 09 12:59:00 crc kubenswrapper[4723]: I0309 12:59:00.830128 4723 trace.go:236] Trace[1909141520]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 12:58:50.829) (total time: 10001ms): Mar 09 12:59:00 crc kubenswrapper[4723]: Trace[1909141520]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (12:59:00.830) Mar 09 12:59:00 crc kubenswrapper[4723]: Trace[1909141520]: [10.001073911s] [10.001073911s] END Mar 09 12:59:00 crc kubenswrapper[4723]: E0309 12:59:00.830158 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 09 12:59:00 crc kubenswrapper[4723]: I0309 12:59:00.896998 4723 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44782->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 09 12:59:00 crc kubenswrapper[4723]: I0309 12:59:00.897099 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44782->192.168.126.11:17697: read: connection reset by peer" Mar 09 12:59:00 crc kubenswrapper[4723]: W0309 12:59:00.954559 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 09 12:59:00 crc kubenswrapper[4723]: I0309 12:59:00.954668 4723 trace.go:236] Trace[291559994]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 12:58:50.953) (total time: 10001ms): Mar 09 12:59:00 crc kubenswrapper[4723]: Trace[291559994]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (12:59:00.954) Mar 09 12:59:00 crc kubenswrapper[4723]: Trace[291559994]: [10.001064044s] [10.001064044s] END Mar 09 12:59:00 crc kubenswrapper[4723]: E0309 12:59:00.954696 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 09 12:59:00 crc kubenswrapper[4723]: I0309 12:59:00.985493 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 12:59:00 crc kubenswrapper[4723]: I0309 12:59:00.988447 4723 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="09375dccb7fa5e4b307e0511c381a2122353e190297f1f3164a8a11bfbe075e8" exitCode=255 Mar 09 12:59:00 crc kubenswrapper[4723]: I0309 12:59:00.988545 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"09375dccb7fa5e4b307e0511c381a2122353e190297f1f3164a8a11bfbe075e8"} Mar 09 12:59:00 crc kubenswrapper[4723]: I0309 12:59:00.989100 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:00 crc kubenswrapper[4723]: I0309 12:59:00.990849 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:00 crc kubenswrapper[4723]: I0309 12:59:00.991086 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:00 crc kubenswrapper[4723]: I0309 12:59:00.991241 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:00 crc kubenswrapper[4723]: I0309 12:59:00.992345 4723 scope.go:117] "RemoveContainer" containerID="09375dccb7fa5e4b307e0511c381a2122353e190297f1f3164a8a11bfbe075e8" Mar 09 12:59:01 crc kubenswrapper[4723]: E0309 12:59:01.166396 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:01Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 09 12:59:01 crc kubenswrapper[4723]: E0309 12:59:01.178789 4723 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:01Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 12:59:01 crc kubenswrapper[4723]: E0309 12:59:01.182073 4723 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 12:59:01 crc kubenswrapper[4723]: I0309 12:59:01.184148 4723 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 09 12:59:01 crc kubenswrapper[4723]: I0309 12:59:01.184235 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 09 12:59:01 crc kubenswrapper[4723]: W0309 12:59:01.185755 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:01Z is after 2026-02-23T05:33:13Z Mar 09 12:59:01 crc kubenswrapper[4723]: E0309 12:59:01.185833 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 12:59:01 crc kubenswrapper[4723]: E0309 12:59:01.188805 4723 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:01Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2db363b6c60e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,LastTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:01 crc kubenswrapper[4723]: I0309 12:59:01.194468 4723 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io 
\"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 09 12:59:01 crc kubenswrapper[4723]: I0309 12:59:01.194533 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 09 12:59:01 crc kubenswrapper[4723]: I0309 12:59:01.805640 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:01Z is after 2026-02-23T05:33:13Z Mar 09 12:59:01 crc kubenswrapper[4723]: I0309 12:59:01.992790 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 12:59:01 crc kubenswrapper[4723]: I0309 12:59:01.994903 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d404d0354964e2f4f92dc541c542cba8638bc1bfc58eac37e3b16e23562bce0b"} Mar 09 12:59:01 crc kubenswrapper[4723]: I0309 12:59:01.995014 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:01 crc kubenswrapper[4723]: I0309 12:59:01.995781 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:01 crc kubenswrapper[4723]: I0309 12:59:01.995806 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:01 crc kubenswrapper[4723]: I0309 12:59:01.995815 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:02 crc kubenswrapper[4723]: I0309 12:59:02.301105 4723 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]log ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]etcd ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/generic-apiserver-start-informers ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/priority-and-fairness-filter ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 09 12:59:02 crc kubenswrapper[4723]: 
[+]poststarthook/start-apiextensions-informers ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/start-apiextensions-controllers ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/crd-informer-synced ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/start-system-namespaces-controller ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 09 12:59:02 crc kubenswrapper[4723]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/bootstrap-controller ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/start-kube-aggregator-informers ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/apiservice-registration-controller ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/apiservice-discovery-controller ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]autoregister-completion ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/apiservice-openapi-controller ok Mar 09 12:59:02 crc kubenswrapper[4723]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 09 12:59:02 crc kubenswrapper[4723]: livez check failed Mar 09 12:59:02 crc kubenswrapper[4723]: I0309 12:59:02.301179 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 12:59:02 crc kubenswrapper[4723]: I0309 12:59:02.807417 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:02Z is after 2026-02-23T05:33:13Z Mar 09 12:59:02 crc kubenswrapper[4723]: I0309 12:59:02.999304 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 12:59:03 crc kubenswrapper[4723]: I0309 12:59:03.000101 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 12:59:03 crc 
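The 500 startup-probe output above is the verbose /livez body: every post-start hook reports ok except rbac/bootstrap-roles, so the aggregate check fails. A hedged Go sketch of querying that endpoint the way the probe does and surfacing only the failing checks; it assumes network reach to the endpoint from the log and skips TLS verification (and, per the earlier 403 entries, anonymous access may itself be rejected on this cluster):

```go
// livezprobe.go - sketch: fetch /livez?verbose and print failing checks.
package main

import (
	"crypto/tls"
	"fmt"
	"io"
	"net/http"
	"strings"
)

func main() {
	client := &http.Client{Transport: &http.Transport{
		TLSClientConfig: &tls.Config{InsecureSkipVerify: true}, // expired cert, see above
	}}

	resp, err := client.Get("https://api-int.crc.testing:6443/livez?verbose")
	if err != nil {
		fmt.Println("probe failed:", err)
		return
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	// 200 = healthy; 500 as in this log; 403 if anonymous access is blocked.
	fmt.Println("status:", resp.StatusCode)
	for _, line := range strings.Split(string(body), "\n") {
		if strings.HasPrefix(line, "[-]") || strings.HasSuffix(line, "failed") {
			fmt.Println(line) // e.g. "[-]poststarthook/rbac/bootstrap-roles failed: reason withheld"
		}
	}
}
```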
Mar 09 12:59:03 crc kubenswrapper[4723]: I0309 12:59:03.002700 4723 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d404d0354964e2f4f92dc541c542cba8638bc1bfc58eac37e3b16e23562bce0b" exitCode=255
Mar 09 12:59:03 crc kubenswrapper[4723]: I0309 12:59:03.002742 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d404d0354964e2f4f92dc541c542cba8638bc1bfc58eac37e3b16e23562bce0b"}
Mar 09 12:59:03 crc kubenswrapper[4723]: I0309 12:59:03.002799 4723 scope.go:117] "RemoveContainer" containerID="09375dccb7fa5e4b307e0511c381a2122353e190297f1f3164a8a11bfbe075e8"
Mar 09 12:59:03 crc kubenswrapper[4723]: I0309 12:59:03.002977 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 12:59:03 crc kubenswrapper[4723]: I0309 12:59:03.004509 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 12:59:03 crc kubenswrapper[4723]: I0309 12:59:03.004659 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 12:59:03 crc kubenswrapper[4723]: I0309 12:59:03.004910 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 12:59:03 crc kubenswrapper[4723]: I0309 12:59:03.005775 4723 scope.go:117] "RemoveContainer" containerID="d404d0354964e2f4f92dc541c542cba8638bc1bfc58eac37e3b16e23562bce0b"
Mar 09 12:59:03 crc kubenswrapper[4723]: E0309 12:59:03.006166 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 12:59:03 crc kubenswrapper[4723]: I0309 12:59:03.805397 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:03Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:04 crc kubenswrapper[4723]: I0309 12:59:04.008427 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 09 12:59:04 crc kubenswrapper[4723]: I0309 12:59:04.471257 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 12:59:04 crc kubenswrapper[4723]: I0309 12:59:04.471504 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 12:59:04 crc kubenswrapper[4723]: I0309 12:59:04.472880 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 12:59:04 crc kubenswrapper[4723]: I0309 12:59:04.472924 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 12:59:04 crc kubenswrapper[4723]: I0309 12:59:04.472938 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 12:59:04 crc kubenswrapper[4723]: I0309 12:59:04.473576 4723 scope.go:117] "RemoveContainer" containerID="d404d0354964e2f4f92dc541c542cba8638bc1bfc58eac37e3b16e23562bce0b"
Mar 09 12:59:04 crc kubenswrapper[4723]: E0309 12:59:04.473759 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 12:59:04 crc kubenswrapper[4723]: I0309 12:59:04.684818 4723 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 12:59:04 crc kubenswrapper[4723]: I0309 12:59:04.684953 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 12:59:04 crc kubenswrapper[4723]: I0309 12:59:04.807598 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:04Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:05 crc kubenswrapper[4723]: I0309 12:59:05.671906 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 09 12:59:05 crc kubenswrapper[4723]: I0309 12:59:05.672083 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 12:59:05 crc kubenswrapper[4723]: I0309 12:59:05.673014 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 12:59:05 crc kubenswrapper[4723]: I0309 12:59:05.673046 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 12:59:05 crc kubenswrapper[4723]: I0309 12:59:05.673057 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 12:59:05 crc kubenswrapper[4723]: I0309 12:59:05.681839 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 09 12:59:05 crc kubenswrapper[4723]: I0309 12:59:05.805379 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:05Z is after 2026-02-23T05:33:13Z
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:06Z is after 2026-02-23T05:33:13Z Mar 09 12:59:06 crc kubenswrapper[4723]: E0309 12:59:06.002648 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 12:59:06 crc kubenswrapper[4723]: I0309 12:59:06.016077 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:06 crc kubenswrapper[4723]: I0309 12:59:06.017216 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:06 crc kubenswrapper[4723]: I0309 12:59:06.017322 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:06 crc kubenswrapper[4723]: I0309 12:59:06.017400 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:06 crc kubenswrapper[4723]: W0309 12:59:06.339125 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:06Z is after 2026-02-23T05:33:13Z Mar 09 12:59:06 crc kubenswrapper[4723]: E0309 12:59:06.339440 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 12:59:06 crc kubenswrapper[4723]: I0309 12:59:06.807560 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:06Z is after 2026-02-23T05:33:13Z Mar 09 12:59:06 crc kubenswrapper[4723]: E0309 12:59:06.954105 4723 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 12:59:07 crc kubenswrapper[4723]: W0309 12:59:07.144066 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:07Z is after 2026-02-23T05:33:13Z Mar 09 12:59:07 crc kubenswrapper[4723]: E0309 12:59:07.144205 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 12:59:07 crc kubenswrapper[4723]: I0309 12:59:07.302991 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:59:07 crc kubenswrapper[4723]: I0309 12:59:07.303159 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:07 crc kubenswrapper[4723]: I0309 12:59:07.304367 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:07 crc kubenswrapper[4723]: I0309 12:59:07.304417 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:07 crc kubenswrapper[4723]: I0309 12:59:07.304434 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:07 crc kubenswrapper[4723]: I0309 12:59:07.305138 4723 scope.go:117] "RemoveContainer" containerID="d404d0354964e2f4f92dc541c542cba8638bc1bfc58eac37e3b16e23562bce0b" Mar 09 12:59:07 crc kubenswrapper[4723]: E0309 12:59:07.305339 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 12:59:07 crc kubenswrapper[4723]: I0309 12:59:07.306951 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:59:07 crc kubenswrapper[4723]: E0309 12:59:07.571847 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:07Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 12:59:07 crc kubenswrapper[4723]: I0309 12:59:07.579104 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:07 crc kubenswrapper[4723]: I0309 12:59:07.580617 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:07 crc kubenswrapper[4723]: I0309 12:59:07.580681 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:07 crc kubenswrapper[4723]: I0309 12:59:07.580707 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:07 crc kubenswrapper[4723]: I0309 12:59:07.580755 4723 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 12:59:07 crc kubenswrapper[4723]: E0309 12:59:07.588574 4723 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-09T12:59:07Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 12:59:07 crc kubenswrapper[4723]: I0309 12:59:07.805808 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:07Z is after 2026-02-23T05:33:13Z Mar 09 12:59:08 crc kubenswrapper[4723]: I0309 12:59:08.024571 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:08 crc kubenswrapper[4723]: I0309 12:59:08.025557 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:08 crc kubenswrapper[4723]: I0309 12:59:08.025619 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:08 crc kubenswrapper[4723]: I0309 12:59:08.025633 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:08 crc kubenswrapper[4723]: I0309 12:59:08.026164 4723 scope.go:117] "RemoveContainer" containerID="d404d0354964e2f4f92dc541c542cba8638bc1bfc58eac37e3b16e23562bce0b" Mar 09 12:59:08 crc kubenswrapper[4723]: E0309 12:59:08.026342 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 12:59:08 crc kubenswrapper[4723]: I0309 12:59:08.807597 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:08Z is after 2026-02-23T05:33:13Z Mar 09 12:59:09 crc kubenswrapper[4723]: I0309 12:59:09.176990 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:59:09 crc kubenswrapper[4723]: I0309 12:59:09.177250 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:09 crc kubenswrapper[4723]: I0309 12:59:09.178669 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:09 crc kubenswrapper[4723]: I0309 12:59:09.178737 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:09 crc kubenswrapper[4723]: I0309 12:59:09.178760 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:09 crc kubenswrapper[4723]: I0309 12:59:09.179844 4723 scope.go:117] "RemoveContainer" containerID="d404d0354964e2f4f92dc541c542cba8638bc1bfc58eac37e3b16e23562bce0b" Mar 09 12:59:09 crc kubenswrapper[4723]: E0309 12:59:09.180207 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 12:59:09 crc kubenswrapper[4723]: I0309 12:59:09.705641 4723 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 12:59:09 crc kubenswrapper[4723]: E0309 12:59:09.711570 4723 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:09Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 12:59:09 crc kubenswrapper[4723]: I0309 12:59:09.807026 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:09Z is after 2026-02-23T05:33:13Z Mar 09 12:59:10 crc kubenswrapper[4723]: I0309 12:59:10.805912 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:10Z is after 2026-02-23T05:33:13Z Mar 09 12:59:11 crc kubenswrapper[4723]: E0309 12:59:11.194215 4723 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:11Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2db363b6c60e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,LastTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:11 crc kubenswrapper[4723]: I0309 12:59:11.806846 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:11Z is after 2026-02-23T05:33:13Z Mar 09 12:59:12 crc kubenswrapper[4723]: I0309 12:59:12.807218 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:12Z is after 2026-02-23T05:33:13Z Mar 09 12:59:12 crc kubenswrapper[4723]: W0309 12:59:12.920161 4723 reflector.go:561] 
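"Rotating certificates" followed immediately by the failed CSR POST is the circular part of this outage: the kubelet cannot obtain a fresh client certificate because the very API endpoint that signs CSRs presents the expired serving certificate. For reference, the payload the kubelet submits is an ordinary PKCS#10 request carrying the node identity in its subject. A stdlib-only Go sketch of producing one (the CN=system:node:<name>/O=system:nodes subject is the standard kubelet convention; the node name "crc" is taken from this log, and the key type is an illustrative choice):

```go
// csrgen.go - sketch: generate the kind of PKCS#10 request a kubelet
// submits for the kubernetes.io/kube-apiserver-client-kubelet signer.
package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/x509"
	"crypto/x509/pkix"
	"encoding/pem"
	"fmt"
	"os"
)

func main() {
	key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	if err != nil {
		fmt.Println("keygen:", err)
		return
	}

	template := &x509.CertificateRequest{
		Subject: pkix.Name{
			CommonName:   "system:node:crc",        // node identity
			Organization: []string{"system:nodes"}, // node group
		},
	}
	der, err := x509.CreateCertificateRequest(rand.Reader, template, key)
	if err != nil {
		fmt.Println("csr:", err)
		return
	}
	// This PEM block is what ends up in the CSR object's spec.request field.
	pem.Encode(os.Stdout, &pem.Block{Type: "CERTIFICATE REQUEST", Bytes: der})
}
```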
Mar 09 12:59:12 crc kubenswrapper[4723]: W0309 12:59:12.920161 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:12Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:12 crc kubenswrapper[4723]: E0309 12:59:12.920284 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 12:59:13 crc kubenswrapper[4723]: I0309 12:59:13.805489 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:13Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:14 crc kubenswrapper[4723]: E0309 12:59:14.575906 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:14Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.589262 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.590591 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.590627 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.590637 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.590664 4723 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 12:59:14 crc kubenswrapper[4723]: E0309 12:59:14.595338 4723 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:14Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.684451 4723 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.684533 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.684602 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.684758 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.685900 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.685943 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.685959 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.686456 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"8a85b6fe63998440f0ad1d952a2eaaa3fb53449997bae0986b4fc1c776b90063"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.686623 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://8a85b6fe63998440f0ad1d952a2eaaa3fb53449997bae0986b4fc1c776b90063" gracePeriod=30
Mar 09 12:59:14 crc kubenswrapper[4723]: I0309 12:59:14.806856 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:14Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:15 crc kubenswrapper[4723]: I0309 12:59:15.050063 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 09 12:59:15 crc kubenswrapper[4723]: I0309 12:59:15.050514 4723 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8a85b6fe63998440f0ad1d952a2eaaa3fb53449997bae0986b4fc1c776b90063" exitCode=255
Mar 09 12:59:15 crc kubenswrapper[4723]: I0309 12:59:15.050568 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8a85b6fe63998440f0ad1d952a2eaaa3fb53449997bae0986b4fc1c776b90063"}
Mar 09 12:59:15 crc kubenswrapper[4723]: W0309 12:59:15.461135 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:15Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:15 crc kubenswrapper[4723]: E0309 12:59:15.461256 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 12:59:15 crc kubenswrapper[4723]: I0309 12:59:15.807818 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:15Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:16 crc kubenswrapper[4723]: I0309 12:59:16.058060 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 09 12:59:16 crc kubenswrapper[4723]: I0309 12:59:16.058782 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"720488c49a69e71b04c080c524f4cff2751dae29c27c494059d67a1ba2fe28a2"}
Mar 09 12:59:16 crc kubenswrapper[4723]: I0309 12:59:16.059040 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 12:59:16 crc kubenswrapper[4723]: I0309 12:59:16.060939 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 12:59:16 crc kubenswrapper[4723]: I0309 12:59:16.061009 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 12:59:16 crc kubenswrapper[4723]: I0309 12:59:16.061034 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 12:59:16 crc kubenswrapper[4723]: W0309 12:59:16.439949 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:16Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:16 crc kubenswrapper[4723]: E0309 12:59:16.440051 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 12:59:16 crc kubenswrapper[4723]: I0309 12:59:16.804526 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:16Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:16 crc kubenswrapper[4723]: E0309 12:59:16.954272 4723 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 12:59:17 crc kubenswrapper[4723]: I0309 12:59:17.060970 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 12:59:17 crc kubenswrapper[4723]: I0309 12:59:17.062187 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 12:59:17 crc kubenswrapper[4723]: I0309 12:59:17.062275 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 12:59:17 crc kubenswrapper[4723]: I0309 12:59:17.062315 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 12:59:17 crc kubenswrapper[4723]: I0309 12:59:17.806842 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:17Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:18 crc kubenswrapper[4723]: I0309 12:59:18.806649 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:18Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:18 crc kubenswrapper[4723]: W0309 12:59:18.858141 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:18Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:18 crc kubenswrapper[4723]: E0309 12:59:18.858256 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 12:59:19 crc kubenswrapper[4723]: I0309 12:59:19.110395 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 12:59:19 crc kubenswrapper[4723]: I0309 12:59:19.110655 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 12:59:19 crc kubenswrapper[4723]: I0309 12:59:19.112568 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 12:59:19 crc kubenswrapper[4723]: I0309 12:59:19.112652 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 12:59:19 crc kubenswrapper[4723]: I0309 12:59:19.112675 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 12:59:19 crc kubenswrapper[4723]: I0309 12:59:19.806841 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:19Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:20 crc kubenswrapper[4723]: I0309 12:59:20.804168 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:20Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:21 crc kubenswrapper[4723]: E0309 12:59:21.199576 4723 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:21Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2db363b6c60e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,LastTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 12:59:21 crc kubenswrapper[4723]: E0309 12:59:21.579741 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:21Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 09 12:59:21 crc kubenswrapper[4723]: I0309 12:59:21.596020 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 12:59:21 crc kubenswrapper[4723]: I0309 12:59:21.597254 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 12:59:21 crc kubenswrapper[4723]: I0309 12:59:21.597295 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 12:59:21 crc kubenswrapper[4723]: I0309 12:59:21.597309 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 12:59:21 crc kubenswrapper[4723]: I0309 12:59:21.597337 4723 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 12:59:21 crc kubenswrapper[4723]: E0309 12:59:21.599849 4723 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:21Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 09 12:59:21 crc kubenswrapper[4723]: I0309 12:59:21.684274 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 12:59:21 crc kubenswrapper[4723]: I0309 12:59:21.684539 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 12:59:21 crc kubenswrapper[4723]: I0309 12:59:21.686240 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 12:59:21 crc kubenswrapper[4723]: I0309 12:59:21.686306 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 12:59:21 crc kubenswrapper[4723]: I0309 12:59:21.686332 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 12:59:21 crc kubenswrapper[4723]: I0309 12:59:21.804514 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:21Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:22 crc kubenswrapper[4723]: I0309 12:59:22.808044 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:22Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:22 crc kubenswrapper[4723]: I0309 12:59:22.880399 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 12:59:22 crc kubenswrapper[4723]: I0309 12:59:22.881800 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 12:59:22 crc kubenswrapper[4723]: I0309 12:59:22.881913 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 12:59:22 crc kubenswrapper[4723]: I0309 12:59:22.881941 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 12:59:22 crc kubenswrapper[4723]: I0309 12:59:22.882720 4723 scope.go:117] "RemoveContainer" containerID="d404d0354964e2f4f92dc541c542cba8638bc1bfc58eac37e3b16e23562bce0b"
Mar 09 12:59:23 crc kubenswrapper[4723]: I0309 12:59:23.807477 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:23Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:24 crc kubenswrapper[4723]: I0309 12:59:24.080718 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 09 12:59:24 crc kubenswrapper[4723]: I0309 12:59:24.081718 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 09 12:59:24 crc kubenswrapper[4723]: I0309 12:59:24.084824 4723 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4863f4037639c1a2b5d073d0651bd57975684c007e0262ebb8a44c4259e75f2d" exitCode=255
Mar 09 12:59:24 crc kubenswrapper[4723]: I0309 12:59:24.084923 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4863f4037639c1a2b5d073d0651bd57975684c007e0262ebb8a44c4259e75f2d"}
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4863f4037639c1a2b5d073d0651bd57975684c007e0262ebb8a44c4259e75f2d"} Mar 09 12:59:24 crc kubenswrapper[4723]: I0309 12:59:24.084991 4723 scope.go:117] "RemoveContainer" containerID="d404d0354964e2f4f92dc541c542cba8638bc1bfc58eac37e3b16e23562bce0b" Mar 09 12:59:24 crc kubenswrapper[4723]: I0309 12:59:24.085168 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:24 crc kubenswrapper[4723]: I0309 12:59:24.086839 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:24 crc kubenswrapper[4723]: I0309 12:59:24.086940 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:24 crc kubenswrapper[4723]: I0309 12:59:24.086963 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:24 crc kubenswrapper[4723]: I0309 12:59:24.087959 4723 scope.go:117] "RemoveContainer" containerID="4863f4037639c1a2b5d073d0651bd57975684c007e0262ebb8a44c4259e75f2d" Mar 09 12:59:24 crc kubenswrapper[4723]: E0309 12:59:24.088314 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 12:59:24 crc kubenswrapper[4723]: I0309 12:59:24.471051 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:59:24 crc kubenswrapper[4723]: I0309 12:59:24.684806 4723 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 12:59:24 crc kubenswrapper[4723]: I0309 12:59:24.684975 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 12:59:24 crc kubenswrapper[4723]: I0309 12:59:24.807542 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:24Z is after 2026-02-23T05:33:13Z Mar 09 12:59:25 crc kubenswrapper[4723]: I0309 12:59:25.090044 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 12:59:25 crc kubenswrapper[4723]: I0309 12:59:25.093379 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:25 crc kubenswrapper[4723]: 
I0309 12:59:25.094701 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:25 crc kubenswrapper[4723]: I0309 12:59:25.094764 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:25 crc kubenswrapper[4723]: I0309 12:59:25.094787 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:25 crc kubenswrapper[4723]: I0309 12:59:25.095638 4723 scope.go:117] "RemoveContainer" containerID="4863f4037639c1a2b5d073d0651bd57975684c007e0262ebb8a44c4259e75f2d" Mar 09 12:59:25 crc kubenswrapper[4723]: E0309 12:59:25.095953 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 12:59:25 crc kubenswrapper[4723]: I0309 12:59:25.807393 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:25Z is after 2026-02-23T05:33:13Z Mar 09 12:59:26 crc kubenswrapper[4723]: I0309 12:59:26.517287 4723 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 12:59:26 crc kubenswrapper[4723]: E0309 12:59:26.524357 4723 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 12:59:26 crc kubenswrapper[4723]: E0309 12:59:26.525823 4723 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 09 12:59:26 crc kubenswrapper[4723]: I0309 12:59:26.809084 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:26Z is after 2026-02-23T05:33:13Z Mar 09 12:59:26 crc kubenswrapper[4723]: E0309 12:59:26.954393 4723 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 12:59:27 crc kubenswrapper[4723]: I0309 12:59:27.807305 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:27Z is after 2026-02-23T05:33:13Z Mar 09 12:59:28 crc kubenswrapper[4723]: E0309 
Mar 09 12:59:28 crc kubenswrapper[4723]: E0309 12:59:28.586681 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:28Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 09 12:59:28 crc kubenswrapper[4723]: I0309 12:59:28.600649 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 12:59:28 crc kubenswrapper[4723]: I0309 12:59:28.602570 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 12:59:28 crc kubenswrapper[4723]: I0309 12:59:28.602656 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 12:59:28 crc kubenswrapper[4723]: I0309 12:59:28.602673 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 12:59:28 crc kubenswrapper[4723]: I0309 12:59:28.602702 4723 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 12:59:28 crc kubenswrapper[4723]: E0309 12:59:28.607743 4723 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:28Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 09 12:59:28 crc kubenswrapper[4723]: I0309 12:59:28.806967 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:28Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:29 crc kubenswrapper[4723]: I0309 12:59:29.176506 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 12:59:29 crc kubenswrapper[4723]: I0309 12:59:29.176826 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 12:59:29 crc kubenswrapper[4723]: I0309 12:59:29.178688 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 12:59:29 crc kubenswrapper[4723]: I0309 12:59:29.178754 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 12:59:29 crc kubenswrapper[4723]: I0309 12:59:29.178779 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 12:59:29 crc kubenswrapper[4723]: I0309 12:59:29.179836 4723 scope.go:117] "RemoveContainer" containerID="4863f4037639c1a2b5d073d0651bd57975684c007e0262ebb8a44c4259e75f2d"
Mar 09 12:59:29 crc kubenswrapper[4723]: E0309 12:59:29.180258 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 12:59:29 crc kubenswrapper[4723]: I0309 12:59:29.871961 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:29Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:30 crc kubenswrapper[4723]: W0309 12:59:30.370476 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:30Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:30 crc kubenswrapper[4723]: E0309 12:59:30.370585 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 12:59:30 crc kubenswrapper[4723]: W0309 12:59:30.737330 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:30Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:30 crc kubenswrapper[4723]: E0309 12:59:30.737436 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 12:59:30 crc kubenswrapper[4723]: I0309 12:59:30.806434 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:30Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:31 crc kubenswrapper[4723]: E0309 12:59:31.204925 4723 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:31Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2db363b6c60e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,LastTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 12:59:31 crc kubenswrapper[4723]: I0309 12:59:31.806632 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:31Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:32 crc kubenswrapper[4723]: I0309 12:59:32.807527 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:32Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:33 crc kubenswrapper[4723]: I0309 12:59:33.806287 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:33Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:34 crc kubenswrapper[4723]: I0309 12:59:34.685160 4723 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 12:59:34 crc kubenswrapper[4723]: I0309 12:59:34.685374 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 12:59:34 crc kubenswrapper[4723]: I0309 12:59:34.808659 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:34Z is after 2026-02-23T05:33:13Z
Mar 09 12:59:35 crc kubenswrapper[4723]: I0309 12:59:35.098332 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 09 12:59:35 crc kubenswrapper[4723]: I0309 12:59:35.098623 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 12:59:35 crc kubenswrapper[4723]: I0309 12:59:35.104170 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 12:59:35 crc kubenswrapper[4723]: I0309 12:59:35.104249 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 12:59:35 crc kubenswrapper[4723]: I0309 12:59:35.104269 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 12:59:35 crc kubenswrapper[4723]: E0309 12:59:35.591265 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:35Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 12:59:35 crc kubenswrapper[4723]: I0309 12:59:35.608624 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:35 crc kubenswrapper[4723]: I0309 12:59:35.610762 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:35 crc kubenswrapper[4723]: I0309 12:59:35.610811 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:35 crc kubenswrapper[4723]: I0309 12:59:35.610825 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:35 crc kubenswrapper[4723]: I0309 12:59:35.610875 4723 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 12:59:35 crc kubenswrapper[4723]: E0309 12:59:35.614701 4723 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:35Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 12:59:35 crc kubenswrapper[4723]: W0309 12:59:35.657222 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:35Z is after 2026-02-23T05:33:13Z Mar 09 12:59:35 crc kubenswrapper[4723]: E0309 12:59:35.657357 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 12:59:35 crc kubenswrapper[4723]: I0309 12:59:35.804617 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:35Z is after 2026-02-23T05:33:13Z Mar 09 12:59:36 crc kubenswrapper[4723]: W0309 12:59:36.140782 4723 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:36Z is after 2026-02-23T05:33:13Z Mar 09 12:59:36 crc kubenswrapper[4723]: E0309 12:59:36.141016 4723 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:36Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 09 12:59:36 crc kubenswrapper[4723]: I0309 12:59:36.806820 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:36Z is after 2026-02-23T05:33:13Z Mar 09 12:59:36 crc kubenswrapper[4723]: E0309 12:59:36.954493 4723 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 12:59:37 crc kubenswrapper[4723]: I0309 12:59:37.807724 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:37Z is after 2026-02-23T05:33:13Z Mar 09 12:59:38 crc kubenswrapper[4723]: I0309 12:59:38.804516 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:38Z is after 2026-02-23T05:33:13Z Mar 09 12:59:39 crc kubenswrapper[4723]: I0309 12:59:39.806741 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:39Z is after 2026-02-23T05:33:13Z Mar 09 12:59:40 crc kubenswrapper[4723]: I0309 12:59:40.807623 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:40Z is after 2026-02-23T05:33:13Z Mar 09 12:59:41 crc kubenswrapper[4723]: E0309 12:59:41.210609 4723 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:41Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2db363b6c60e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,LastTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:41 crc kubenswrapper[4723]: I0309 12:59:41.807260 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T12:59:41Z is after 
2026-02-23T05:33:13Z Mar 09 12:59:42 crc kubenswrapper[4723]: E0309 12:59:42.607818 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 12:59:42 crc kubenswrapper[4723]: I0309 12:59:42.615049 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:42 crc kubenswrapper[4723]: I0309 12:59:42.616911 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:42 crc kubenswrapper[4723]: I0309 12:59:42.616974 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:42 crc kubenswrapper[4723]: I0309 12:59:42.616992 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:42 crc kubenswrapper[4723]: I0309 12:59:42.617033 4723 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 12:59:42 crc kubenswrapper[4723]: E0309 12:59:42.624700 4723 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 12:59:42 crc kubenswrapper[4723]: I0309 12:59:42.811322 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:42 crc kubenswrapper[4723]: I0309 12:59:42.880378 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:42 crc kubenswrapper[4723]: I0309 12:59:42.882038 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:42 crc kubenswrapper[4723]: I0309 12:59:42.882103 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:42 crc kubenswrapper[4723]: I0309 12:59:42.882126 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:42 crc kubenswrapper[4723]: I0309 12:59:42.883086 4723 scope.go:117] "RemoveContainer" containerID="4863f4037639c1a2b5d073d0651bd57975684c007e0262ebb8a44c4259e75f2d" Mar 09 12:59:42 crc kubenswrapper[4723]: E0309 12:59:42.883445 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 12:59:43 crc kubenswrapper[4723]: I0309 12:59:43.810404 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:44 crc kubenswrapper[4723]: I0309 12:59:44.684881 4723 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 12:59:44 crc kubenswrapper[4723]: I0309 12:59:44.685031 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 12:59:44 crc kubenswrapper[4723]: I0309 12:59:44.685114 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:59:44 crc kubenswrapper[4723]: I0309 12:59:44.685322 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:44 crc kubenswrapper[4723]: I0309 12:59:44.687205 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:44 crc kubenswrapper[4723]: I0309 12:59:44.687244 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:44 crc kubenswrapper[4723]: I0309 12:59:44.687254 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:44 crc kubenswrapper[4723]: I0309 12:59:44.687674 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"720488c49a69e71b04c080c524f4cff2751dae29c27c494059d67a1ba2fe28a2"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 09 12:59:44 crc kubenswrapper[4723]: I0309 12:59:44.687758 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://720488c49a69e71b04c080c524f4cff2751dae29c27c494059d67a1ba2fe28a2" gracePeriod=30 Mar 09 12:59:44 crc kubenswrapper[4723]: I0309 12:59:44.810118 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:45 crc kubenswrapper[4723]: I0309 12:59:45.157137 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 12:59:45 crc kubenswrapper[4723]: I0309 12:59:45.158727 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 12:59:45 crc kubenswrapper[4723]: I0309 12:59:45.159593 4723 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="720488c49a69e71b04c080c524f4cff2751dae29c27c494059d67a1ba2fe28a2" exitCode=255 Mar 09 12:59:45 crc kubenswrapper[4723]: I0309 12:59:45.159645 
4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"720488c49a69e71b04c080c524f4cff2751dae29c27c494059d67a1ba2fe28a2"} Mar 09 12:59:45 crc kubenswrapper[4723]: I0309 12:59:45.159699 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"18c8dbb6ccbd987ea1d04c26b7aa1e98ef5bc8504b30ed849562f8e20bd1074b"} Mar 09 12:59:45 crc kubenswrapper[4723]: I0309 12:59:45.159720 4723 scope.go:117] "RemoveContainer" containerID="8a85b6fe63998440f0ad1d952a2eaaa3fb53449997bae0986b4fc1c776b90063" Mar 09 12:59:45 crc kubenswrapper[4723]: I0309 12:59:45.159827 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:45 crc kubenswrapper[4723]: I0309 12:59:45.161029 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:45 crc kubenswrapper[4723]: I0309 12:59:45.161068 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:45 crc kubenswrapper[4723]: I0309 12:59:45.161080 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:45 crc kubenswrapper[4723]: I0309 12:59:45.807342 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:46 crc kubenswrapper[4723]: I0309 12:59:46.165917 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 12:59:46 crc kubenswrapper[4723]: I0309 12:59:46.168328 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:46 crc kubenswrapper[4723]: I0309 12:59:46.169706 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:46 crc kubenswrapper[4723]: I0309 12:59:46.169769 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:46 crc kubenswrapper[4723]: I0309 12:59:46.169791 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:46 crc kubenswrapper[4723]: I0309 12:59:46.809730 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:46 crc kubenswrapper[4723]: E0309 12:59:46.955246 4723 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 12:59:47 crc kubenswrapper[4723]: I0309 12:59:47.807580 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:48 crc kubenswrapper[4723]: I0309 
12:59:48.808134 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:49 crc kubenswrapper[4723]: I0309 12:59:49.110940 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:59:49 crc kubenswrapper[4723]: I0309 12:59:49.111144 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:49 crc kubenswrapper[4723]: I0309 12:59:49.113077 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:49 crc kubenswrapper[4723]: I0309 12:59:49.113133 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:49 crc kubenswrapper[4723]: I0309 12:59:49.113150 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:49 crc kubenswrapper[4723]: E0309 12:59:49.613751 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 12:59:49 crc kubenswrapper[4723]: I0309 12:59:49.624903 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:49 crc kubenswrapper[4723]: I0309 12:59:49.626591 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:49 crc kubenswrapper[4723]: I0309 12:59:49.626739 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:49 crc kubenswrapper[4723]: I0309 12:59:49.626836 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:49 crc kubenswrapper[4723]: I0309 12:59:49.626989 4723 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 12:59:49 crc kubenswrapper[4723]: E0309 12:59:49.632294 4723 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 12:59:49 crc kubenswrapper[4723]: I0309 12:59:49.808828 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:50 crc kubenswrapper[4723]: I0309 12:59:50.807517 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.216136 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db363b6c60e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,LastTimestamp:2026-03-09 12:58:46.795740686 +0000 UTC m=+0.810208316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.219679 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db36859c6e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873532131 +0000 UTC m=+0.887999681,LastTimestamp:2026-03-09 12:58:46.873532131 +0000 UTC m=+0.887999681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.224587 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a0af9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873549561 +0000 UTC m=+0.888017111,LastTimestamp:2026-03-09 12:58:46.873549561 +0000 UTC m=+0.888017111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.227461 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a3713 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873560851 +0000 UTC m=+0.888028401,LastTimestamp:2026-03-09 12:58:46.873560851 +0000 UTC m=+0.888028401,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.230549 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db36d85e921 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.960310561 +0000 UTC m=+0.974778101,LastTimestamp:2026-03-09 12:58:46.960310561 +0000 UTC m=+0.974778101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.234004 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db36859c6e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db36859c6e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873532131 +0000 UTC m=+0.887999681,LastTimestamp:2026-03-09 12:58:46.982015222 +0000 UTC m=+0.996482762,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.238940 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db3685a0af9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a0af9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873549561 +0000 UTC m=+0.888017111,LastTimestamp:2026-03-09 12:58:46.982041733 +0000 UTC m=+0.996509273,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.244651 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db3685a3713\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a3713 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873560851 +0000 UTC m=+0.888028401,LastTimestamp:2026-03-09 12:58:46.982050313 +0000 UTC m=+0.996517853,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.248084 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db36859c6e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189b2db36859c6e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873532131 +0000 UTC m=+0.887999681,LastTimestamp:2026-03-09 12:58:46.982838152 +0000 UTC m=+0.997305692,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.252285 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db3685a0af9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a0af9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873549561 +0000 UTC m=+0.888017111,LastTimestamp:2026-03-09 12:58:46.982882173 +0000 UTC m=+0.997349713,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.255601 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db3685a3713\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a3713 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873560851 +0000 UTC m=+0.888028401,LastTimestamp:2026-03-09 12:58:46.982893673 +0000 UTC m=+0.997361213,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.259658 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db36859c6e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db36859c6e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873532131 +0000 UTC m=+0.887999681,LastTimestamp:2026-03-09 12:58:46.983337224 +0000 UTC m=+0.997804764,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.263268 4723 event.go:359] "Server rejected event (will not retry!)" err="events 
\"crc.189b2db3685a0af9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a0af9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873549561 +0000 UTC m=+0.888017111,LastTimestamp:2026-03-09 12:58:46.983352724 +0000 UTC m=+0.997820264,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.266798 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db3685a3713\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a3713 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873560851 +0000 UTC m=+0.888028401,LastTimestamp:2026-03-09 12:58:46.983365434 +0000 UTC m=+0.997832974,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.271633 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db36859c6e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db36859c6e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873532131 +0000 UTC m=+0.887999681,LastTimestamp:2026-03-09 12:58:46.983724233 +0000 UTC m=+0.998191773,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.275770 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db3685a0af9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a0af9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873549561 +0000 UTC m=+0.888017111,LastTimestamp:2026-03-09 12:58:46.983736243 +0000 UTC m=+0.998203783,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc 
kubenswrapper[4723]: E0309 12:59:51.279039 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db3685a3713\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a3713 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873560851 +0000 UTC m=+0.888028401,LastTimestamp:2026-03-09 12:58:46.983744953 +0000 UTC m=+0.998212493,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.282234 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db36859c6e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db36859c6e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873532131 +0000 UTC m=+0.887999681,LastTimestamp:2026-03-09 12:58:46.983955919 +0000 UTC m=+0.998423459,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.286098 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db3685a0af9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a0af9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873549561 +0000 UTC m=+0.888017111,LastTimestamp:2026-03-09 12:58:46.983965429 +0000 UTC m=+0.998432969,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.290031 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db3685a3713\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a3713 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873560851 +0000 UTC m=+0.888028401,LastTimestamp:2026-03-09 12:58:46.983973669 +0000 UTC m=+0.998441209,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.293557 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db36859c6e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db36859c6e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873532131 +0000 UTC m=+0.887999681,LastTimestamp:2026-03-09 12:58:46.984639125 +0000 UTC m=+0.999106665,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.298069 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db3685a0af9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a0af9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873549561 +0000 UTC m=+0.888017111,LastTimestamp:2026-03-09 12:58:46.984655475 +0000 UTC m=+0.999123015,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.302294 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db3685a3713\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a3713 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873560851 +0000 UTC m=+0.888028401,LastTimestamp:2026-03-09 12:58:46.984663635 +0000 UTC m=+0.999131175,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.306100 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db36859c6e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db36859c6e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873532131 +0000 UTC 
m=+0.887999681,LastTimestamp:2026-03-09 12:58:46.984966363 +0000 UTC m=+0.999433903,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.309766 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2db3685a0af9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2db3685a0af9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:46.873549561 +0000 UTC m=+0.888017111,LastTimestamp:2026-03-09 12:58:46.984991813 +0000 UTC m=+0.999459353,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.316432 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db386d0388b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.384610955 +0000 UTC m=+1.399078495,LastTimestamp:2026-03-09 12:58:47.384610955 +0000 UTC m=+1.399078495,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.322061 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2db386d3b22d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.384838701 +0000 UTC m=+1.399306241,LastTimestamp:2026-03-09 12:58:47.384838701 +0000 UTC m=+1.399306241,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.325566 4723 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db38745d03c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.3923175 +0000 UTC m=+1.406785040,LastTimestamp:2026-03-09 12:58:47.3923175 +0000 UTC m=+1.406785040,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.331182 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db387c059db openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.400348123 +0000 UTC m=+1.414815703,LastTimestamp:2026-03-09 12:58:47.400348123 +0000 UTC m=+1.414815703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.334431 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2db387c9f892 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.400978578 +0000 UTC m=+1.415446158,LastTimestamp:2026-03-09 12:58:47.400978578 +0000 UTC m=+1.415446158,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.338066 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db3a750e984 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.92991578 +0000 UTC m=+1.944383330,LastTimestamp:2026-03-09 12:58:47.92991578 +0000 UTC m=+1.944383330,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.341242 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2db3a76058e7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.930927335 +0000 UTC m=+1.945394885,LastTimestamp:2026-03-09 12:58:47.930927335 +0000 UTC m=+1.945394885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.344598 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db3a763593d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.931124029 +0000 UTC m=+1.945591609,LastTimestamp:2026-03-09 12:58:47.931124029 +0000 UTC m=+1.945591609,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.347950 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2db3a76829b1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.931439537 +0000 UTC m=+1.945907077,LastTimestamp:2026-03-09 12:58:47.931439537 +0000 UTC m=+1.945907077,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.351190 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db3a7688a41 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.931464257 +0000 UTC m=+1.945931817,LastTimestamp:2026-03-09 12:58:47.931464257 +0000 UTC m=+1.945931817,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.354277 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2db3a7e5f28b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.939682955 +0000 UTC m=+1.954150515,LastTimestamp:2026-03-09 12:58:47.939682955 +0000 UTC m=+1.954150515,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.357229 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db3a7f8e552 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.940924754 +0000 UTC m=+1.955392294,LastTimestamp:2026-03-09 12:58:47.940924754 +0000 UTC 
m=+1.955392294,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.360066 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db3a8075c7f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.941872767 +0000 UTC m=+1.956340307,LastTimestamp:2026-03-09 12:58:47.941872767 +0000 UTC m=+1.956340307,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.362963 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2db3a83a0157 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.945191767 +0000 UTC m=+1.959659307,LastTimestamp:2026-03-09 12:58:47.945191767 +0000 UTC m=+1.959659307,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.365823 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db3a8a29405 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.952045061 +0000 UTC m=+1.966512601,LastTimestamp:2026-03-09 12:58:47.952045061 +0000 UTC m=+1.966512601,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.368796 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db3a91cd6d1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.960057553 +0000 UTC m=+1.974525093,LastTimestamp:2026-03-09 12:58:47.960057553 +0000 UTC m=+1.974525093,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.371881 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db3ba2f78bc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:48.246491324 +0000 UTC m=+2.260958894,LastTimestamp:2026-03-09 12:58:48.246491324 +0000 UTC m=+2.260958894,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.375046 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db3bad7684e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:48.257497166 +0000 UTC m=+2.271964746,LastTimestamp:2026-03-09 12:58:48.257497166 +0000 UTC m=+2.271964746,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.378143 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db3bae9a053 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:48.258691155 +0000 UTC m=+2.273158735,LastTimestamp:2026-03-09 12:58:48.258691155 +0000 UTC m=+2.273158735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.381310 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db3c58a2ea0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:48.436985504 +0000 UTC m=+2.451453044,LastTimestamp:2026-03-09 12:58:48.436985504 +0000 UTC m=+2.451453044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.384475 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db3c6823d78 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:48.453242232 +0000 UTC m=+2.467709782,LastTimestamp:2026-03-09 12:58:48.453242232 +0000 UTC m=+2.467709782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.390392 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db3c698e545 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:48.454726981 +0000 UTC m=+2.469194531,LastTimestamp:2026-03-09 12:58:48.454726981 +0000 UTC m=+2.469194531,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.393782 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db3d2db4716 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:48.66040399 +0000 UTC m=+2.674871530,LastTimestamp:2026-03-09 12:58:48.66040399 +0000 UTC m=+2.674871530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.397046 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db3d3aa5556 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:48.67397359 +0000 UTC m=+2.688441170,LastTimestamp:2026-03-09 12:58:48.67397359 +0000 UTC m=+2.688441170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.400510 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db3e12b0ca6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:48.900512934 +0000 UTC m=+2.914980474,LastTimestamp:2026-03-09 12:58:48.900512934 +0000 UTC m=+2.914980474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.404133 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2db3e15ac8a4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:48.903641252 +0000 UTC m=+2.918108802,LastTimestamp:2026-03-09 12:58:48.903641252 +0000 UTC m=+2.918108802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.408253 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2db3e175e377 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:48.905417591 +0000 UTC m=+2.919885131,LastTimestamp:2026-03-09 12:58:48.905417591 +0000 UTC m=+2.919885131,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.412215 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db3e2914fd1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:48.923992017 +0000 UTC m=+2.938459557,LastTimestamp:2026-03-09 12:58:48.923992017 +0000 UTC m=+2.938459557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.415791 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2db3ed742c2b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.106631723 +0000 UTC m=+3.121099253,LastTimestamp:2026-03-09 12:58:49.106631723 +0000 UTC m=+3.121099253,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.419486 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db3ed74998b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.106659723 +0000 UTC m=+3.121127263,LastTimestamp:2026-03-09 12:58:49.106659723 +0000 UTC m=+3.121127263,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.424320 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2db3ed82e8f1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container 
kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.107597553 +0000 UTC m=+3.122065093,LastTimestamp:2026-03-09 12:58:49.107597553 +0000 UTC m=+3.122065093,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.428670 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db3edb28dd8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.11071996 +0000 UTC m=+3.125187500,LastTimestamp:2026-03-09 12:58:49.11071996 +0000 UTC m=+3.125187500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.433063 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2db3ee0636e6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.116202726 +0000 UTC m=+3.130670266,LastTimestamp:2026-03-09 12:58:49.116202726 +0000 UTC m=+3.130670266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.436758 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2db3ee275b95 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.118374805 +0000 UTC m=+3.132842345,LastTimestamp:2026-03-09 12:58:49.118374805 +0000 UTC m=+3.132842345,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.440590 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2db3ee70d519 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.123190041 +0000 UTC m=+3.137657581,LastTimestamp:2026-03-09 12:58:49.123190041 +0000 UTC m=+3.137657581,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.444404 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db3eea0e7cf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.126340559 +0000 UTC m=+3.140808109,LastTimestamp:2026-03-09 12:58:49.126340559 +0000 UTC m=+3.140808109,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.448177 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db3eebca16c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.128157548 +0000 UTC m=+3.142625088,LastTimestamp:2026-03-09 12:58:49.128157548 +0000 UTC m=+3.142625088,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.452615 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db3ef63f176 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.13912255 +0000 UTC m=+3.153590090,LastTimestamp:2026-03-09 12:58:49.13912255 +0000 UTC m=+3.153590090,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.456690 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2db3f979a9a4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.308318116 +0000 UTC m=+3.322785656,LastTimestamp:2026-03-09 12:58:49.308318116 +0000 UTC m=+3.322785656,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.460650 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db3f9c05b48 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.312951112 +0000 UTC m=+3.327418652,LastTimestamp:2026-03-09 12:58:49.312951112 +0000 UTC m=+3.327418652,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.464477 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2db3fa31df4d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started 
container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.320390477 +0000 UTC m=+3.334858017,LastTimestamp:2026-03-09 12:58:49.320390477 +0000 UTC m=+3.334858017,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.469300 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2db3fa4a2ba6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.321982886 +0000 UTC m=+3.336450426,LastTimestamp:2026-03-09 12:58:49.321982886 +0000 UTC m=+3.336450426,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.472829 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db3fb2ac0f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.336701175 +0000 UTC m=+3.351168715,LastTimestamp:2026-03-09 12:58:49.336701175 +0000 UTC m=+3.351168715,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.476248 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db3fb365344 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.337459524 +0000 UTC 
m=+3.351927064,LastTimestamp:2026-03-09 12:58:49.337459524 +0000 UTC m=+3.351927064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.479327 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2db405fe117a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.518322042 +0000 UTC m=+3.532789582,LastTimestamp:2026-03-09 12:58:49.518322042 +0000 UTC m=+3.532789582,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.482353 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db40620d8d5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.520601301 +0000 UTC m=+3.535068841,LastTimestamp:2026-03-09 12:58:49.520601301 +0000 UTC m=+3.535068841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.485697 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2db406ca6f86 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.531715462 +0000 UTC m=+3.546183002,LastTimestamp:2026-03-09 12:58:49.531715462 +0000 UTC m=+3.546183002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.489231 4723 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db4084234a4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.556341924 +0000 UTC m=+3.570809464,LastTimestamp:2026-03-09 12:58:49.556341924 +0000 UTC m=+3.570809464,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.492401 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db4085343ca openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.557459914 +0000 UTC m=+3.571927474,LastTimestamp:2026-03-09 12:58:49.557459914 +0000 UTC m=+3.571927474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.497645 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db41493a02f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.763004463 +0000 UTC m=+3.777472003,LastTimestamp:2026-03-09 12:58:49.763004463 +0000 UTC m=+3.777472003,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.501295 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db4154136e7 openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.774380775 +0000 UTC m=+3.788848305,LastTimestamp:2026-03-09 12:58:49.774380775 +0000 UTC m=+3.788848305,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.505200 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db41550eb04 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.775409924 +0000 UTC m=+3.789877514,LastTimestamp:2026-03-09 12:58:49.775409924 +0000 UTC m=+3.789877514,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.509373 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db41e729bfe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.928612862 +0000 UTC m=+3.943080402,LastTimestamp:2026-03-09 12:58:49.928612862 +0000 UTC m=+3.943080402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.513743 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db42149e12f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.976275247 +0000 UTC m=+3.990742787,LastTimestamp:2026-03-09 12:58:49.976275247 +0000 UTC m=+3.990742787,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.517295 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db421eca2bd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.986941629 +0000 UTC m=+4.001409159,LastTimestamp:2026-03-09 12:58:49.986941629 +0000 UTC m=+4.001409159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.521295 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db4292b0494 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:50.10847042 +0000 UTC m=+4.122937960,LastTimestamp:2026-03-09 12:58:50.10847042 +0000 UTC m=+4.122937960,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.525132 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db429dfbae3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:50.120313571 +0000 UTC m=+4.134781111,LastTimestamp:2026-03-09 12:58:50.120313571 +0000 UTC m=+4.134781111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.529972 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db45b40a439 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:50.948748345 +0000 UTC m=+4.963215915,LastTimestamp:2026-03-09 12:58:50.948748345 +0000 UTC m=+4.963215915,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.534004 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db46a09312d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:51.196772653 +0000 UTC m=+5.211240193,LastTimestamp:2026-03-09 12:58:51.196772653 +0000 UTC m=+5.211240193,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.538496 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db46bb09511 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:51.224519953 +0000 UTC m=+5.238987503,LastTimestamp:2026-03-09 12:58:51.224519953 +0000 UTC m=+5.238987503,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.539984 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db46bc579c6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:51.225889222 +0000 UTC m=+5.240356782,LastTimestamp:2026-03-09 12:58:51.225889222 +0000 UTC m=+5.240356782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.542432 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db47a85efcb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:51.473383371 +0000 UTC m=+5.487850911,LastTimestamp:2026-03-09 12:58:51.473383371 +0000 UTC m=+5.487850911,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.544249 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db47b4a74c8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:51.486262472 +0000 UTC m=+5.500730012,LastTimestamp:2026-03-09 12:58:51.486262472 +0000 UTC m=+5.500730012,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.546125 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db47b5938a9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:51.487230121 +0000 UTC m=+5.501697661,LastTimestamp:2026-03-09 12:58:51.487230121 +0000 UTC 
m=+5.501697661,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.548814 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db4865f7fd8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:51.672190936 +0000 UTC m=+5.686658496,LastTimestamp:2026-03-09 12:58:51.672190936 +0000 UTC m=+5.686658496,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.551984 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db48701ea5e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:51.682835038 +0000 UTC m=+5.697302618,LastTimestamp:2026-03-09 12:58:51.682835038 +0000 UTC m=+5.697302618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.555408 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db48719bfed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:51.684397037 +0000 UTC m=+5.698864577,LastTimestamp:2026-03-09 12:58:51.684397037 +0000 UTC m=+5.698864577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.560313 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db49387a60c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:51.892925964 +0000 UTC m=+5.907393514,LastTimestamp:2026-03-09 12:58:51.892925964 +0000 UTC m=+5.907393514,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.563270 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db49468a755 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:51.907671893 +0000 UTC m=+5.922139443,LastTimestamp:2026-03-09 12:58:51.907671893 +0000 UTC m=+5.922139443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.566334 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db49478cfc7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:51.908730823 +0000 UTC m=+5.923198373,LastTimestamp:2026-03-09 12:58:51.908730823 +0000 UTC m=+5.923198373,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.569751 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db4a216c753 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:52.137187155 +0000 UTC m=+6.151654705,LastTimestamp:2026-03-09 12:58:52.137187155 +0000 UTC m=+6.151654705,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.573021 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2db4a301b2bb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:52.152582843 +0000 UTC m=+6.167050423,LastTimestamp:2026-03-09 12:58:52.152582843 +0000 UTC m=+6.167050423,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.576987 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 12:59:51 crc kubenswrapper[4723]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2db539ecda52 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 12:59:51 crc kubenswrapper[4723]: body: Mar 09 12:59:51 crc kubenswrapper[4723]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:54.684576338 +0000 UTC m=+8.699043888,LastTimestamp:2026-03-09 12:58:54.684576338 +0000 UTC m=+8.699043888,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 12:59:51 crc kubenswrapper[4723]: > Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.581906 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db539edd7ec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:54.68464126 +0000 UTC m=+8.699108810,LastTimestamp:2026-03-09 12:58:54.68464126 +0000 UTC m=+8.699108810,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.586266 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 12:59:51 crc kubenswrapper[4723]: &Event{ObjectMeta:{kube-apiserver-crc.189b2db6ac380558 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:44782->192.168.126.11:17697: read: connection reset by peer Mar 09 12:59:51 crc kubenswrapper[4723]: body: Mar 09 12:59:51 crc kubenswrapper[4723]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:59:00.897072472 +0000 UTC m=+14.911540052,LastTimestamp:2026-03-09 12:59:00.897072472 +0000 UTC m=+14.911540052,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 12:59:51 crc kubenswrapper[4723]: > Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.589375 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db6ac39135a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44782->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:59:00.897141594 +0000 UTC m=+14.911609174,LastTimestamp:2026-03-09 12:59:00.897141594 +0000 UTC m=+14.911609174,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.592883 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b2db41550eb04\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db41550eb04 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:49.775409924 +0000 
UTC m=+3.789877514,LastTimestamp:2026-03-09 12:59:00.995292849 +0000 UTC m=+15.009760429,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.596096 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 12:59:51 crc kubenswrapper[4723]: &Event{ObjectMeta:{kube-apiserver-crc.189b2db6bd55607f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 09 12:59:51 crc kubenswrapper[4723]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 09 12:59:51 crc kubenswrapper[4723]: Mar 09 12:59:51 crc kubenswrapper[4723]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:59:01.184209023 +0000 UTC m=+15.198676583,LastTimestamp:2026-03-09 12:59:01.184209023 +0000 UTC m=+15.198676583,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 12:59:51 crc kubenswrapper[4723]: > Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.599197 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db6bd565365 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:59:01.184271205 +0000 UTC m=+15.198738765,LastTimestamp:2026-03-09 12:59:01.184271205 +0000 UTC m=+15.198738765,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.602524 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b2db6bd55607f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 12:59:51 crc kubenswrapper[4723]: &Event{ObjectMeta:{kube-apiserver-crc.189b2db6bd55607f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 09 12:59:51 crc kubenswrapper[4723]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 09 12:59:51 crc kubenswrapper[4723]: Mar 09 12:59:51 crc kubenswrapper[4723]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:59:01.184209023 +0000 UTC m=+15.198676583,LastTimestamp:2026-03-09 12:59:01.19451503 +0000 UTC m=+15.208982570,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 12:59:51 crc kubenswrapper[4723]: > Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.605580 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b2db6bd565365\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2db6bd565365 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:59:01.184271205 +0000 UTC m=+15.198738765,LastTimestamp:2026-03-09 12:59:01.194554471 +0000 UTC m=+15.209022011,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.609313 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2db539ecda52\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 12:59:51 crc kubenswrapper[4723]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2db539ecda52 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 12:59:51 crc kubenswrapper[4723]: body: Mar 09 12:59:51 crc kubenswrapper[4723]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:54.684576338 +0000 UTC m=+8.699043888,LastTimestamp:2026-03-09 12:59:04.684928985 +0000 UTC m=+18.699396565,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 12:59:51 crc kubenswrapper[4723]: > Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.612659 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2db539edd7ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db539edd7ec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:54.68464126 +0000 UTC m=+8.699108810,LastTimestamp:2026-03-09 12:59:04.684994146 +0000 UTC m=+18.699461716,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.617414 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2db539ecda52\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 12:59:51 crc kubenswrapper[4723]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2db539ecda52 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 12:59:51 crc kubenswrapper[4723]: body: Mar 09 12:59:51 crc kubenswrapper[4723]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:54.684576338 +0000 UTC m=+8.699043888,LastTimestamp:2026-03-09 12:59:14.684514582 +0000 UTC m=+28.698982142,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 12:59:51 crc kubenswrapper[4723]: > Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.620711 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2db539edd7ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db539edd7ec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:54.68464126 +0000 UTC m=+8.699108810,LastTimestamp:2026-03-09 12:59:14.684570263 +0000 UTC m=+28.699037813,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.624513 4723 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db9e2239596 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:59:14.686604694 +0000 UTC m=+28.701072244,LastTimestamp:2026-03-09 12:59:14.686604694 +0000 UTC m=+28.701072244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.627607 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2db3a8075c7f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db3a8075c7f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:47.941872767 +0000 UTC m=+1.956340307,LastTimestamp:2026-03-09 12:59:14.809735841 +0000 UTC m=+28.824203381,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.631113 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2db3ba2f78bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db3ba2f78bc 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:48.246491324 +0000 UTC m=+2.260958894,LastTimestamp:2026-03-09 12:59:15.041563364 +0000 UTC m=+29.056030914,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.634457 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2db3bad7684e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db3bad7684e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:48.257497166 +0000 UTC m=+2.271964746,LastTimestamp:2026-03-09 12:59:15.053900521 +0000 UTC m=+29.068368071,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.638716 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2db539ecda52\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 12:59:51 crc kubenswrapper[4723]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2db539ecda52 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 12:59:51 crc kubenswrapper[4723]: body: Mar 09 12:59:51 crc kubenswrapper[4723]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:54.684576338 +0000 UTC m=+8.699043888,LastTimestamp:2026-03-09 12:59:24.684936161 +0000 UTC m=+38.699403781,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 12:59:51 crc kubenswrapper[4723]: > Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.643222 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2db539edd7ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2db539edd7ec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:54.68464126 +0000 UTC m=+8.699108810,LastTimestamp:2026-03-09 12:59:24.685025433 +0000 UTC m=+38.699493003,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 12:59:51 crc kubenswrapper[4723]: E0309 12:59:51.647248 4723 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2db539ecda52\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 12:59:51 crc kubenswrapper[4723]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2db539ecda52 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 12:59:51 crc kubenswrapper[4723]: body: Mar 09 12:59:51 crc kubenswrapper[4723]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 12:58:54.684576338 +0000 UTC m=+8.699043888,LastTimestamp:2026-03-09 12:59:34.68532542 +0000 UTC m=+48.699792990,Count:5,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 12:59:51 crc kubenswrapper[4723]: > Mar 09 12:59:51 crc kubenswrapper[4723]: I0309 12:59:51.684299 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:59:51 crc kubenswrapper[4723]: I0309 12:59:51.684437 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:51 crc kubenswrapper[4723]: I0309 12:59:51.685551 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:51 crc kubenswrapper[4723]: I0309 12:59:51.685611 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:51 crc kubenswrapper[4723]: I0309 12:59:51.685629 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:51 crc kubenswrapper[4723]: I0309 12:59:51.807100 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:52 crc kubenswrapper[4723]: I0309 12:59:52.617077 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:59:52 crc kubenswrapper[4723]: I0309 12:59:52.617205 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:52 crc kubenswrapper[4723]: I0309 12:59:52.618544 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:52 crc kubenswrapper[4723]: I0309 12:59:52.618590 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:52 crc kubenswrapper[4723]: I0309 12:59:52.618603 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:52 crc kubenswrapper[4723]: I0309 12:59:52.810273 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:53 crc kubenswrapper[4723]: I0309 12:59:53.805740 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:54 crc kubenswrapper[4723]: I0309 12:59:54.806140 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:54 crc kubenswrapper[4723]: I0309 12:59:54.881131 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:54 crc kubenswrapper[4723]: I0309 12:59:54.882801 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:54 crc kubenswrapper[4723]: I0309 12:59:54.882920 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:54 crc kubenswrapper[4723]: I0309 12:59:54.882942 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:54 crc kubenswrapper[4723]: I0309 12:59:54.883789 4723 scope.go:117] "RemoveContainer" containerID="4863f4037639c1a2b5d073d0651bd57975684c007e0262ebb8a44c4259e75f2d" Mar 09 12:59:55 crc kubenswrapper[4723]: I0309 12:59:55.192574 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 12:59:55 crc kubenswrapper[4723]: I0309 12:59:55.194050 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468"} Mar 09 12:59:55 crc kubenswrapper[4723]: I0309 12:59:55.194272 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:55 crc kubenswrapper[4723]: I0309 12:59:55.195802 4723 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:55 crc kubenswrapper[4723]: I0309 12:59:55.196124 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:55 crc kubenswrapper[4723]: I0309 12:59:55.196155 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:55 crc kubenswrapper[4723]: I0309 12:59:55.813681 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.196758 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.197101 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.198459 4723 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468" exitCode=255 Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.198490 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468"} Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.198522 4723 scope.go:117] "RemoveContainer" containerID="4863f4037639c1a2b5d073d0651bd57975684c007e0262ebb8a44c4259e75f2d" Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.198645 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.199408 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.199428 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.199436 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.199808 4723 scope.go:117] "RemoveContainer" containerID="093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468" Mar 09 12:59:56 crc kubenswrapper[4723]: E0309 12:59:56.199962 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 12:59:56 crc kubenswrapper[4723]: E0309 12:59:56.618445 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the 
namespace \"kube-node-lease\"" interval="7s" Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.633287 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.634219 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.634252 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.634263 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.634283 4723 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 12:59:56 crc kubenswrapper[4723]: E0309 12:59:56.638278 4723 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 12:59:56 crc kubenswrapper[4723]: I0309 12:59:56.808377 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:56 crc kubenswrapper[4723]: E0309 12:59:56.956149 4723 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 12:59:57 crc kubenswrapper[4723]: I0309 12:59:57.203246 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 12:59:57 crc kubenswrapper[4723]: I0309 12:59:57.805991 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:58 crc kubenswrapper[4723]: I0309 12:59:58.527300 4723 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 12:59:58 crc kubenswrapper[4723]: I0309 12:59:58.542581 4723 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 12:59:58 crc kubenswrapper[4723]: I0309 12:59:58.808049 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 12:59:59 crc kubenswrapper[4723]: I0309 12:59:59.114670 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 12:59:59 crc kubenswrapper[4723]: I0309 12:59:59.114929 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:59 crc kubenswrapper[4723]: I0309 12:59:59.116405 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:59 crc kubenswrapper[4723]: I0309 12:59:59.116462 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 09 12:59:59 crc kubenswrapper[4723]: I0309 12:59:59.116480 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:59 crc kubenswrapper[4723]: I0309 12:59:59.176121 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 12:59:59 crc kubenswrapper[4723]: I0309 12:59:59.176310 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 12:59:59 crc kubenswrapper[4723]: I0309 12:59:59.178389 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 12:59:59 crc kubenswrapper[4723]: I0309 12:59:59.178585 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 12:59:59 crc kubenswrapper[4723]: I0309 12:59:59.178658 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 12:59:59 crc kubenswrapper[4723]: I0309 12:59:59.179199 4723 scope.go:117] "RemoveContainer" containerID="093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468" Mar 09 12:59:59 crc kubenswrapper[4723]: E0309 12:59:59.179424 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 12:59:59 crc kubenswrapper[4723]: I0309 12:59:59.808678 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:00:00 crc kubenswrapper[4723]: I0309 13:00:00.807474 4723 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:00:01 crc kubenswrapper[4723]: I0309 13:00:01.793911 4723 csr.go:261] certificate signing request csr-xcgq9 is approved, waiting to be issued Mar 09 13:00:01 crc kubenswrapper[4723]: I0309 13:00:01.801736 4723 csr.go:257] certificate signing request csr-xcgq9 is issued Mar 09 13:00:01 crc kubenswrapper[4723]: I0309 13:00:01.823735 4723 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 09 13:00:02 crc kubenswrapper[4723]: I0309 13:00:02.650643 4723 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 09 13:00:02 crc kubenswrapper[4723]: I0309 13:00:02.802882 4723 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-01 23:08:58.094825699 +0000 UTC Mar 09 13:00:02 crc kubenswrapper[4723]: I0309 13:00:02.802927 4723 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6418h8m55.291903073s for next certificate rotation Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.639106 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 
13:00:03.640724 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.640769 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.640783 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.640960 4723 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.650008 4723 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.650258 4723 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 09 13:00:03 crc kubenswrapper[4723]: E0309 13:00:03.650279 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.654352 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.654382 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.654393 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.654408 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.654419 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:03Z","lastTransitionTime":"2026-03-09T13:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:03 crc kubenswrapper[4723]: E0309 13:00:03.675629 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.685905 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.685940 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.685950 4723 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.685968 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.685980 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:03Z","lastTransitionTime":"2026-03-09T13:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:03 crc kubenswrapper[4723]: E0309 13:00:03.700955 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.711853 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.711929 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.711945 4723 
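Every NodeNotReady flip in this burst traces back to one condition message: kubelet asks the container runtime for its network status, and the runtime keeps answering NetworkReady=false because no CNI configuration file exists in /etc/kubernetes/cni/net.d/ yet; on CRC that file is normally written by the OVN-Kubernetes node components once they start (an assumption based on the network-node-identity webhook named in the errors above, not stated in this log). A minimal diagnostic sketch from a shell on the node, assuming CRI-O is the runtime and crictl is installed:

    # Kubelet reports NetworkReady=false until at least one parseable CNI config appears here.
    $ ls -l /etc/kubernetes/cni/net.d/

    # Ask the runtime directly; crictl info prints the RuntimeReady/NetworkReady conditions as JSON.
    $ crictl info | grep -A 3 '"NetworkReady"'

An empty directory in the first command is consistent with the "no CNI configuration file" message repeated throughout this log.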
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.711968 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.711983 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:03Z","lastTransitionTime":"2026-03-09T13:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:03 crc kubenswrapper[4723]: E0309 13:00:03.727737 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.737482 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.737514 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.737526 4723 
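The repeated patch failures are not a kubelet-side problem as such: the API server routes node status patches through the node.network-node-identity.openshift.io admission webhook, which is served on the loopback address 127.0.0.1:9743, and nothing is listening there yet, so every patch bounces with connection refused. A hedged check, where the openshift-network-node-identity namespace name is an assumption and not taken from this log:

    # Is anything listening on the webhook port yet?
    $ ss -tlnp | grep ':9743'

    # Find the webhook configuration that fronts node patches.
    $ oc get validatingwebhookconfigurations | grep -i network-node-identity

    # Assumed namespace: check whether the serving pods have started at all.
    $ oc -n openshift-network-node-identity get pods -o wide

If the first command returns nothing, the connection-refused errors above will continue until the serving pod binds the port.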
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.737543 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:03 crc kubenswrapper[4723]: I0309 13:00:03.737555 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:03Z","lastTransitionTime":"2026-03-09T13:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:03 crc kubenswrapper[4723]: E0309 13:00:03.753702 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:00:03 crc kubenswrapper[4723]: E0309 13:00:03.754007 4723 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:00:03 crc kubenswrapper[4723]: E0309 13:00:03.754054 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:03 crc kubenswrapper[4723]: E0309 13:00:03.855082 4723 
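After the final attempt the kubelet gives up for this sync cycle ("update node status exceeds retry count"). In the upstream kubelet the retry budget is a small constant (nodeStatusUpdateRetry = 5; treat the exact value as an assumption for this build), and the whole budget was burned inside one second here, so the node object stays stale until the next sync interval. The burst can be confirmed and counted from the journal; the kubelet unit name is assumed:

    # Count the failed patch attempts in the 13:00:03 burst seen above.
    $ journalctl -u kubelet --since "13:00:03" --until "13:00:04" \
        | grep -c 'Error updating node status, will retry'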
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:03 crc kubenswrapper[4723]: E0309 13:00:03.956086 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:04 crc kubenswrapper[4723]: E0309 13:00:04.056268 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:04 crc kubenswrapper[4723]: E0309 13:00:04.156727 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:04 crc kubenswrapper[4723]: E0309 13:00:04.257850 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:04 crc kubenswrapper[4723]: E0309 13:00:04.359606 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:04 crc kubenswrapper[4723]: E0309 13:00:04.460203 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:04 crc kubenswrapper[4723]: I0309 13:00:04.471585 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:00:04 crc kubenswrapper[4723]: I0309 13:00:04.471776 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:00:04 crc kubenswrapper[4723]: I0309 13:00:04.473128 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:04 crc kubenswrapper[4723]: I0309 13:00:04.473167 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:04 crc kubenswrapper[4723]: I0309 13:00:04.473181 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:04 crc kubenswrapper[4723]: I0309 13:00:04.473984 4723 scope.go:117] "RemoveContainer" containerID="093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468" Mar 09 13:00:04 crc kubenswrapper[4723]: E0309 13:00:04.474208 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:00:04 crc kubenswrapper[4723]: E0309 13:00:04.560839 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:04 crc kubenswrapper[4723]: E0309 13:00:04.661006 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:04 crc kubenswrapper[4723]: E0309 13:00:04.761363 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:04 crc kubenswrapper[4723]: E0309 13:00:04.862039 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:04 crc kubenswrapper[4723]: E0309 13:00:04.963097 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:05 crc kubenswrapper[4723]: E0309 13:00:05.064013 
4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:05 crc kubenswrapper[4723]: E0309 13:00:05.164598 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:05 crc kubenswrapper[4723]: E0309 13:00:05.265554 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:05 crc kubenswrapper[4723]: E0309 13:00:05.365702 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:05 crc kubenswrapper[4723]: E0309 13:00:05.466842 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:05 crc kubenswrapper[4723]: E0309 13:00:05.567781 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:05 crc kubenswrapper[4723]: E0309 13:00:05.668672 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:05 crc kubenswrapper[4723]: E0309 13:00:05.770043 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:05 crc kubenswrapper[4723]: E0309 13:00:05.870737 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:05 crc kubenswrapper[4723]: E0309 13:00:05.970942 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:06 crc kubenswrapper[4723]: E0309 13:00:06.071491 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:06 crc kubenswrapper[4723]: E0309 13:00:06.172313 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:06 crc kubenswrapper[4723]: E0309 13:00:06.272488 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:06 crc kubenswrapper[4723]: E0309 13:00:06.373312 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:06 crc kubenswrapper[4723]: E0309 13:00:06.473780 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:06 crc kubenswrapper[4723]: E0309 13:00:06.574169 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:06 crc kubenswrapper[4723]: E0309 13:00:06.675291 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:06 crc kubenswrapper[4723]: E0309 13:00:06.775691 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:06 crc kubenswrapper[4723]: E0309 13:00:06.875980 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:06 crc kubenswrapper[4723]: E0309 13:00:06.956764 4723 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:00:06 crc kubenswrapper[4723]: E0309 13:00:06.977058 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:07 crc 
kubenswrapper[4723]: E0309 13:00:07.078123 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:07 crc kubenswrapper[4723]: E0309 13:00:07.179257 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:07 crc kubenswrapper[4723]: E0309 13:00:07.280296 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:07 crc kubenswrapper[4723]: E0309 13:00:07.381161 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:07 crc kubenswrapper[4723]: E0309 13:00:07.481998 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:07 crc kubenswrapper[4723]: E0309 13:00:07.582163 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:07 crc kubenswrapper[4723]: E0309 13:00:07.683340 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:07 crc kubenswrapper[4723]: E0309 13:00:07.784331 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:07 crc kubenswrapper[4723]: E0309 13:00:07.885327 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:07 crc kubenswrapper[4723]: E0309 13:00:07.986124 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:08 crc kubenswrapper[4723]: E0309 13:00:08.086274 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:08 crc kubenswrapper[4723]: E0309 13:00:08.187396 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:08 crc kubenswrapper[4723]: E0309 13:00:08.287946 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:08 crc kubenswrapper[4723]: E0309 13:00:08.388742 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:08 crc kubenswrapper[4723]: E0309 13:00:08.489939 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:08 crc kubenswrapper[4723]: E0309 13:00:08.591045 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:08 crc kubenswrapper[4723]: E0309 13:00:08.691470 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:08 crc kubenswrapper[4723]: E0309 13:00:08.792191 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:08 crc kubenswrapper[4723]: E0309 13:00:08.893000 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:08 crc kubenswrapper[4723]: E0309 13:00:08.993834 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:09 crc kubenswrapper[4723]: E0309 13:00:09.094414 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
09 13:00:09 crc kubenswrapper[4723]: E0309 13:00:09.195569 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:09 crc kubenswrapper[4723]: E0309 13:00:09.295928 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:09 crc kubenswrapper[4723]: E0309 13:00:09.396102 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:09 crc kubenswrapper[4723]: E0309 13:00:09.497314 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:09 crc kubenswrapper[4723]: E0309 13:00:09.598360 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:09 crc kubenswrapper[4723]: E0309 13:00:09.698708 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:09 crc kubenswrapper[4723]: E0309 13:00:09.798965 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:09 crc kubenswrapper[4723]: E0309 13:00:09.899711 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:10 crc kubenswrapper[4723]: E0309 13:00:10.000505 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:10 crc kubenswrapper[4723]: E0309 13:00:10.101349 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:10 crc kubenswrapper[4723]: E0309 13:00:10.202318 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:10 crc kubenswrapper[4723]: I0309 13:00:10.276571 4723 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 09 13:00:10 crc kubenswrapper[4723]: E0309 13:00:10.302734 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:10 crc kubenswrapper[4723]: E0309 13:00:10.403144 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:10 crc kubenswrapper[4723]: E0309 13:00:10.504232 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:10 crc kubenswrapper[4723]: E0309 13:00:10.605304 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:10 crc kubenswrapper[4723]: E0309 13:00:10.706302 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:10 crc kubenswrapper[4723]: E0309 13:00:10.806696 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:10 crc kubenswrapper[4723]: E0309 13:00:10.907388 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:11 crc kubenswrapper[4723]: E0309 13:00:11.007577 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:11 crc kubenswrapper[4723]: E0309 13:00:11.108560 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 09 13:00:11 crc kubenswrapper[4723]: E0309 13:00:11.208975 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:11 crc kubenswrapper[4723]: E0309 13:00:11.309939 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:11 crc kubenswrapper[4723]: E0309 13:00:11.411169 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:11 crc kubenswrapper[4723]: E0309 13:00:11.512100 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:11 crc kubenswrapper[4723]: E0309 13:00:11.613108 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:11 crc kubenswrapper[4723]: E0309 13:00:11.713612 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:11 crc kubenswrapper[4723]: E0309 13:00:11.813718 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:11 crc kubenswrapper[4723]: E0309 13:00:11.914979 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:12 crc kubenswrapper[4723]: E0309 13:00:12.015964 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:12 crc kubenswrapper[4723]: E0309 13:00:12.116915 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:12 crc kubenswrapper[4723]: E0309 13:00:12.217590 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:12 crc kubenswrapper[4723]: E0309 13:00:12.318330 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:12 crc kubenswrapper[4723]: E0309 13:00:12.419187 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:12 crc kubenswrapper[4723]: E0309 13:00:12.519976 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:12 crc kubenswrapper[4723]: E0309 13:00:12.620941 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:12 crc kubenswrapper[4723]: E0309 13:00:12.721783 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:12 crc kubenswrapper[4723]: E0309 13:00:12.822381 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:12 crc kubenswrapper[4723]: E0309 13:00:12.923147 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:13 crc kubenswrapper[4723]: E0309 13:00:13.023390 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:13 crc kubenswrapper[4723]: E0309 13:00:13.124074 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:13 crc kubenswrapper[4723]: E0309 13:00:13.224192 4723 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 13:00:13 crc kubenswrapper[4723]: E0309 13:00:13.324343 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:13 crc kubenswrapper[4723]: E0309 13:00:13.425519 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:13 crc kubenswrapper[4723]: E0309 13:00:13.526317 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:13 crc kubenswrapper[4723]: E0309 13:00:13.627053 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:13 crc kubenswrapper[4723]: E0309 13:00:13.727277 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:13 crc kubenswrapper[4723]: E0309 13:00:13.827404 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:13 crc kubenswrapper[4723]: E0309 13:00:13.927730 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:14 crc kubenswrapper[4723]: E0309 13:00:14.028321 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:14 crc kubenswrapper[4723]: E0309 13:00:14.128565 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:14 crc kubenswrapper[4723]: E0309 13:00:14.139922 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 13:00:14 crc kubenswrapper[4723]: I0309 13:00:14.144967 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:14 crc kubenswrapper[4723]: I0309 13:00:14.145022 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:14 crc kubenswrapper[4723]: I0309 13:00:14.145039 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:14 crc kubenswrapper[4723]: I0309 13:00:14.145064 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:14 crc kubenswrapper[4723]: I0309 13:00:14.145081 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:14Z","lastTransitionTime":"2026-03-09T13:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:14 crc kubenswrapper[4723]: E0309 13:00:14.160190 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:00:14 crc kubenswrapper[4723]: I0309 13:00:14.164340 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:14 crc kubenswrapper[4723]: I0309 13:00:14.164375 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:14 crc kubenswrapper[4723]: I0309 13:00:14.164385 4723 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:14 crc kubenswrapper[4723]: I0309 13:00:14.164400 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:14 crc kubenswrapper[4723]: I0309 13:00:14.164412 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:14Z","lastTransitionTime":"2026-03-09T13:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:14 crc kubenswrapper[4723]: E0309 13:00:14.220975 4723 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:00:14 crc kubenswrapper[4723]: E0309 13:00:14.228848 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:14 crc kubenswrapper[4723]: E0309 13:00:14.329594 4723 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:14 crc kubenswrapper[4723]: E0309 13:00:14.430254 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:14 crc kubenswrapper[4723]: E0309 13:00:14.530375 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:14 crc kubenswrapper[4723]: E0309 13:00:14.631424 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:14 crc kubenswrapper[4723]: E0309 13:00:14.732002 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:14 crc kubenswrapper[4723]: E0309 13:00:14.832910 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:14 crc kubenswrapper[4723]: E0309 13:00:14.933225 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:15 crc kubenswrapper[4723]: E0309 13:00:15.033940 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:15 crc kubenswrapper[4723]: E0309 13:00:15.134315 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:15 crc kubenswrapper[4723]: E0309 13:00:15.235477 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:15 crc kubenswrapper[4723]: E0309 13:00:15.335627 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:15 crc kubenswrapper[4723]: E0309 13:00:15.435744 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:15 crc kubenswrapper[4723]: E0309 13:00:15.536955 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:15 crc kubenswrapper[4723]: E0309 13:00:15.638109 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:15 crc kubenswrapper[4723]: E0309 13:00:15.738506 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:15 crc kubenswrapper[4723]: E0309 13:00:15.838905 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:15 crc kubenswrapper[4723]: I0309 13:00:15.880707 4723 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:00:15 crc kubenswrapper[4723]: I0309 13:00:15.882359 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:15 crc kubenswrapper[4723]: I0309 13:00:15.882419 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:15 crc kubenswrapper[4723]: I0309 13:00:15.882433 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:15 crc kubenswrapper[4723]: E0309 13:00:15.939671 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 
13:00:16.040596 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.141277 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.242198 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.342581 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.442772 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.543838 4723 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.551848 4723 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.646551 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.646589 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.646598 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.646614 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.646623 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:16Z","lastTransitionTime":"2026-03-09T13:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.750355 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.750427 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.750443 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.750466 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.750485 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:16Z","lastTransitionTime":"2026-03-09T13:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.760971 4723 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.853471 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.853541 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.853550 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.853565 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.853575 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:16Z","lastTransitionTime":"2026-03-09T13:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.897655 4723 apiserver.go:52] "Watching apiserver" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.903903 4723 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.904212 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.904650 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.904715 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.904884 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.905054 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.905145 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.905243 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.905597 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.905609 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.905668 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.909171 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.909998 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.910200 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.910419 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.910610 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.911008 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.911093 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.911017 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.913030 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.914257 4723 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.918502 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.918583 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.918637 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.918686 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.918733 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.918787 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.918829 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.918914 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.918965 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919013 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919059 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919107 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919153 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919161 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919199 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919245 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919291 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919337 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919384 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919438 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919485 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919534 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919576 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919619 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919651 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919661 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919708 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919753 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919800 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919848 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919951 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919998 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920041 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920088 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920136 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920185 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920232 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920278 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920327 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920375 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920423 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920471 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920519 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920564 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920629 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920683 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920725 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920762 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920799 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920837 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920917 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920956 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920993 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921030 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921070 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921110 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921151 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921201 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921252 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921292 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921327 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921363 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921394 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921424 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921459 
4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921491 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921524 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921556 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921589 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921622 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921652 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921735 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921786 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919800 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919809 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.919904 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921835 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921921 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921980 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922029 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922073 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922116 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922167 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 
13:00:16.922215 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922260 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922309 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922410 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922460 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922507 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922553 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922599 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922644 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922689 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922740 4723 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922789 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922836 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922919 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922969 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923015 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923061 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923116 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923171 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923219 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 
13:00:16.923268 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923318 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923369 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923415 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923603 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923673 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923723 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923780 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923828 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923936 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 
13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923988 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924043 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924094 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924144 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924199 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924251 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924302 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924350 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924403 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924453 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924504 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924557 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924608 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924659 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924713 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924768 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924817 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924900 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924954 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925006 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925058 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925118 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925167 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925219 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925267 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925320 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925363 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925400 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925434 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925474 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925508 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925544 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925576 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925620 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925653 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925723 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925774 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925817 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925854 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925934 4723 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925983 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926036 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926082 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926134 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926185 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926235 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926287 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926335 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926382 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926434 4723 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926485 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926539 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926591 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926642 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926693 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926749 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926801 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926852 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926940 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926993 4723 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927047 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927105 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927164 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927213 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927262 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927313 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927364 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927408 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927444 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:00:16 crc 
kubenswrapper[4723]: I0309 13:00:16.927478 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927514 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927560 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927612 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927662 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927715 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927766 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927823 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927939 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927994 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928047 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928103 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928158 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928211 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928265 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928320 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928366 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920031 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920108 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920158 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920228 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920364 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920433 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928819 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920652 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920658 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.920843 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921004 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921052 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921071 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921747 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921804 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921898 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.921949 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922178 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922209 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922213 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922406 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922479 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922493 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922676 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922726 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922881 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.922832 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923077 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923112 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923237 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923268 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923454 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923536 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923877 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.923947 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924110 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924128 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924330 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924477 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.924570 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925580 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925765 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.925813 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926082 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926256 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926468 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926762 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926805 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.926804 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927075 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927674 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927747 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927790 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.927965 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928092 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928116 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928306 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928524 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928626 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928885 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.929124 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.929289 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.929277 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.929450 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.929498 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.929678 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.928374 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.931330 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.931545 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.931747 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.931939 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.932061 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.932180 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.932289 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.932414 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.934760 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.935418 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.935536 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.935663 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936205 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936250 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936291 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936394 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936417 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936439 4723 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936461 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936482 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936502 4723 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936523 4723 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936552 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936582 4723 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936613 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936639 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936668 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936699 4723 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936730 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936757 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936789 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936818 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936849 4723 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936928 4723 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936963 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936997 4723 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937028 4723 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937059 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937084 4723 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937104 4723 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937127 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937152 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937173 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937195 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937230 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937259 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937286 4723 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.938342 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.930991 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.931253 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.931435 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.931459 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.931544 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.931552 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.931732 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.931777 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.931851 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.932003 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.932062 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.942813 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.932537 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.932622 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.932837 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.933105 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.933099 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.933109 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.933137 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.933213 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.933234 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.933313 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.933590 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.933609 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.933753 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.933766 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.933837 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.934001 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.934404 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.934587 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.934878 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.935165 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.935290 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.935281 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.935502 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.935560 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936219 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936391 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936419 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936473 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.936828 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937285 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937377 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937631 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937876 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.937978 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.939005 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.939182 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.939224 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.939504 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.939565 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.940291 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.940124 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.940350 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.940409 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.940527 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.940532 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.940612 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:00:17.440596589 +0000 UTC m=+91.455064129 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.940745 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.940798 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.940981 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.940971 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.941387 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.941803 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.942783 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.943392 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.943273 4723 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.944532 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.944652 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.944825 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.944889 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.944944 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.945084 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:17.44506322 +0000 UTC m=+91.459530770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.943492 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.945314 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.945365 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.945688 4723 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.946319 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.946401 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.946706 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.947009 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.947123 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.947594 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.947695 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.947900 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.947918 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.947938 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.948022 4723 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.948109 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:17.448050704 +0000 UTC m=+91.462518244 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.948360 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.948302 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.952574 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.955083 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.955106 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.955648 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.955741 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.955946 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.955957 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.955970 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.955979 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.955993 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.956003 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:16Z","lastTransitionTime":"2026-03-09T13:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.956166 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.956266 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.956281 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.957170 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.957685 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.957698 4723 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.958766 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:17.457792527 +0000 UTC m=+91.472260067 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.958916 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.958927 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.958935 4723 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:16 crc kubenswrapper[4723]: E0309 13:00:16.958960 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:17.458953316 +0000 UTC m=+91.473420856 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.961429 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.961553 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.961711 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.967883 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.967994 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.968180 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.968569 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.968960 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.971829 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.972957 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.973531 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.973584 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.973610 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.973637 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.974026 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.975045 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.975328 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.975357 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.975473 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.975736 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.976427 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.976465 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.976561 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.976618 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.977073 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). 
InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.977456 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.978099 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.978364 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.983348 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.985141 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.985136 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.986058 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.986122 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.986142 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.986806 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.986849 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.987182 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.987540 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.995266 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:00:16 crc kubenswrapper[4723]: I0309 13:00:16.997560 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.003318 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.009459 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.012820 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.039048 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.039149 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.039298 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.039515 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.039599 4723 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.039671 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.039750 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.039830 4723 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.039922 4723 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040009 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040094 4723 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040185 4723 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040270 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040369 4723 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.039334 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040449 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040504 4723 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040523 4723 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040538 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040552 4723 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040566 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040578 4723 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040591 4723 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040603 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040615 4723 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040630 4723 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040642 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040655 4723 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040667 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040679 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040691 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040703 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040714 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040726 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040737 4723 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040749 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040760 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040772 4723 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040783 4723 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040795 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040806 4723 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040818 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040831 4723 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040843 4723 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040854 4723 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040887 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040899 4723 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040910 4723 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040922 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040934 4723 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040945 4723 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040957 4723 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040969 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040980 4723 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.040992 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041006 4723 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041018 4723 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041030 4723 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041041 4723 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041053 4723 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041064 4723 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041074 4723 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041086 4723 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041098 4723 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041110 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041121 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041133 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041147 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041158 4723 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041170 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041182 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041194 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041204 4723 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041216 4723 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041227 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041239 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041253 4723 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041266 4723 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041277 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041289 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041300 4723 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041311 4723 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041323 4723 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041335 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041347 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041361 4723 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041372 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041384 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041396 4723 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041408 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041419 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041430 4723 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041442 4723 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041454 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041466 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041477 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041489 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041500 4723 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041512 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041524 4723 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041535 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041547 4723 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041559 4723 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041570 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041582 4723 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041593 4723 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041604 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041615 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041626 4723 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041639 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041650 4723 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041661 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041672 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041683 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041695 4723 reconciler_common.go:293] "Volume detached for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041706 4723 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041717 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041728 4723 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041740 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041752 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041764 4723 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041775 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041787 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041798 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041809 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041822 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041833 4723 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041844 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041856 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041892 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041909 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041925 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041941 4723 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041953 4723 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.041965 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042002 4723 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042014 4723 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042025 4723 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042036 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042047 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042059 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042071 4723 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042082 4723 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042093 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042104 4723 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042115 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042127 4723 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042138 4723 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042149 4723 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042160 4723 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042171 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042183 4723 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042196 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042207 4723 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042218 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042229 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042240 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042252 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042263 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042275 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042285 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042298 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042310 4723 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042322 4723 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.042334 4723 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.058486 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.058527 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.058536 
4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.058551 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.058560 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:17Z","lastTransitionTime":"2026-03-09T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.090923 4723 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.161654 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.161712 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.161729 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.161754 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.161771 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:17Z","lastTransitionTime":"2026-03-09T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.218941 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.243102 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.245637 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.263909 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9ec420a57851b2692481d02d020c62beee66a4094ede71c0d18d1a68c71b2e6b"} Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.266309 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.266342 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.266354 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.266372 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.266383 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:17Z","lastTransitionTime":"2026-03-09T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:17 crc kubenswrapper[4723]: W0309 13:00:17.272803 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-39461a318d6e290a8e8516c3d2f9f391034aee37e7b67c952ed349e2074da17e WatchSource:0}: Error finding container 39461a318d6e290a8e8516c3d2f9f391034aee37e7b67c952ed349e2074da17e: Status 404 returned error can't find the container with id 39461a318d6e290a8e8516c3d2f9f391034aee37e7b67c952ed349e2074da17e Mar 09 13:00:17 crc kubenswrapper[4723]: W0309 13:00:17.279266 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-6e98265b1ad9f39b06cf5131ac8238a897861e3ae41d9c2dc9d27e23b71da4f0 WatchSource:0}: Error finding container 6e98265b1ad9f39b06cf5131ac8238a897861e3ae41d9c2dc9d27e23b71da4f0: Status 404 returned error can't find the container with id 6e98265b1ad9f39b06cf5131ac8238a897861e3ae41d9c2dc9d27e23b71da4f0 Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.368265 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.368294 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.368303 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.368316 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.368325 4723 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:17Z","lastTransitionTime":"2026-03-09T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.445987 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:00:17 crc kubenswrapper[4723]: E0309 13:00:17.446154 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:00:18.446132589 +0000 UTC m=+92.460600129 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.446229 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:17 crc kubenswrapper[4723]: E0309 13:00:17.446313 4723 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:00:17 crc kubenswrapper[4723]: E0309 13:00:17.446357 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:18.446349134 +0000 UTC m=+92.460816674 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.470756 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.470803 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.470819 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.470838 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.470853 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:17Z","lastTransitionTime":"2026-03-09T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.546789 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.546835 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.546878 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:17 crc kubenswrapper[4723]: E0309 13:00:17.546961 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:00:17 crc kubenswrapper[4723]: E0309 13:00:17.546984 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:00:17 crc kubenswrapper[4723]: E0309 13:00:17.546996 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:00:17 crc 
kubenswrapper[4723]: E0309 13:00:17.547004 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:00:17 crc kubenswrapper[4723]: E0309 13:00:17.547009 4723 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:17 crc kubenswrapper[4723]: E0309 13:00:17.547015 4723 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:17 crc kubenswrapper[4723]: E0309 13:00:17.547062 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:18.547050182 +0000 UTC m=+92.561517722 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:17 crc kubenswrapper[4723]: E0309 13:00:17.547057 4723 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:00:17 crc kubenswrapper[4723]: E0309 13:00:17.547077 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:18.547072232 +0000 UTC m=+92.561539772 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:17 crc kubenswrapper[4723]: E0309 13:00:17.547168 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:18.547148464 +0000 UTC m=+92.561616004 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.573623 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.573673 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.573686 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.573704 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.573715 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:17Z","lastTransitionTime":"2026-03-09T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.676698 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.676746 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.676758 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.676778 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.676793 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:17Z","lastTransitionTime":"2026-03-09T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.779191 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.779232 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.779241 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.779256 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.779266 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:17Z","lastTransitionTime":"2026-03-09T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.882200 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.882267 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.882288 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.882316 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.882337 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:17Z","lastTransitionTime":"2026-03-09T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.985116 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.986111 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.986147 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.986186 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:17 crc kubenswrapper[4723]: I0309 13:00:17.986204 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:17Z","lastTransitionTime":"2026-03-09T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.089509 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.089584 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.089605 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.089632 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.089650 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:18Z","lastTransitionTime":"2026-03-09T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.192076 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.192147 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.192170 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.192196 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.192214 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:18Z","lastTransitionTime":"2026-03-09T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.269381 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.271556 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6e98265b1ad9f39b06cf5131ac8238a897861e3ae41d9c2dc9d27e23b71da4f0"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.276394 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.276469 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.276493 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"39461a318d6e290a8e8516c3d2f9f391034aee37e7b67c952ed349e2074da17e"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.291405 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:18Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.295834 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.295910 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.295932 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.295953 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.295970 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:18Z","lastTransitionTime":"2026-03-09T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.308747 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:18Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.329757 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:18Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.344303 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:18Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.361753 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:18Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.376073 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:18Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.392828 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:18Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.397959 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.397994 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.398004 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.398019 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.398027 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:18Z","lastTransitionTime":"2026-03-09T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.406116 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:18Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.426495 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:18Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.445136 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:18Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.455926 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.456059 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.456192 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:00:20.456153001 +0000 UTC m=+94.470620591 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.456209 4723 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.456329 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:20.456305385 +0000 UTC m=+94.470772925 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.461849 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:18Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.475307 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:18Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.500639 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.500676 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.500686 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.500700 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.500711 4723 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:18Z","lastTransitionTime":"2026-03-09T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.557231 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.557272 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.557292 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.557428 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.557449 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.557462 4723 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.557480 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.557509 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:20.557490485 +0000 UTC m=+94.571958025 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.557520 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.557509 4723 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.557608 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:20.557588167 +0000 UTC m=+94.572055717 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.557533 4723 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.557699 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:20.55768203 +0000 UTC m=+94.572149570 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.602503 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.602545 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.602556 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.602574 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.602587 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:18Z","lastTransitionTime":"2026-03-09T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.705546 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.705632 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.705682 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.705762 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.705853 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:18Z","lastTransitionTime":"2026-03-09T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.808258 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.808288 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.808296 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.808310 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.808321 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:18Z","lastTransitionTime":"2026-03-09T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.882408 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.882507 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.882757 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.882806 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.883226 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.883287 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.885095 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.885789 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.886468 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.887072 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.887765 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.888289 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.890067 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.890595 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.891515 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.892094 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.892551 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.893667 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.894291 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.895190 4723 scope.go:117] "RemoveContainer" containerID="093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468" Mar 
09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.895235 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: E0309 13:00:18.895435 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.895759 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.896663 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.897231 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.897585 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.898600 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.899182 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.899615 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.900620 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.902073 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.903848 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.904253 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.905297 4723 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.906075 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.907948 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.908527 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.909447 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.910691 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.910723 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.910732 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.910745 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.910755 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:18Z","lastTransitionTime":"2026-03-09T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.910704 4723 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.910980 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.912661 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.913174 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.914062 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.915949 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.916693 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.917601 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.918395 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.919473 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.919957 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.920926 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.921499 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.922446 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.922901 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.923750 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.924248 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.925438 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.925894 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.926676 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.927131 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.928011 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.928689 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.929254 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 09 13:00:18 crc kubenswrapper[4723]: I0309 13:00:18.930041 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.014047 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.014112 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.014129 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.014153 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.014165 4723 setters.go:603] "Node became not ready" node="crc" 
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.116213 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.116297 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.116317 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.116339 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.116355 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:19Z","lastTransitionTime":"2026-03-09T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.218274 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.218326 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.218344 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.218368 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.218384 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:19Z","lastTransitionTime":"2026-03-09T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.280156 4723 scope.go:117] "RemoveContainer" containerID="093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468"
Mar 09 13:00:19 crc kubenswrapper[4723]: E0309 13:00:19.280444 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.321095 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.321158 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.321182 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.321266 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.321293 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:19Z","lastTransitionTime":"2026-03-09T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.424240 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.424310 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.424333 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.424365 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.424389 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:19Z","lastTransitionTime":"2026-03-09T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.526787 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.526829 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.526837 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.526851 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.526900 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:19Z","lastTransitionTime":"2026-03-09T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.629809 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.629874 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.629887 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.629906 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.629916 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:19Z","lastTransitionTime":"2026-03-09T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.732345 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.732380 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.732388 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.732401 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.732412 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:19Z","lastTransitionTime":"2026-03-09T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.834746 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.834798 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.834808 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.834827 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.834838 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:19Z","lastTransitionTime":"2026-03-09T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.886003 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.937626 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.937714 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.937754 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.937787 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:19 crc kubenswrapper[4723]: I0309 13:00:19.937813 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:19Z","lastTransitionTime":"2026-03-09T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.040179 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.040213 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.040222 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.040236 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.040246 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:20Z","lastTransitionTime":"2026-03-09T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.141891 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.141922 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.141930 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.141945 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.141958 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:20Z","lastTransitionTime":"2026-03-09T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.244266 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.244335 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.244354 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.244381 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.244398 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:20Z","lastTransitionTime":"2026-03-09T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.284258 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487"}
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.302524 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:20Z is after 2025-08-24T17:21:41Z"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.315361 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:20Z is after 2025-08-24T17:21:41Z"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.327682 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:20Z is after 2025-08-24T17:21:41Z"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.344730 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:20Z is after 2025-08-24T17:21:41Z"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.346437 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.346482 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.346501 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.346523 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.346538 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:20Z","lastTransitionTime":"2026-03-09T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.353833 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:20Z is after 2025-08-24T17:21:41Z"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.365779 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:20Z is after 2025-08-24T17:21:41Z"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.376962 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:20Z is after 2025-08-24T17:21:41Z"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.387164 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:20Z is after 2025-08-24T17:21:41Z"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.449827 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.449992 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.450060 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.450084 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.450097 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:20Z","lastTransitionTime":"2026-03-09T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.473507 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.473636 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.473666 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:00:24.473634995 +0000 UTC m=+98.488102545 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.473740 4723 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.473817 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:24.473797049 +0000 UTC m=+98.488264629 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.552235 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.552284 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.552295 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.552311 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.552324 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:20Z","lastTransitionTime":"2026-03-09T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.574929 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.575005 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.575034 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.575144 4723 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.575178 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.575198 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.575211 4723 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.575146 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.575262 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:24.575226445 +0000 UTC m=+98.589694015 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.575278 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.575303 4723 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.575308 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:24.575287507 +0000 UTC m=+98.589755157 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.575396 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:24.575372969 +0000 UTC m=+98.589840589 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.654380 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.654430 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.654445 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.654461 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.654472 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:20Z","lastTransitionTime":"2026-03-09T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.757168 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.757214 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.757224 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.757239 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.757252 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:20Z","lastTransitionTime":"2026-03-09T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.859953 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.860023 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.860036 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.860055 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.860068 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:20Z","lastTransitionTime":"2026-03-09T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.880369 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.880396 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.880369 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.880475 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.880578 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:20 crc kubenswrapper[4723]: E0309 13:00:20.880711 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
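
Every "Node became not ready" record above carries the same root cause: the container runtime reports NetworkReady=false because no CNI configuration file has been written to /etc/kubernetes/cni/net.d/ yet, which typically happens only once the cluster network operator's pods come up. A minimal sketch of the presence check the message implies, assuming the conventional CNI config extensions:

    // cni_check.go - illustrative sketch, not the runtime's actual code.
    // Looks for CNI network configs the way the error message implies: any
    // *.conf, *.conflist, or *.json file under the configured directory.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // directory named in the log
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Println("cannot read CNI conf dir:", err)
            return
        }
        found := false
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json": // conventional CNI config extensions
                fmt.Println("found CNI config:", e.Name())
                found = true
            }
        }
        if !found {
            fmt.Println("no CNI configuration file present; network plugin not ready")
        }
    }
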
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.962385 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.962424 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.962433 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.962449 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:20 crc kubenswrapper[4723]: I0309 13:00:20.962460 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:20Z","lastTransitionTime":"2026-03-09T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.065065 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.065087 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.065097 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.065112 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.065123 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:21Z","lastTransitionTime":"2026-03-09T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.167751 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.167795 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.167812 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.167836 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.167854 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:21Z","lastTransitionTime":"2026-03-09T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.271197 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.271245 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.271265 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.271286 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.271302 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:21Z","lastTransitionTime":"2026-03-09T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.373920 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.373968 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.373980 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.373996 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.374009 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:21Z","lastTransitionTime":"2026-03-09T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.476451 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.476494 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.476502 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.476516 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.476535 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:21Z","lastTransitionTime":"2026-03-09T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.578520 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.578565 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.578577 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.578594 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.578605 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:21Z","lastTransitionTime":"2026-03-09T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.680432 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.680471 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.680479 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.680491 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.680500 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:21Z","lastTransitionTime":"2026-03-09T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.782790 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.782834 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.782845 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.782882 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.782895 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:21Z","lastTransitionTime":"2026-03-09T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.885412 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.885443 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.885460 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.885476 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.885487 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:21Z","lastTransitionTime":"2026-03-09T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.988923 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.988956 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.988964 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.988976 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:21 crc kubenswrapper[4723]: I0309 13:00:21.988985 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:21Z","lastTransitionTime":"2026-03-09T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.090851 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.090944 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.090963 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.090983 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.090995 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:22Z","lastTransitionTime":"2026-03-09T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.193171 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.193215 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.193242 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.193266 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.193281 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:22Z","lastTransitionTime":"2026-03-09T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.295457 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.295503 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.295516 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.295539 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.295556 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:22Z","lastTransitionTime":"2026-03-09T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.397642 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.397683 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.397694 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.397710 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.397720 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:22Z","lastTransitionTime":"2026-03-09T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.499736 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.499799 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.499814 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.499843 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.499886 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:22Z","lastTransitionTime":"2026-03-09T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.602570 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.602625 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.602637 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.602656 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.602669 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:22Z","lastTransitionTime":"2026-03-09T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.705165 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.705203 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.705211 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.705225 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.705234 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:22Z","lastTransitionTime":"2026-03-09T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.807951 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.808029 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.808051 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.808082 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.808120 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:22Z","lastTransitionTime":"2026-03-09T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.880552 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.880613 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.880648 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:22 crc kubenswrapper[4723]: E0309 13:00:22.880686 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
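
The sandbox messages above recur on a fixed cadence (13:00:20.880, then 13:00:22.880): the pod workers skip syncing these three pods roughly every 2 seconds until the network plugin reports ready. A minimal sketch, assuming the same hypothetical kubeconfig path as above, that watches for the node's Ready condition to flip:

    // wait_ready.go - illustrative sketch, not kubelet code. Polls the node's
    // Ready condition, mirroring what the repeated "Error syncing pod,
    // skipping" entries are waiting for.
    package main

    import (
        "context"
        "fmt"
        "time"

        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig") // assumed path
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        for {
            node, err := cs.CoreV1().Nodes().Get(context.Background(), "crc", metav1.GetOptions{})
            if err == nil {
                for _, c := range node.Status.Conditions {
                    if c.Type == corev1.NodeReady {
                        fmt.Printf("%s Ready=%s reason=%s\n", time.Now().Format(time.RFC3339), c.Status, c.Reason)
                        if c.Status == corev1.ConditionTrue {
                            return
                        }
                    }
                }
            }
            time.Sleep(2 * time.Second) // roughly the retry cadence seen in the log
        }
    }
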
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:22 crc kubenswrapper[4723]: E0309 13:00:22.880825 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:22 crc kubenswrapper[4723]: E0309 13:00:22.880999 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.910355 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.910389 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.910401 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.910434 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:22 crc kubenswrapper[4723]: I0309 13:00:22.910443 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:22Z","lastTransitionTime":"2026-03-09T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.012522 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.012562 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.012588 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.012604 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.012615 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:23Z","lastTransitionTime":"2026-03-09T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.115158 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.115198 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.115210 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.115226 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.115237 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:23Z","lastTransitionTime":"2026-03-09T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.218581 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.218646 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.218657 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.218687 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.218705 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:23Z","lastTransitionTime":"2026-03-09T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.320472 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.320511 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.320527 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.320550 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.320560 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:23Z","lastTransitionTime":"2026-03-09T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.422708 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.422746 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.422754 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.422768 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.422779 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:23Z","lastTransitionTime":"2026-03-09T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.536056 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.536082 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.536092 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.536106 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.536114 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:23Z","lastTransitionTime":"2026-03-09T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.637851 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.637907 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.637918 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.637933 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.637943 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:23Z","lastTransitionTime":"2026-03-09T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.740400 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.740474 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.740494 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.740520 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.740539 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:23Z","lastTransitionTime":"2026-03-09T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.850243 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.850323 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.850335 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.850351 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.850361 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:23Z","lastTransitionTime":"2026-03-09T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.953486 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.953541 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.953556 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.953575 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:23 crc kubenswrapper[4723]: I0309 13:00:23.953590 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:23Z","lastTransitionTime":"2026-03-09T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.055683 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.055732 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.055745 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.055764 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.055776 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:24Z","lastTransitionTime":"2026-03-09T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.158472 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.158537 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.158547 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.158562 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.158573 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:24Z","lastTransitionTime":"2026-03-09T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.260778 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.260813 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.260825 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.260839 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.260850 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:24Z","lastTransitionTime":"2026-03-09T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.350047 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.350089 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.350100 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.350117 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.350130 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:24Z","lastTransitionTime":"2026-03-09T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.370043 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.374224 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.374253 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
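
The patch failure above is the first error in this window that is not the missing CNI config: the API server's call to the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 fails TLS verification because the webhook's serving certificate expired on 2025-08-24T17:21:41Z, long before the node's clock of 2026-03-09, so every node-status update is rejected. A minimal sketch for inspecting the presented certificate (verification is deliberately skipped so the expired chain can still be read):

    // cert_probe.go - illustrative sketch for inspecting the webhook serving
    // certificate named in the error above. InsecureSkipVerify is intentional:
    // the point is to read the expired certificate, not to trust it.
    package main

    import (
        "crypto/tls"
        "fmt"
    )

    func main() {
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Println("dial failed:", err)
            return
        }
        defer conn.Close()
        for _, cert := range conn.ConnectionState().PeerCertificates {
            fmt.Printf("subject=%s notBefore=%s notAfter=%s\n",
                cert.Subject, cert.NotBefore, cert.NotAfter)
        }
    }

If the printed notAfter matches the 2025-08-24 date in the error, rotating that serving certificate (or letting the cluster's certificate recovery run) is what unblocks the node-status patches; the retried patch below fails the same way until then.
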
event="NodeHasNoDiskPressure" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.374263 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.374280 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.374290 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:24Z","lastTransitionTime":"2026-03-09T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.389223 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.392589 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.392618 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.392629 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.392645 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.392657 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:24Z","lastTransitionTime":"2026-03-09T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.407773 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.411250 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.411276 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.411285 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.411298 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.411307 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:24Z","lastTransitionTime":"2026-03-09T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.422744 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.426128 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.426155 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.426166 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.426184 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.426196 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:24Z","lastTransitionTime":"2026-03-09T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.437579 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.437744 4723 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.439299 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
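The five identical PATCH failures above, capped by "Unable to update node status" err="update node status exceeds retry count", share a single root cause: the serving certificate presented by the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, while the node clock reads 2026-03-09T13:00:24Z, so every status patch is rejected before it reaches the Node object. The status payload (shown as {…}) is byte-identical on every attempt, and the kubelet starts a fresh round of attempts on the next status sync. A minimal triage sketch in Python, assuming the journal has been captured to a local file; the path kubelet.log (e.g. from journalctl -u kubelet) and the quote-escaping tolerated by the regex are assumptions, not taken from the log:

#!/usr/bin/env python3
# Triage sketch (editorial addition, not part of the journal): summarize the
# recurring node-status failures above. "kubelet.log" is an assumed capture.
import re
import sys
from collections import Counter

# Tolerates the backslash-escaped quotes that appear inside err="..." strings.
CERT_ERR = re.compile(
    r'failed calling webhook \\*"(?P<hook>[^"\\]+)\\*".*?'
    r'current time (?P<now>[0-9TZ:-]+) is after (?P<expiry>[0-9TZ:-]+)'
)
RETRY = 'Error updating node status, will retry'
GIVE_UP = 'update node status exceeds retry count'

def summarize(text: str) -> None:
    hooks = Counter()
    now = expiry = None
    for m in CERT_ERR.finditer(text):
        hooks[m.group('hook')] += 1
        now, expiry = m.group('now'), m.group('expiry')
    print(f'retried attempts : {text.count(RETRY)}')
    print(f'gave up          : {text.count(GIVE_UP)}')
    for hook, n in hooks.items():
        print(f'{hook}: {n} failures; cert expired {expiry}, node clock {now}')

if __name__ == '__main__':
    path = sys.argv[1] if len(sys.argv) > 1 else 'kubelet.log'
    with open(path) as f:
        summarize(f.read())

In this excerpt the counts come out as five retried attempts per give-up message, which matches the burst of errors above.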
event="NodeHasSufficientMemory" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.439328 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.439340 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.439358 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.439370 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:24Z","lastTransitionTime":"2026-03-09T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.509734 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.509806 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.509973 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:00:32.509953215 +0000 UTC m=+106.524420755 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.510039 4723 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.510071 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:32.510064878 +0000 UTC m=+106.524532418 (durationBeforeRetry 8s). 
Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.510071 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:32.510064878 +0000 UTC m=+106.524532418 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.541422 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.541459 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.541469 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.541484 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.541496 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:24Z","lastTransitionTime":"2026-03-09T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.610365 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.610414 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.610437 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.610558 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.610577 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.610589 4723 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
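The MountVolume.SetUp failures above and below all reduce to the same shape of error: object "namespace"/"name" not registered. The message comes from the kubelet's internal ConfigMap/Secret manager rather than from the API server; a plausible reading (an interpretation, not something the log states) is that this manager only serves objects for pods it has re-registered since the restart, and pod sync has not yet reached these pods. A sketch, under the same assumed capture, to tally which objects are blocking mounts:

#!/usr/bin/env python3
# Sketch (editorial addition): tally the ConfigMaps and Secrets the kubelet
# reports as "not registered", i.e. the objects blocking the mounts above
# and below. Same assumed "kubelet.log" capture.
import re
from collections import Counter

NOT_REGISTERED = re.compile(r'object "(?P<ns>[^"]+)"/"(?P<name>[^"]+)" not registered')

with open('kubelet.log') as f:
    counts = Counter(m.group('ns', 'name') for m in NOT_REGISTERED.finditer(f.read()))
for (ns, name), n in counts.most_common():
    print(f'{ns}/{name}: {n} occurrence(s)')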
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.610636 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:32.610620602 +0000 UTC m=+106.625088142 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.610949 4723 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.610987 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:32.610975221 +0000 UTC m=+106.625442771 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.611109 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.611148 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.611160 4723 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.611218 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:32.611203597 +0000 UTC m=+106.625671137 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.643316 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.643346 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.643354 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.643368 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.643378 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:24Z","lastTransitionTime":"2026-03-09T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.746643 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.746682 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.746694 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.746714 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.746728 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:24Z","lastTransitionTime":"2026-03-09T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
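The "No retries permitted until ... (durationBeforeRetry 8s)" lines reflect the per-operation exponential backoff that nestedpendingoperations applies to failing volume mounts and unmounts: each consecutive failure doubles the wait before the next attempt, up to a cap. A minimal sketch of that policy, assuming the upstream defaults of a 500ms initial delay and a cap of roughly two minutes:

```go
package main

import (
	"fmt"
	"time"
)

// expBackoff mirrors the backoff kubelet applies to failing volume
// operations: each consecutive failure doubles the wait, up to a cap.
// The initial/max values are upstream defaults as I understand them
// (0.5s initial, ~2m cap); treat them as assumptions, not a spec.
type expBackoff struct {
	last time.Duration
	max  time.Duration
}

func (b *expBackoff) next() time.Duration {
	if b.last == 0 {
		b.last = 500 * time.Millisecond
	} else if b.last < b.max {
		b.last *= 2
		if b.last > b.max {
			b.last = b.max
		}
	}
	return b.last
}

func main() {
	b := &expBackoff{max: 2*time.Minute + 2*time.Second}
	// Five consecutive failures: 0.5s, 1s, 2s, 4s, 8s. The fifth
	// matches the "durationBeforeRetry 8s" seen in the log above.
	for i := 1; i <= 5; i++ {
		fmt.Printf("failure %d: retry in %s\n", i, b.next())
	}
}
```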
Has your network provider started?"} Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.849081 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.849128 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.849145 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.849167 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.849182 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:24Z","lastTransitionTime":"2026-03-09T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.880329 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.880493 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.880816 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.880843 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.880984 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:24 crc kubenswrapper[4723]: E0309 13:00:24.881093 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.951633 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.951713 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.951735 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.951768 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:24 crc kubenswrapper[4723]: I0309 13:00:24.951788 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:24Z","lastTransitionTime":"2026-03-09T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.054575 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.054620 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.054637 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.054659 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.054689 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:25Z","lastTransitionTime":"2026-03-09T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.158092 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.158162 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.158179 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.158201 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.158215 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:25Z","lastTransitionTime":"2026-03-09T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.261365 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.261438 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.261460 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.261491 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.261544 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:25Z","lastTransitionTime":"2026-03-09T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.365166 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.365234 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.365250 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.365272 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.365291 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:25Z","lastTransitionTime":"2026-03-09T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.468797 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.468878 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.468889 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.468915 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.468933 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:25Z","lastTransitionTime":"2026-03-09T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.571146 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.571203 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.571214 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.571232 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.571244 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:25Z","lastTransitionTime":"2026-03-09T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.673893 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.673974 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.673988 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.674008 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.674023 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:25Z","lastTransitionTime":"2026-03-09T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
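Every heartbeat in this stretch logs the same Ready=False condition object. When triaging, it is easier to decode the payload than to read it inline; a short sketch that parses one of the conditions logged above with an ad hoc struct (not the upstream NodeCondition type, though the JSON keys match):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// nodeCondition mirrors the fields of the condition payload that
// setters.go logs above.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition JSON copied from one of the "Node became not ready"
	// entries in this log.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:25Z","lastTransitionTime":"2026-03-09T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s since %s: %s\n", c.Type, c.Status, c.LastTransitionTime, c.Reason)
}
```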
Has your network provider started?"} Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.777022 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.777075 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.777090 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.777113 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.777128 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:25Z","lastTransitionTime":"2026-03-09T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.880376 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.880443 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.880484 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.880570 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.880599 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:25Z","lastTransitionTime":"2026-03-09T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.983173 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.983255 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.983279 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.983304 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:25 crc kubenswrapper[4723]: I0309 13:00:25.983322 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:25Z","lastTransitionTime":"2026-03-09T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.086384 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.086448 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.086466 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.086498 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.086514 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:26Z","lastTransitionTime":"2026-03-09T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.189364 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.189443 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.189467 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.189498 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.189520 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:26Z","lastTransitionTime":"2026-03-09T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.292295 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.292360 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.292377 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.292403 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.292420 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:26Z","lastTransitionTime":"2026-03-09T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.395415 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.395482 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.395504 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.395531 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.395548 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:26Z","lastTransitionTime":"2026-03-09T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.498728 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.498787 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.498802 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.498823 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.498834 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:26Z","lastTransitionTime":"2026-03-09T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.601302 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.601386 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.601408 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.601430 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.601441 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:26Z","lastTransitionTime":"2026-03-09T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.704795 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.704888 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.704909 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.704935 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.704954 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:26Z","lastTransitionTime":"2026-03-09T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.808436 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.808486 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.808503 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.808526 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.808542 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:26Z","lastTransitionTime":"2026-03-09T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.880662 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.880677 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.882293 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:26 crc kubenswrapper[4723]: E0309 13:00:26.882463 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
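These NotReady heartbeats, and the "No sandbox for pod can be found" sync errors around them, all hinge on the runtime reporting NetworkReady=false until a network config appears in /etc/kubernetes/cni/net.d/. A rough reimplementation of that readiness check, assuming the libcni convention of accepting .conf, .conflist, and .json files (an illustration, not CRI-O's actual code):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniConfigPresent approximates the check behind the runtime's
// NetworkReady status: scan the conf dir (the /etc/kubernetes/cni/net.d/
// path from the log) for a network config file.
func cniConfigPresent(dir string) bool {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false // missing dir counts as "no CNI configuration file"
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true
		}
	}
	return false
}

func main() {
	if !cniConfigPresent("/etc/kubernetes/cni/net.d/") {
		// This is the state the kubelet keeps reporting above:
		// NetworkReady=false, reason NetworkPluginNotReady.
		fmt.Println("network plugin not ready: no CNI configuration file")
	}
}
```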
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:26 crc kubenswrapper[4723]: E0309 13:00:26.882724 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:26 crc kubenswrapper[4723]: E0309 13:00:26.882948 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.901037 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.911037 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.911090 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.911106 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.911131 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.911147 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:26Z","lastTransitionTime":"2026-03-09T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.925116 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.941258 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.956061 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
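The kube-apiserver-check-endpoints status a few entries above shows restartCount 3 together with "back-off 40s restarting failed container", which is consistent with the kubelet's crash-loop backoff doubling from a 10s base toward a 5m cap (upstream defaults as I understand them, stated here as an assumption). A quick check of that arithmetic:

```go
package main

import (
	"fmt"
	"time"
)

// crashLoopDelay computes the restart delay kubelet applies to a
// crash-looping container: doubling from a 10s base up to a 5m cap
// (assumed upstream defaults).
func crashLoopDelay(restartCount int) time.Duration {
	delay := 10 * time.Second
	for i := 1; i < restartCount; i++ {
		delay *= 2
		if delay >= 5*time.Minute {
			return 5 * time.Minute
		}
	}
	return delay
}

func main() {
	// restartCount 3 in the status above lines up with the
	// "back-off 40s restarting failed container" message.
	fmt.Println(crashLoopDelay(3)) // 40s
}
```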
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.973806 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:26 crc kubenswrapper[4723]: I0309 13:00:26.992117 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.009524 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.014145 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.014348 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.014640 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.014780 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.014972 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:27Z","lastTransitionTime":"2026-03-09T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.027273 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.118306 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.118372 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.118384 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.118419 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.118429 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:27Z","lastTransitionTime":"2026-03-09T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.221502 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.221543 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.221552 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.221566 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.221577 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:27Z","lastTransitionTime":"2026-03-09T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.325360 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.325416 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.325488 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.325515 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.325589 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:27Z","lastTransitionTime":"2026-03-09T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.428427 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.428506 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.428524 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.428552 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.428576 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:27Z","lastTransitionTime":"2026-03-09T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.531323 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.531360 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.531375 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.531393 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.531405 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:27Z","lastTransitionTime":"2026-03-09T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.634270 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.634317 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.634333 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.634355 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.634372 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:27Z","lastTransitionTime":"2026-03-09T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.738244 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.738330 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.738358 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.738390 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.738411 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:27Z","lastTransitionTime":"2026-03-09T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.841549 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.841590 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.841606 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.841625 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.841636 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:27Z","lastTransitionTime":"2026-03-09T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.945111 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.945184 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.945202 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.945228 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:27 crc kubenswrapper[4723]: I0309 13:00:27.945249 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:27Z","lastTransitionTime":"2026-03-09T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.048968 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.049043 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.049061 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.049089 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.049108 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:28Z","lastTransitionTime":"2026-03-09T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.152719 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.152825 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.152845 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.152934 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.152977 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:28Z","lastTransitionTime":"2026-03-09T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.255942 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.256253 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.256432 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.256572 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.256699 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:28Z","lastTransitionTime":"2026-03-09T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.360515 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.361185 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.361305 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.361471 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.361583 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:28Z","lastTransitionTime":"2026-03-09T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.464921 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.464963 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.464973 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.464991 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.465002 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:28Z","lastTransitionTime":"2026-03-09T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.566919 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.566989 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.567001 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.567021 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.567032 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:28Z","lastTransitionTime":"2026-03-09T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.669804 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.669925 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.669957 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.669986 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.670008 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:28Z","lastTransitionTime":"2026-03-09T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.771807 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.771915 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.771928 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.771945 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.771980 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:28Z","lastTransitionTime":"2026-03-09T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.875614 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.875661 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.875675 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.875694 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.875711 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:28Z","lastTransitionTime":"2026-03-09T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.880154 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.880210 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:28 crc kubenswrapper[4723]: E0309 13:00:28.880362 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.880458 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:28 crc kubenswrapper[4723]: E0309 13:00:28.880666 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:28 crc kubenswrapper[4723]: E0309 13:00:28.880897 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.978809 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.978928 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.978955 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.978986 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:28 crc kubenswrapper[4723]: I0309 13:00:28.979013 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:28Z","lastTransitionTime":"2026-03-09T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.082372 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.082422 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.082456 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.082496 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.082519 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:29Z","lastTransitionTime":"2026-03-09T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.185055 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.185096 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.185107 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.185125 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.185138 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:29Z","lastTransitionTime":"2026-03-09T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.288491 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.288557 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.288579 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.288608 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.288630 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:29Z","lastTransitionTime":"2026-03-09T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.391941 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.392073 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.392100 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.392134 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.392157 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:29Z","lastTransitionTime":"2026-03-09T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.495350 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.495384 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.495394 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.495411 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.495422 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:29Z","lastTransitionTime":"2026-03-09T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.598967 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.599027 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.599047 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.599075 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.599098 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:29Z","lastTransitionTime":"2026-03-09T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.702152 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.702205 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.702223 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.702249 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.702267 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:29Z","lastTransitionTime":"2026-03-09T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.804584 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.804618 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.804628 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.804642 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.804653 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:29Z","lastTransitionTime":"2026-03-09T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.906524 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.906567 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.906589 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.906606 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:29 crc kubenswrapper[4723]: I0309 13:00:29.906617 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:29Z","lastTransitionTime":"2026-03-09T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.009003 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.009051 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.009065 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.009087 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.009100 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:30Z","lastTransitionTime":"2026-03-09T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.112089 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.112370 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.112572 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.112847 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.113202 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:30Z","lastTransitionTime":"2026-03-09T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.216690 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.216764 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.216778 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.216797 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.216810 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:30Z","lastTransitionTime":"2026-03-09T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.319674 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.319738 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.319754 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.319779 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.319792 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:30Z","lastTransitionTime":"2026-03-09T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.423331 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.423402 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.423426 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.423458 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.423481 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:30Z","lastTransitionTime":"2026-03-09T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.527749 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.527910 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.527936 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.527969 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.527991 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:30Z","lastTransitionTime":"2026-03-09T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.631517 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.631562 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.631582 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.631609 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.631622 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:30Z","lastTransitionTime":"2026-03-09T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.735028 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.735085 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.735102 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.735124 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.735140 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:30Z","lastTransitionTime":"2026-03-09T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.837848 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.837949 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.837973 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.838003 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.838026 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:30Z","lastTransitionTime":"2026-03-09T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.881016 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.881004 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:30 crc kubenswrapper[4723]: E0309 13:00:30.881705 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.881056 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:30 crc kubenswrapper[4723]: E0309 13:00:30.881922 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:30 crc kubenswrapper[4723]: E0309 13:00:30.882341 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.940835 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.940944 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.940967 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.940996 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:30 crc kubenswrapper[4723]: I0309 13:00:30.941019 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:30Z","lastTransitionTime":"2026-03-09T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.044627 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.044988 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.045236 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.045474 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.045665 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:31Z","lastTransitionTime":"2026-03-09T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.149453 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.149517 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.149534 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.149559 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.149576 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:31Z","lastTransitionTime":"2026-03-09T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.251901 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.251950 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.251960 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.251972 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.251980 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:31Z","lastTransitionTime":"2026-03-09T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.354617 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.354671 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.354687 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.354712 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.354730 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:31Z","lastTransitionTime":"2026-03-09T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.457468 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.457536 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.457562 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.457594 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.457615 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:31Z","lastTransitionTime":"2026-03-09T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.560351 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.560404 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.560417 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.560442 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.560457 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:31Z","lastTransitionTime":"2026-03-09T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.662911 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.663138 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.663239 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.663315 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.663386 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:31Z","lastTransitionTime":"2026-03-09T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.766883 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.767190 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.767253 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.767327 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.767391 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:31Z","lastTransitionTime":"2026-03-09T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.869964 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.870007 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.870019 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.870036 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.870048 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:31Z","lastTransitionTime":"2026-03-09T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.972228 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.972265 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.972275 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.972288 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:31 crc kubenswrapper[4723]: I0309 13:00:31.972296 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:31Z","lastTransitionTime":"2026-03-09T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.074089 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.074388 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.074459 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.074531 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.074596 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:32Z","lastTransitionTime":"2026-03-09T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.177509 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.177839 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.178045 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.178320 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.178708 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:32Z","lastTransitionTime":"2026-03-09T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.282287 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.282618 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.282845 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.283151 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.283434 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:32Z","lastTransitionTime":"2026-03-09T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.389938 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.390036 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.390061 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.390089 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.390107 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:32Z","lastTransitionTime":"2026-03-09T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.492178 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.492233 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.492248 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.492274 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.492292 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:32Z","lastTransitionTime":"2026-03-09T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.561363 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.561507 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.561571 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 13:00:48.561535799 +0000 UTC m=+122.576003339 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.561619 4723 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.561718 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:48.561684283 +0000 UTC m=+122.576151863 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.595475 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.595540 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.595553 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.595571 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.595583 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:32Z","lastTransitionTime":"2026-03-09T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.662960 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.663023 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.663059 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.663184 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.663214 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.663224 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.663239 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.663249 4723 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.663256 4723 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.663322 4723 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.663332 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:48.663310424 +0000 UTC m=+122.677777994 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.663450 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:48.663429407 +0000 UTC m=+122.677896977 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.663478 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:48.663465758 +0000 UTC m=+122.677933328 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.697980 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.698037 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.698056 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.698079 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.698096 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:32Z","lastTransitionTime":"2026-03-09T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.800589 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.800774 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.800798 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.800823 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.800841 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:32Z","lastTransitionTime":"2026-03-09T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.879832 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.879913 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.879832 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.880026 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.880201 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:32 crc kubenswrapper[4723]: E0309 13:00:32.880333 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.904201 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.904259 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.904277 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.904299 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:32 crc kubenswrapper[4723]: I0309 13:00:32.904317 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:32Z","lastTransitionTime":"2026-03-09T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.007408 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.007477 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.007503 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.007530 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.007548 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:33Z","lastTransitionTime":"2026-03-09T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.111058 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.111122 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.111139 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.111162 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.111178 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:33Z","lastTransitionTime":"2026-03-09T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.196844 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-l2l9x"] Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.197302 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l2l9x" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.199587 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.200091 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.200952 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.213591 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.213645 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.213657 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.213678 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.213695 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:33Z","lastTransitionTime":"2026-03-09T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.224198 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.242765 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.263669 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.268207 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjlzn\" (UniqueName: \"kubernetes.io/projected/91019298-2c2b-48a9-8813-cd58d2681f71-kube-api-access-fjlzn\") pod \"node-resolver-l2l9x\" (UID: \"91019298-2c2b-48a9-8813-cd58d2681f71\") " pod="openshift-dns/node-resolver-l2l9x" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.268378 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/91019298-2c2b-48a9-8813-cd58d2681f71-hosts-file\") pod \"node-resolver-l2l9x\" (UID: \"91019298-2c2b-48a9-8813-cd58d2681f71\") " pod="openshift-dns/node-resolver-l2l9x" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.283465 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.302608 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.316402 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.316481 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.316509 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.316540 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.316560 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:33Z","lastTransitionTime":"2026-03-09T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.320500 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.341046 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.354702 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.369362 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/91019298-2c2b-48a9-8813-cd58d2681f71-hosts-file\") pod \"node-resolver-l2l9x\" (UID: \"91019298-2c2b-48a9-8813-cd58d2681f71\") " pod="openshift-dns/node-resolver-l2l9x" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.369432 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjlzn\" (UniqueName: \"kubernetes.io/projected/91019298-2c2b-48a9-8813-cd58d2681f71-kube-api-access-fjlzn\") pod \"node-resolver-l2l9x\" (UID: \"91019298-2c2b-48a9-8813-cd58d2681f71\") " pod="openshift-dns/node-resolver-l2l9x" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.369576 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/91019298-2c2b-48a9-8813-cd58d2681f71-hosts-file\") pod \"node-resolver-l2l9x\" (UID: \"91019298-2c2b-48a9-8813-cd58d2681f71\") " 
pod="openshift-dns/node-resolver-l2l9x" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.371500 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.393517 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjlzn\" (UniqueName: \"kubernetes.io/projected/91019298-2c2b-48a9-8813-cd58d2681f71-kube-api-access-fjlzn\") pod \"node-resolver-l2l9x\" (UID: \"91019298-2c2b-48a9-8813-cd58d2681f71\") " pod="openshift-dns/node-resolver-l2l9x" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.420225 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.420363 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.420397 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.420427 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.420450 4723 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:33Z","lastTransitionTime":"2026-03-09T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.517931 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l2l9x" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.529664 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.529733 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.529755 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.529790 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.529812 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:33Z","lastTransitionTime":"2026-03-09T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.595684 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-cfjq2"] Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.596153 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-g92rf"] Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.596471 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.596968 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.601940 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-jb44m"] Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.602508 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.602625 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.602766 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.602829 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.602835 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.602931 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.602968 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.602995 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.603119 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.603183 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.604691 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.609440 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.609739 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.619498 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.633324 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.633421 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.633435 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.633456 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.633474 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:33Z","lastTransitionTime":"2026-03-09T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.638176 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.655148 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.671532 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.671627 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-system-cni-dir\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.671658 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-cnibin\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.672787 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-multus-socket-dir-parent\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.672935 4723 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/242d3bf9-4462-4562-813a-f3548edc94fd-multus-daemon-config\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673000 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-multus-conf-dir\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673024 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-run-multus-certs\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673047 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-system-cni-dir\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673083 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-var-lib-kubelet\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673222 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-etc-kubernetes\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673312 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-os-release\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673337 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7r96\" (UniqueName: \"kubernetes.io/projected/242d3bf9-4462-4562-813a-f3548edc94fd-kube-api-access-c7r96\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673361 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/983d5ed4-cfc7-402a-b226-29dc071c6e4e-mcd-auth-proxy-config\") pod \"machine-config-daemon-cfjq2\" (UID: \"983d5ed4-cfc7-402a-b226-29dc071c6e4e\") " pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 
13:00:33.673382 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-run-netns\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673401 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-hostroot\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673423 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sll9\" (UniqueName: \"kubernetes.io/projected/983d5ed4-cfc7-402a-b226-29dc071c6e4e-kube-api-access-7sll9\") pod \"machine-config-daemon-cfjq2\" (UID: \"983d5ed4-cfc7-402a-b226-29dc071c6e4e\") " pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673444 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-run-k8s-cni-cncf-io\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673465 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673555 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-cnibin\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673683 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/983d5ed4-cfc7-402a-b226-29dc071c6e4e-rootfs\") pod \"machine-config-daemon-cfjq2\" (UID: \"983d5ed4-cfc7-402a-b226-29dc071c6e4e\") " pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673717 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/983d5ed4-cfc7-402a-b226-29dc071c6e4e-proxy-tls\") pod \"machine-config-daemon-cfjq2\" (UID: \"983d5ed4-cfc7-402a-b226-29dc071c6e4e\") " pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673751 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-multus-cni-dir\") pod \"multus-g92rf\" (UID: 
\"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673783 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/242d3bf9-4462-4562-813a-f3548edc94fd-cni-binary-copy\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673814 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-var-lib-cni-bin\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673882 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chfrr\" (UniqueName: \"kubernetes.io/projected/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-kube-api-access-chfrr\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673944 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.673991 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-var-lib-cni-multus\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.674019 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-os-release\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.674036 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.688322 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.699225 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.710845 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.721052 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.731872 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.735708 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.735738 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.735747 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.735760 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.735769 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:33Z","lastTransitionTime":"2026-03-09T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.742908 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.753773 4723 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.765098 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774379 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774413 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-run-k8s-cni-cncf-io\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 
crc kubenswrapper[4723]: I0309 13:00:33.774451 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-cnibin\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774475 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/983d5ed4-cfc7-402a-b226-29dc071c6e4e-rootfs\") pod \"machine-config-daemon-cfjq2\" (UID: \"983d5ed4-cfc7-402a-b226-29dc071c6e4e\") " pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774491 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/983d5ed4-cfc7-402a-b226-29dc071c6e4e-proxy-tls\") pod \"machine-config-daemon-cfjq2\" (UID: \"983d5ed4-cfc7-402a-b226-29dc071c6e4e\") " pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774525 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-multus-cni-dir\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774540 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/242d3bf9-4462-4562-813a-f3548edc94fd-cni-binary-copy\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774553 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-var-lib-cni-bin\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774567 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chfrr\" (UniqueName: \"kubernetes.io/projected/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-kube-api-access-chfrr\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774606 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774624 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-var-lib-cni-multus\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 
13:00:33.774638 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-os-release\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774652 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774700 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-system-cni-dir\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774717 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-cnibin\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774730 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-multus-socket-dir-parent\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774764 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/242d3bf9-4462-4562-813a-f3548edc94fd-multus-daemon-config\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774779 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-multus-conf-dir\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774792 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-run-multus-certs\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774806 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-system-cni-dir\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774846 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-var-lib-kubelet\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774892 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-etc-kubernetes\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774907 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-os-release\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774924 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7r96\" (UniqueName: \"kubernetes.io/projected/242d3bf9-4462-4562-813a-f3548edc94fd-kube-api-access-c7r96\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774960 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-run-netns\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774977 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-hostroot\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.774993 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/983d5ed4-cfc7-402a-b226-29dc071c6e4e-mcd-auth-proxy-config\") pod \"machine-config-daemon-cfjq2\" (UID: \"983d5ed4-cfc7-402a-b226-29dc071c6e4e\") " pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775007 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sll9\" (UniqueName: \"kubernetes.io/projected/983d5ed4-cfc7-402a-b226-29dc071c6e4e-kube-api-access-7sll9\") pod \"machine-config-daemon-cfjq2\" (UID: \"983d5ed4-cfc7-402a-b226-29dc071c6e4e\") " pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775048 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-run-multus-certs\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775117 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-os-release\") pod \"multus-additional-cni-plugins-jb44m\" 
(UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775127 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-cnibin\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775174 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-multus-socket-dir-parent\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775259 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-multus-cni-dir\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775380 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-system-cni-dir\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775414 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-var-lib-kubelet\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775456 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-etc-kubernetes\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775495 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-os-release\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775694 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-run-netns\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775715 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/242d3bf9-4462-4562-813a-f3548edc94fd-multus-daemon-config\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775736 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-hostroot\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775751 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-multus-conf-dir\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775979 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/242d3bf9-4462-4562-813a-f3548edc94fd-cni-binary-copy\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.776008 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-system-cni-dir\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.775946 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-var-lib-cni-bin\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.776039 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-var-lib-cni-multus\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.776064 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/983d5ed4-cfc7-402a-b226-29dc071c6e4e-rootfs\") pod \"machine-config-daemon-cfjq2\" (UID: \"983d5ed4-cfc7-402a-b226-29dc071c6e4e\") " pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.776066 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-cnibin\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.776106 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/242d3bf9-4462-4562-813a-f3548edc94fd-host-run-k8s-cni-cncf-io\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.777010 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " 
pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.778444 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 
13:00:33.778787 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/983d5ed4-cfc7-402a-b226-29dc071c6e4e-proxy-tls\") pod \"machine-config-daemon-cfjq2\" (UID: \"983d5ed4-cfc7-402a-b226-29dc071c6e4e\") " pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.779128 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.779886 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.780259 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/983d5ed4-cfc7-402a-b226-29dc071c6e4e-mcd-auth-proxy-config\") pod \"machine-config-daemon-cfjq2\" (UID: \"983d5ed4-cfc7-402a-b226-29dc071c6e4e\") " pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.793413 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7r96\" (UniqueName: \"kubernetes.io/projected/242d3bf9-4462-4562-813a-f3548edc94fd-kube-api-access-c7r96\") pod \"multus-g92rf\" (UID: \"242d3bf9-4462-4562-813a-f3548edc94fd\") " pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.794316 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chfrr\" (UniqueName: \"kubernetes.io/projected/2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1-kube-api-access-chfrr\") pod \"multus-additional-cni-plugins-jb44m\" (UID: \"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\") " pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.795184 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.795411 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sll9\" (UniqueName: \"kubernetes.io/projected/983d5ed4-cfc7-402a-b226-29dc071c6e4e-kube-api-access-7sll9\") pod \"machine-config-daemon-cfjq2\" (UID: \"983d5ed4-cfc7-402a-b226-29dc071c6e4e\") " pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.808128 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.819712 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.833309 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.837984 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.838011 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.838045 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.838060 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.838071 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:33Z","lastTransitionTime":"2026-03-09T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.850040 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.866831 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.885297 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.903979 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.916943 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.923626 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g92rf" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.927962 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:33 crc kubenswrapper[4723]: W0309 13:00:33.936745 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod242d3bf9_4462_4562_813a_f3548edc94fd.slice/crio-81569cdd757591cda3595a235594b552fc1ec73093c1d09dbaff9a0b2cf6c61d WatchSource:0}: Error finding container 81569cdd757591cda3595a235594b552fc1ec73093c1d09dbaff9a0b2cf6c61d: Status 404 returned error can't find the container with id 81569cdd757591cda3595a235594b552fc1ec73093c1d09dbaff9a0b2cf6c61d Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.939969 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.939994 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.940002 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.940015 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.940026 4723 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:33Z","lastTransitionTime":"2026-03-09T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.945834 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.956518 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jb44m" Mar 09 13:00:33 crc kubenswrapper[4723]: W0309 13:00:33.960109 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod983d5ed4_cfc7_402a_b226_29dc071c6e4e.slice/crio-d662f1b35fab50f937d662933b422df503e2dffee4d8964f7856e1a48de31aef WatchSource:0}: Error finding container d662f1b35fab50f937d662933b422df503e2dffee4d8964f7856e1a48de31aef: Status 404 returned error can't find the container with id d662f1b35fab50f937d662933b422df503e2dffee4d8964f7856e1a48de31aef Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.974592 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zngwx"] Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.975492 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.978067 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.978203 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.978227 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.978263 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.978310 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.978320 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 13:00:33.978647 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 09 13:00:33 crc kubenswrapper[4723]: W0309 13:00:33.982587 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e3fdcc4_63d9_4867_b5d8_5f0a5be569a1.slice/crio-27dff8f98762814073fe5285b2b577cad1ca0ef54de255e1bfb32eaa8110f930 WatchSource:0}: Error finding container 27dff8f98762814073fe5285b2b577cad1ca0ef54de255e1bfb32eaa8110f930: Status 404 returned error can't find the container with id 27dff8f98762814073fe5285b2b577cad1ca0ef54de255e1bfb32eaa8110f930 Mar 09 13:00:33 crc kubenswrapper[4723]: I0309 
13:00:33.989832 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.002497 4723 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.019586 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.034360 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.044107 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.044187 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.044205 4723 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.044262 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.044281 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:34Z","lastTransitionTime":"2026-03-09T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.046836 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.063420 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.077186 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.077400 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjtfd\" (UniqueName: \"kubernetes.io/projected/edb23619-78b6-4d63-aacf-98d7ce86bc5b-kube-api-access-qjtfd\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.077437 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-systemd\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.077461 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-kubelet\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.077484 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.077511 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-var-lib-openvswitch\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.077752 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovn-node-metrics-cert\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.077814 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovnkube-config\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 
13:00:34.077836 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-run-netns\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.077869 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-ovn\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.077896 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-etc-openvswitch\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.077917 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-env-overrides\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.077936 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovnkube-script-lib\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.077956 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-cni-bin\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.077979 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-cni-netd\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.078014 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-slash\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.078047 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-node-log\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 
13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.078071 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-openvswitch\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.078096 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-run-ovn-kubernetes\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.078190 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-systemd-units\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.078212 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-log-socket\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.091406 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.102946 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.115065 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.127415 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.144453 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec
8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.148027 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.148056 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.148066 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.148081 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.148092 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:34Z","lastTransitionTime":"2026-03-09T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.162720 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179085 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-run-netns\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179127 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-ovn\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179153 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-etc-openvswitch\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179178 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-env-overrides\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179201 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-ovn\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179233 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-etc-openvswitch\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179260 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-run-netns\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179740 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-env-overrides\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179785 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovnkube-script-lib\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179808 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-cni-bin\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179831 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-cni-netd\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179874 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovnkube-script-lib\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179896 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-slash\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179912 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-cni-bin\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179925 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-node-log\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179933 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-cni-netd\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179948 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-openvswitch\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179967 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-node-log\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179969 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-run-ovn-kubernetes\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179985 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-openvswitch\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179952 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-slash\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.179999 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-systemd-units\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180013 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-run-ovn-kubernetes\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180022 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-log-socket\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180036 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-systemd-units\") pod \"ovnkube-node-zngwx\" (UID: 
\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180044 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjtfd\" (UniqueName: \"kubernetes.io/projected/edb23619-78b6-4d63-aacf-98d7ce86bc5b-kube-api-access-qjtfd\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180054 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-log-socket\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180065 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-systemd\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180086 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-kubelet\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180108 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180131 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-var-lib-openvswitch\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180236 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-kubelet\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180258 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-systemd\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180289 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-var-lib-openvswitch\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180310 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180335 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovn-node-metrics-cert\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.180876 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovnkube-config\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.181473 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovnkube-config\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.184732 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovn-node-metrics-cert\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.195023 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjtfd\" (UniqueName: \"kubernetes.io/projected/edb23619-78b6-4d63-aacf-98d7ce86bc5b-kube-api-access-qjtfd\") pod \"ovnkube-node-zngwx\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.250161 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.250191 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.250199 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.250212 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.250222 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:34Z","lastTransitionTime":"2026-03-09T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.292655 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:34 crc kubenswrapper[4723]: W0309 13:00:34.307145 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedb23619_78b6_4d63_aacf_98d7ce86bc5b.slice/crio-2c7853925c23f3246ae068df12e4e648173c9cb519e0bdf339c2d4e4fc6c7aa8 WatchSource:0}: Error finding container 2c7853925c23f3246ae068df12e4e648173c9cb519e0bdf339c2d4e4fc6c7aa8: Status 404 returned error can't find the container with id 2c7853925c23f3246ae068df12e4e648173c9cb519e0bdf339c2d4e4fc6c7aa8 Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.320352 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.320422 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.320435 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"d662f1b35fab50f937d662933b422df503e2dffee4d8964f7856e1a48de31aef"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.322943 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l2l9x" event={"ID":"91019298-2c2b-48a9-8813-cd58d2681f71","Type":"ContainerStarted","Data":"2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.322992 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l2l9x" event={"ID":"91019298-2c2b-48a9-8813-cd58d2681f71","Type":"ContainerStarted","Data":"5b16e006c2d898baa699cb53e540a1535009f8f8437d52abd57d08768c07f87d"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.324152 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerStarted","Data":"2c7853925c23f3246ae068df12e4e648173c9cb519e0bdf339c2d4e4fc6c7aa8"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.325496 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g92rf" event={"ID":"242d3bf9-4462-4562-813a-f3548edc94fd","Type":"ContainerStarted","Data":"ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.325515 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g92rf" event={"ID":"242d3bf9-4462-4562-813a-f3548edc94fd","Type":"ContainerStarted","Data":"81569cdd757591cda3595a235594b552fc1ec73093c1d09dbaff9a0b2cf6c61d"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.326943 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" 
event={"ID":"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1","Type":"ContainerStarted","Data":"f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.326966 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" event={"ID":"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1","Type":"ContainerStarted","Data":"27dff8f98762814073fe5285b2b577cad1ca0ef54de255e1bfb32eaa8110f930"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.341083 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.353278 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c385
6b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.356283 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.356319 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.356331 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.356347 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.356357 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:34Z","lastTransitionTime":"2026-03-09T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.374240 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.409916 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.430554 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.450697 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.458805 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.458887 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.458907 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.458973 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.459092 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:34Z","lastTransitionTime":"2026-03-09T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.465753 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.481143 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.493803 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.508258 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.520730 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.533811 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.535889 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.535928 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.535940 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.535987 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.536002 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:34Z","lastTransitionTime":"2026-03-09T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.547479 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: E0309 13:00:34.549546 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.554978 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.555016 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.555025 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.555039 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.555052 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:34Z","lastTransitionTime":"2026-03-09T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.577014 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubel
et\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: E0309 13:00:34.581302 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.599768 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.599847 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.599884 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.599907 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.599923 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:34Z","lastTransitionTime":"2026-03-09T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.609378 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09
T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: E0309 13:00:34.623781 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.627794 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.627846 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.627879 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.627893 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.627919 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:34Z","lastTransitionTime":"2026-03-09T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.631855 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: E0309 13:00:34.641332 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.644182 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.644221 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.644230 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.644244 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.644253 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:34Z","lastTransitionTime":"2026-03-09T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.648021 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: E0309 13:00:34.657396 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: E0309 13:00:34.657807 4723 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.659602 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.659644 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.659653 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.659666 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.659675 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:34Z","lastTransitionTime":"2026-03-09T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.660760 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.674471 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.693653 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.707029 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.718722 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.729140 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.740938 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.756341 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.761985 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.762096 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.762230 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.762301 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.762358 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:34Z","lastTransitionTime":"2026-03-09T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.772104 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.864306 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.864342 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.864352 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.864367 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.864376 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:34Z","lastTransitionTime":"2026-03-09T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.880027 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.880281 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:34 crc kubenswrapper[4723]: E0309 13:00:34.880345 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.880489 4723 scope.go:117] "RemoveContainer" containerID="093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468" Mar 09 13:00:34 crc kubenswrapper[4723]: E0309 13:00:34.880541 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.880493 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:34 crc kubenswrapper[4723]: E0309 13:00:34.880755 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:34 crc kubenswrapper[4723]: E0309 13:00:34.880640 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.967583 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.967847 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.967949 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.968025 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:34 crc kubenswrapper[4723]: I0309 13:00:34.968097 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:34Z","lastTransitionTime":"2026-03-09T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.071394 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.071428 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.071437 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.071451 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.071460 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:35Z","lastTransitionTime":"2026-03-09T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.174942 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.175542 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.175568 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.175601 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.175621 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:35Z","lastTransitionTime":"2026-03-09T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.278760 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.278897 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.278926 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.278960 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.278979 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:35Z","lastTransitionTime":"2026-03-09T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.331638 4723 generic.go:334] "Generic (PLEG): container finished" podID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerID="b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258" exitCode=0 Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.331689 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerDied","Data":"b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258"} Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.332927 4723 generic.go:334] "Generic (PLEG): container finished" podID="2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1" containerID="f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea" exitCode=0 Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.333019 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" event={"ID":"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1","Type":"ContainerDied","Data":"f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea"} Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.352761 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mount
Path\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.364013 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.381425 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.382423 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.384027 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.384104 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.384146 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.384175 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:35Z","lastTransitionTime":"2026-03-09T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.399562 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.412844 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.436550 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.457330 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242
ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.475415 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.488424 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.488898 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.489475 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.489504 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.489538 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.489564 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:35Z","lastTransitionTime":"2026-03-09T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.503454 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.517890 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.533770 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.547059 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.556627 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.569240 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.583384 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.592787 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.592999 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.593026 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.593087 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.593117 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:35Z","lastTransitionTime":"2026-03-09T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.597282 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.617259 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.649854 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.664117 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.674253 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.685153 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.695811 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.695854 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.695880 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.695898 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.695935 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:35Z","lastTransitionTime":"2026-03-09T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.698429 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.712846 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.726369 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.743074 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.798531 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.799003 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.799013 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.799027 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.799036 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:35Z","lastTransitionTime":"2026-03-09T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.900702 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.900738 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.900749 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.900767 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:35 crc kubenswrapper[4723]: I0309 13:00:35.900780 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:35Z","lastTransitionTime":"2026-03-09T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.003425 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.003462 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.003470 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.003485 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.003494 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:36Z","lastTransitionTime":"2026-03-09T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.106814 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.106919 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.106937 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.106963 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.106982 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:36Z","lastTransitionTime":"2026-03-09T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.211497 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.211917 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.212188 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.212326 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.212507 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:36Z","lastTransitionTime":"2026-03-09T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.314952 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.315055 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.315080 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.315118 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.315140 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:36Z","lastTransitionTime":"2026-03-09T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.338327 4723 generic.go:334] "Generic (PLEG): container finished" podID="2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1" containerID="cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7" exitCode=0 Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.338487 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" event={"ID":"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1","Type":"ContainerDied","Data":"cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.351278 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerStarted","Data":"2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.351361 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerStarted","Data":"093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.351382 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerStarted","Data":"74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.351400 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerStarted","Data":"db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.351418 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerStarted","Data":"0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.351435 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerStarted","Data":"84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.366580 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.378056 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.391996 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.409071 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.417731 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.417778 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.417791 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.417808 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.417820 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:36Z","lastTransitionTime":"2026-03-09T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.425412 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.438632 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.450143 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.458840 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.471497 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.483547 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.495702 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.509459 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rel
ease\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.520256 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.520286 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.520294 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.520307 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.520316 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:36Z","lastTransitionTime":"2026-03-09T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.532761 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b181954
6dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.622765 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.622818 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.622831 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.622916 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.622932 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:36Z","lastTransitionTime":"2026-03-09T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.726153 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.726217 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.726239 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.726267 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.726287 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:36Z","lastTransitionTime":"2026-03-09T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.829974 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.830042 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.830064 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.830093 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.830117 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:36Z","lastTransitionTime":"2026-03-09T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.880742 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.880763 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.880843 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:36 crc kubenswrapper[4723]: E0309 13:00:36.881016 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:36 crc kubenswrapper[4723]: E0309 13:00:36.881097 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:36 crc kubenswrapper[4723]: E0309 13:00:36.881168 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.899511 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.919073 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.936835 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.936919 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.936938 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.936963 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.936980 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:36Z","lastTransitionTime":"2026-03-09T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.936985 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.954981 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:36 crc kubenswrapper[4723]: I0309 13:00:36.977525 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-
09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.009444 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z 
is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.029131 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.039267 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.039646 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.039814 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.040003 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.040170 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:37Z","lastTransitionTime":"2026-03-09T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.044728 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.064261 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.082348 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.092631 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.102171 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.113634 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.143005 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.143053 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.143064 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.143084 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.143095 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:37Z","lastTransitionTime":"2026-03-09T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.245072 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.245113 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.245122 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.245135 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.245144 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:37Z","lastTransitionTime":"2026-03-09T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.347244 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.347318 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.347330 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.347342 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.347351 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:37Z","lastTransitionTime":"2026-03-09T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.356615 4723 generic.go:334] "Generic (PLEG): container finished" podID="2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1" containerID="44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5" exitCode=0 Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.356666 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" event={"ID":"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1","Type":"ContainerDied","Data":"44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5"} Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.368780 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.380677 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.392818 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.406202 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.428046 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"
cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.447886 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.451673 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.451714 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.451731 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.451752 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.451767 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:37Z","lastTransitionTime":"2026-03-09T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.463525 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.474585 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.490736 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.502302 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.512782 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.526188 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.539177 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.554279 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.554491 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.554500 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.554512 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.554520 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:37Z","lastTransitionTime":"2026-03-09T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.656564 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.656601 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.656611 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.656627 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.656638 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:37Z","lastTransitionTime":"2026-03-09T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.759346 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.759391 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.759406 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.759423 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.759435 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:37Z","lastTransitionTime":"2026-03-09T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.862408 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.862448 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.862458 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.862474 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.862484 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:37Z","lastTransitionTime":"2026-03-09T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.965013 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.965062 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.965076 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.965093 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:37 crc kubenswrapper[4723]: I0309 13:00:37.965105 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:37Z","lastTransitionTime":"2026-03-09T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.068436 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.068488 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.068506 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.068533 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.068551 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:38Z","lastTransitionTime":"2026-03-09T13:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.172196 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.172267 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.172284 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.172307 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.172325 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:38Z","lastTransitionTime":"2026-03-09T13:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.275526 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.275580 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.275596 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.275618 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.275636 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:38Z","lastTransitionTime":"2026-03-09T13:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.367809 4723 generic.go:334] "Generic (PLEG): container finished" podID="2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1" containerID="a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a" exitCode=0 Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.367887 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" event={"ID":"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1","Type":"ContainerDied","Data":"a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a"} Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.380626 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.380665 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.380679 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.380701 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.380716 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:38Z","lastTransitionTime":"2026-03-09T13:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.390153 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.409269 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.429955 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:38Z 
is after 2025-08-24T17:21:41Z" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.443678 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.455030 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.465926 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.475562 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.484833 4723 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.484950 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.484971 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.484991 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.485005 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:38Z","lastTransitionTime":"2026-03-09T13:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.489133 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.501824 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.513476 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.526874 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.536579 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.547123 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.587593 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.587625 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.587633 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.587647 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.587658 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:38Z","lastTransitionTime":"2026-03-09T13:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.691004 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.691071 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.691091 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.691122 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.691140 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:38Z","lastTransitionTime":"2026-03-09T13:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.793931 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.793962 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.793969 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.793982 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.793990 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:38Z","lastTransitionTime":"2026-03-09T13:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.880540 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.880610 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.880695 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:38 crc kubenswrapper[4723]: E0309 13:00:38.880785 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:38 crc kubenswrapper[4723]: E0309 13:00:38.881062 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:38 crc kubenswrapper[4723]: E0309 13:00:38.881176 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.896458 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.896496 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.896507 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.896523 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.896533 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:38Z","lastTransitionTime":"2026-03-09T13:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.999189 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.999242 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.999254 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.999308 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:38 crc kubenswrapper[4723]: I0309 13:00:38.999324 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:38Z","lastTransitionTime":"2026-03-09T13:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.102430 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.102496 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.102516 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.102554 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.102575 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:39Z","lastTransitionTime":"2026-03-09T13:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.206025 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.206083 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.206101 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.206127 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.206145 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:39Z","lastTransitionTime":"2026-03-09T13:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.308837 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.308929 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.308944 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.308965 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.308976 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:39Z","lastTransitionTime":"2026-03-09T13:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.375888 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerStarted","Data":"ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461"} Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.378795 4723 generic.go:334] "Generic (PLEG): container finished" podID="2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1" containerID="e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b" exitCode=0 Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.378845 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" event={"ID":"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1","Type":"ContainerDied","Data":"e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b"} Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.398525 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.410628 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.410652 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.410659 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.410672 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.410681 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:39Z","lastTransitionTime":"2026-03-09T13:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.411907 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.422272 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.431695 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.444059 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.457421 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.473025 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.481671 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.498097 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.517082 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.518040 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.518066 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.518075 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.518088 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.518098 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:39Z","lastTransitionTime":"2026-03-09T13:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.528305 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.545047 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64
811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.561197 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z 
is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.621135 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.621163 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.621172 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.621185 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.621194 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:39Z","lastTransitionTime":"2026-03-09T13:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.723631 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.724074 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.724135 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.724166 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.724188 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:39Z","lastTransitionTime":"2026-03-09T13:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.827806 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.827850 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.827888 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.827907 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.827918 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:39Z","lastTransitionTime":"2026-03-09T13:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.927385 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7dh57"] Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.928335 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7dh57" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.931528 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.931579 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.931596 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.931620 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.931638 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:39Z","lastTransitionTime":"2026-03-09T13:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.934150 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.934205 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.936075 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.937522 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.954947 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.973728 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.987828 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:39 crc kubenswrapper[4723]: I0309 13:00:39.997660 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.010237 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.034265 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.034316 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.034333 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.034357 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.034374 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:40Z","lastTransitionTime":"2026-03-09T13:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.037606 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b181954
6dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.052390 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.052853 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a-host\") pod \"node-ca-7dh57\" (UID: \"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\") " pod="openshift-image-registry/node-ca-7dh57" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.052925 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a-serviceca\") pod \"node-ca-7dh57\" (UID: \"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\") " pod="openshift-image-registry/node-ca-7dh57" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.052963 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nstjd\" (UniqueName: \"kubernetes.io/projected/b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a-kube-api-access-nstjd\") pod \"node-ca-7dh57\" (UID: \"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\") " pod="openshift-image-registry/node-ca-7dh57" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.066978 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.086188 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.100194 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.114243 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.133423 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.137627 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.137661 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.137672 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.137689 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.137702 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:40Z","lastTransitionTime":"2026-03-09T13:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.149957 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.153477 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a-host\") pod \"node-ca-7dh57\" (UID: \"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\") " pod="openshift-image-registry/node-ca-7dh57" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.153518 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a-serviceca\") pod \"node-ca-7dh57\" (UID: \"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\") " pod="openshift-image-registry/node-ca-7dh57" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.153538 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a-host\") pod \"node-ca-7dh57\" (UID: \"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\") " pod="openshift-image-registry/node-ca-7dh57" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.153547 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nstjd\" (UniqueName: \"kubernetes.io/projected/b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a-kube-api-access-nstjd\") pod \"node-ca-7dh57\" (UID: \"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\") " pod="openshift-image-registry/node-ca-7dh57" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.155270 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a-serviceca\") pod \"node-ca-7dh57\" (UID: \"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\") " pod="openshift-image-registry/node-ca-7dh57" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.168468 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.175904 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nstjd\" (UniqueName: \"kubernetes.io/projected/b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a-kube-api-access-nstjd\") pod \"node-ca-7dh57\" (UID: \"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\") " pod="openshift-image-registry/node-ca-7dh57" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.240734 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.240816 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.240829 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.240847 4723 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.240924 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:40Z","lastTransitionTime":"2026-03-09T13:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.253153 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7dh57" Mar 09 13:00:40 crc kubenswrapper[4723]: W0309 13:00:40.278097 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb334debc_b7ae_4bc1_8e6d_44e4bc86bb6a.slice/crio-763b6d090ac5e1f26142ae4d993f1d880d84f725bbb7d8f02b783bbb5f8541f7 WatchSource:0}: Error finding container 763b6d090ac5e1f26142ae4d993f1d880d84f725bbb7d8f02b783bbb5f8541f7: Status 404 returned error can't find the container with id 763b6d090ac5e1f26142ae4d993f1d880d84f725bbb7d8f02b783bbb5f8541f7 Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.344101 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.344159 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.344173 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.344196 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.344213 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:40Z","lastTransitionTime":"2026-03-09T13:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.384228 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7dh57" event={"ID":"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a","Type":"ContainerStarted","Data":"763b6d090ac5e1f26142ae4d993f1d880d84f725bbb7d8f02b783bbb5f8541f7"} Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.393757 4723 generic.go:334] "Generic (PLEG): container finished" podID="2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1" containerID="92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523" exitCode=0 Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.393810 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" event={"ID":"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1","Type":"ContainerDied","Data":"92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523"} Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.405030 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b
2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.416922 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.427893 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.437515 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.450016 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.450057 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.450066 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.450084 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.450094 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:40Z","lastTransitionTime":"2026-03-09T13:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.450172 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.467655 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"
host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.478484 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.490988 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.501950 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.551333 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.553830 4723 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.553884 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.553892 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.553906 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.553914 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:40Z","lastTransitionTime":"2026-03-09T13:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.564374 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.575963 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.587311 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.600710 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir
\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.656759 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.656811 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.656824 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.656848 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.656892 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:40Z","lastTransitionTime":"2026-03-09T13:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.759832 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.759879 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.759891 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.759907 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.759919 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:40Z","lastTransitionTime":"2026-03-09T13:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.862752 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.862783 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.862794 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.862809 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.862821 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:40Z","lastTransitionTime":"2026-03-09T13:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.880401 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.880441 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.880461 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:40 crc kubenswrapper[4723]: E0309 13:00:40.880561 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:40 crc kubenswrapper[4723]: E0309 13:00:40.880664 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:40 crc kubenswrapper[4723]: E0309 13:00:40.880739 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.965419 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.965457 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.965469 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.965486 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:40 crc kubenswrapper[4723]: I0309 13:00:40.965498 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:40Z","lastTransitionTime":"2026-03-09T13:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.068793 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.068886 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.068904 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.068929 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.068950 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:41Z","lastTransitionTime":"2026-03-09T13:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.172903 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.172958 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.172975 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.173014 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.173033 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:41Z","lastTransitionTime":"2026-03-09T13:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.275208 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.275242 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.275250 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.275263 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.275274 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:41Z","lastTransitionTime":"2026-03-09T13:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.377449 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.377525 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.377543 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.377569 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.377590 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:41Z","lastTransitionTime":"2026-03-09T13:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.398971 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7dh57" event={"ID":"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a","Type":"ContainerStarted","Data":"804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212"} Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.405154 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerStarted","Data":"301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64"} Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.407110 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.412409 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" event={"ID":"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1","Type":"ContainerStarted","Data":"c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a"} Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.413426 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.427030 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.436733 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.441973 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\
\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.465375 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32
be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.480679 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.480724 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.480740 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.480761 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.480997 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:41Z","lastTransitionTime":"2026-03-09T13:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.484338 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.499450 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.516082 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.526452 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.537106 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.552434 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.564824 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.576802 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.600165 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.600210 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.600220 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.600234 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.600243 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:41Z","lastTransitionTime":"2026-03-09T13:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.611990 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.651409 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"
host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.663794 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 
13:00:41.680189 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.691998 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.703238 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.703285 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.703297 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.703317 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.703329 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:41Z","lastTransitionTime":"2026-03-09T13:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.704316 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.719898 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.740491 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.755745 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.770561 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.782192 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.792271 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.806054 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.806092 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.806104 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.806119 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.806130 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:41Z","lastTransitionTime":"2026-03-09T13:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.808384 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.820994 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.836437 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.846658 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.907837 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.907892 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.907904 4723 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.907922 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:41 crc kubenswrapper[4723]: I0309 13:00:41.907933 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:41Z","lastTransitionTime":"2026-03-09T13:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.010511 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.010569 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.010582 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.010599 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.010612 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:42Z","lastTransitionTime":"2026-03-09T13:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.113690 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.113754 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.113773 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.113802 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.113819 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:42Z","lastTransitionTime":"2026-03-09T13:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.217335 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.217401 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.217419 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.217444 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.217464 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:42Z","lastTransitionTime":"2026-03-09T13:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.319652 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.319721 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.319744 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.319775 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.319796 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:42Z","lastTransitionTime":"2026-03-09T13:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.417073 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.417172 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.422463 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.422538 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.422556 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.422578 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.422596 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:42Z","lastTransitionTime":"2026-03-09T13:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.445984 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.458523 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.472494 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.481521 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.494841 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.510426 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.525565 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.525603 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.525619 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.525642 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.525660 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:42Z","lastTransitionTime":"2026-03-09T13:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.527215 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.543850 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.567243 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.587516 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.602693 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.616231 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.628159 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.628270 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.628294 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.628324 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.628346 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:42Z","lastTransitionTime":"2026-03-09T13:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.633531 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.655566 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator
@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting 
failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.675673 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.731486 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.731536 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.731578 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.731602 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.731624 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:42Z","lastTransitionTime":"2026-03-09T13:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.834970 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.835050 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.835073 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.835103 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.835126 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:42Z","lastTransitionTime":"2026-03-09T13:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.879911 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.880004 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:42 crc kubenswrapper[4723]: E0309 13:00:42.880078 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:42 crc kubenswrapper[4723]: E0309 13:00:42.880223 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.880020 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:42 crc kubenswrapper[4723]: E0309 13:00:42.880389 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.938323 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.938377 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.938395 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.938417 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:42 crc kubenswrapper[4723]: I0309 13:00:42.938432 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:42Z","lastTransitionTime":"2026-03-09T13:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.040755 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.040797 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.040808 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.040824 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.040835 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:43Z","lastTransitionTime":"2026-03-09T13:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.143454 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.143506 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.143518 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.143541 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.143554 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:43Z","lastTransitionTime":"2026-03-09T13:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.246394 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.246441 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.246458 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.246481 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.246500 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:43Z","lastTransitionTime":"2026-03-09T13:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.348682 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.348712 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.348720 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.348732 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.348741 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:43Z","lastTransitionTime":"2026-03-09T13:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.451574 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.451915 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.451926 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.451945 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.451957 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:43Z","lastTransitionTime":"2026-03-09T13:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.554082 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.554123 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.554133 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.554148 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.554166 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:43Z","lastTransitionTime":"2026-03-09T13:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.656572 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.656634 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.656653 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.656676 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.656697 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:43Z","lastTransitionTime":"2026-03-09T13:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.759980 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.760064 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.760079 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.760103 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.760120 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:43Z","lastTransitionTime":"2026-03-09T13:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.863560 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.863629 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.863643 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.863659 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.863674 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:43Z","lastTransitionTime":"2026-03-09T13:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.967229 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.967295 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.967317 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.967347 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:43 crc kubenswrapper[4723]: I0309 13:00:43.967368 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:43Z","lastTransitionTime":"2026-03-09T13:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.071194 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.071263 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.071280 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.071304 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.071323 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:44Z","lastTransitionTime":"2026-03-09T13:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.174549 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.174639 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.174672 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.174703 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.174725 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:44Z","lastTransitionTime":"2026-03-09T13:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.278044 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.278128 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.278146 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.278689 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.278755 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:44Z","lastTransitionTime":"2026-03-09T13:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.381943 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.382000 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.382013 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.382031 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.382045 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:44Z","lastTransitionTime":"2026-03-09T13:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.427088 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovnkube-controller/0.log" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.431280 4723 generic.go:334] "Generic (PLEG): container finished" podID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerID="301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64" exitCode=1 Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.431351 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerDied","Data":"301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64"} Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.432470 4723 scope.go:117] "RemoveContainer" containerID="301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.460638 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.484605 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.485159 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.485197 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.485208 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.485225 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.485237 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:44Z","lastTransitionTime":"2026-03-09T13:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.498678 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.511581 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.523273 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.538320 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.552709 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.565905 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.578746 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.587944 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.587976 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.587988 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.588005 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.588017 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:44Z","lastTransitionTime":"2026-03-09T13:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.597408 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.611394 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
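[Editor's note] The statuses above show three distinct exit codes: 0 on the Completed init containers, 255 on the crashed kube-apiserver-check-endpoints container, and the repeated 137 on containers that could not be located after the restart. By the usual 128+N convention, 137 is 128+9: the process was killed by SIGKILL. A one-screen Go decoder:

    package main

    import "fmt"

    // Decodes the exitCode values seen in the pod statuses above.
    // By shell/runtime convention, codes above 128 mean "killed by
    // signal (code-128)", so 137 => signal 9 (SIGKILL).
    func main() {
        for _, code := range []int{0, 137, 255} {
            switch {
            case code == 0:
                fmt.Println(code, "=> completed normally")
            case code > 128:
                fmt.Printf("%d => killed by signal %d\n", code, code-128)
            default:
                fmt.Println(code, "=> exited with an error status")
            }
        }
    }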
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.625825 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.645939 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.666255 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:43Z\\\",\\\"message\\\":\\\"/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486509 6619 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486575 6619 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:00:43.486623 6619 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:00:43.486677 6619 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:00:43.486659 6619 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:00:43.486754 6619 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486601 6619 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 13:00:43.486879 6619 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:00:43.486900 6619 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.487109 6619 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.487828 6619 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.689778 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.689822 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.689835 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.689851 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.689879 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:44Z","lastTransitionTime":"2026-03-09T13:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.792609 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.792660 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.792682 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.792728 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.792746 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:44Z","lastTransitionTime":"2026-03-09T13:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.880814 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.880841 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:44 crc kubenswrapper[4723]: E0309 13:00:44.880952 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.881072 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:44 crc kubenswrapper[4723]: E0309 13:00:44.881202 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:44 crc kubenswrapper[4723]: E0309 13:00:44.881260 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.894523 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.894554 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.894563 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.894575 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.894584 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:44Z","lastTransitionTime":"2026-03-09T13:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.996461 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.996496 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.996504 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.996520 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:44 crc kubenswrapper[4723]: I0309 13:00:44.996530 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:44Z","lastTransitionTime":"2026-03-09T13:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.049379 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.049420 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.049432 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.049451 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.049463 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:45Z","lastTransitionTime":"2026-03-09T13:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: E0309 13:00:45.062692 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.066175 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.066216 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.066226 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.066241 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.066250 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:45Z","lastTransitionTime":"2026-03-09T13:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: E0309 13:00:45.078382 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.081559 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.081608 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.081625 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.081647 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.081664 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:45Z","lastTransitionTime":"2026-03-09T13:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: E0309 13:00:45.097404 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.100740 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.100789 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.100806 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.100829 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.100846 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:45Z","lastTransitionTime":"2026-03-09T13:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: E0309 13:00:45.113432 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.121179 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.121214 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.121226 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.121243 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.121252 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:45Z","lastTransitionTime":"2026-03-09T13:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: E0309 13:00:45.135455 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: E0309 13:00:45.135748 4723 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.137567 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.137610 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.137627 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.137648 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.137666 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:45Z","lastTransitionTime":"2026-03-09T13:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.240360 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.240390 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.240397 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.240411 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.240421 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:45Z","lastTransitionTime":"2026-03-09T13:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.342695 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.342737 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.342765 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.342781 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.342791 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:45Z","lastTransitionTime":"2026-03-09T13:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.438697 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovnkube-controller/0.log" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.442363 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerStarted","Data":"e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0"} Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.442964 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.444774 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.444908 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.445025 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.445055 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.445072 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:45Z","lastTransitionTime":"2026-03-09T13:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.459734 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.479990 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.495708 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.509612 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.526960 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.543787 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.548352 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.548389 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.548401 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.548417 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.548429 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:45Z","lastTransitionTime":"2026-03-09T13:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.558574 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.571965 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.588117 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.603052 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.622513 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.652228 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.652285 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.652304 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.652326 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.652344 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:45Z","lastTransitionTime":"2026-03-09T13:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.653660 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:43Z\\\",\\\"message\\\":\\\"/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486509 6619 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486575 6619 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:00:43.486623 6619 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:00:43.486677 6619 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:00:43.486659 6619 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:00:43.486754 6619 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486601 6619 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 13:00:43.486879 6619 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:00:43.486900 6619 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.487109 6619 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.487828 6619 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.669799 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.690819 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\
\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.755304 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.755407 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.755434 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.755463 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.755489 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:45Z","lastTransitionTime":"2026-03-09T13:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.858527 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.858579 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.858596 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.858619 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.858636 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:45Z","lastTransitionTime":"2026-03-09T13:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.958618 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt"] Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.959317 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.961813 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.961911 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.961938 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.961967 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.961991 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:45Z","lastTransitionTime":"2026-03-09T13:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.964020 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.967472 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 13:00:45 crc kubenswrapper[4723]: I0309 13:00:45.984525 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.010225 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.016460 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dff98527-c16a-48b6-94a8-577172ec6dce-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mbhtt\" (UID: \"dff98527-c16a-48b6-94a8-577172ec6dce\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.016523 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkv4n\" (UniqueName: \"kubernetes.io/projected/dff98527-c16a-48b6-94a8-577172ec6dce-kube-api-access-gkv4n\") pod \"ovnkube-control-plane-749d76644c-mbhtt\" (UID: \"dff98527-c16a-48b6-94a8-577172ec6dce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.016565 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dff98527-c16a-48b6-94a8-577172ec6dce-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mbhtt\" (UID: \"dff98527-c16a-48b6-94a8-577172ec6dce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.016608 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dff98527-c16a-48b6-94a8-577172ec6dce-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mbhtt\" (UID: \"dff98527-c16a-48b6-94a8-577172ec6dce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.041051 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af0019
55f19f597597eef3ee7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:43Z\\\",\\\"message\\\":\\\"/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486509 6619 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486575 6619 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:00:43.486623 6619 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:00:43.486677 6619 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:00:43.486659 6619 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:00:43.486754 6619 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486601 6619 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 13:00:43.486879 6619 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:00:43.486900 6619 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.487109 6619 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.487828 6619 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.066089 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.066141 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.066157 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.066180 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.066200 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:46Z","lastTransitionTime":"2026-03-09T13:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.068653 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.090693 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.111155 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.118065 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dff98527-c16a-48b6-94a8-577172ec6dce-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mbhtt\" (UID: \"dff98527-c16a-48b6-94a8-577172ec6dce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.118115 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkv4n\" (UniqueName: \"kubernetes.io/projected/dff98527-c16a-48b6-94a8-577172ec6dce-kube-api-access-gkv4n\") pod \"ovnkube-control-plane-749d76644c-mbhtt\" (UID: \"dff98527-c16a-48b6-94a8-577172ec6dce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.118143 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dff98527-c16a-48b6-94a8-577172ec6dce-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mbhtt\" (UID: \"dff98527-c16a-48b6-94a8-577172ec6dce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.118164 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dff98527-c16a-48b6-94a8-577172ec6dce-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mbhtt\" (UID: \"dff98527-c16a-48b6-94a8-577172ec6dce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.119096 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dff98527-c16a-48b6-94a8-577172ec6dce-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mbhtt\" (UID: \"dff98527-c16a-48b6-94a8-577172ec6dce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.119266 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dff98527-c16a-48b6-94a8-577172ec6dce-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mbhtt\" (UID: \"dff98527-c16a-48b6-94a8-577172ec6dce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.126344 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.134551 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dff98527-c16a-48b6-94a8-577172ec6dce-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mbhtt\" (UID: \"dff98527-c16a-48b6-94a8-577172ec6dce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.138511 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkv4n\" (UniqueName: \"kubernetes.io/projected/dff98527-c16a-48b6-94a8-577172ec6dce-kube-api-access-gkv4n\") pod \"ovnkube-control-plane-749d76644c-mbhtt\" (UID: \"dff98527-c16a-48b6-94a8-577172ec6dce\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.147488 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.160777 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.170048 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.170086 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.170094 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.170108 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.170119 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:46Z","lastTransitionTime":"2026-03-09T13:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.177710 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"moun
tPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.188370 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.200289 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.216426 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.230497 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.242903 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.272561 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.272602 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.272613 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.272629 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.272640 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:46Z","lastTransitionTime":"2026-03-09T13:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.281279 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" Mar 09 13:00:46 crc kubenswrapper[4723]: W0309 13:00:46.296832 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddff98527_c16a_48b6_94a8_577172ec6dce.slice/crio-89f2cea053a42ec98a40855e7f70ddb886681cd5a2fcfe22d4bf20503490e51f WatchSource:0}: Error finding container 89f2cea053a42ec98a40855e7f70ddb886681cd5a2fcfe22d4bf20503490e51f: Status 404 returned error can't find the container with id 89f2cea053a42ec98a40855e7f70ddb886681cd5a2fcfe22d4bf20503490e51f Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.374649 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.374677 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.374685 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.374701 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.374709 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:46Z","lastTransitionTime":"2026-03-09T13:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.453016 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" event={"ID":"dff98527-c16a-48b6-94a8-577172ec6dce","Type":"ContainerStarted","Data":"89f2cea053a42ec98a40855e7f70ddb886681cd5a2fcfe22d4bf20503490e51f"} Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.454899 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovnkube-controller/1.log" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.455394 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovnkube-controller/0.log" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.457679 4723 generic.go:334] "Generic (PLEG): container finished" podID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerID="e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0" exitCode=1 Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.457708 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerDied","Data":"e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0"} Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.457731 4723 scope.go:117] "RemoveContainer" containerID="301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.458311 4723 scope.go:117] "RemoveContainer" containerID="e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0" Mar 09 13:00:46 crc kubenswrapper[4723]: E0309 13:00:46.458484 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.472473 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.480291 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.480445 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:46 crc 
kubenswrapper[4723]: I0309 13:00:46.480522 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.480595 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.480654 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:46Z","lastTransitionTime":"2026-03-09T13:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.494998 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af0019
55f19f597597eef3ee7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:43Z\\\",\\\"message\\\":\\\"/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486509 6619 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486575 6619 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:00:43.486623 6619 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:00:43.486677 6619 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:00:43.486659 6619 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:00:43.486754 6619 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486601 6619 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 13:00:43.486879 6619 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:00:43.486900 6619 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.487109 6619 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.487828 6619 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"/127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z]\\\\nI0309 13:00:45.329285 6759 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0309 13:00:45.329184 6759 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"
192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.508245 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.520317 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.536295 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.550312 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.562773 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.571468 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.582692 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.582817 4723 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.582832 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.582841 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.582872 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.582885 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:46Z","lastTransitionTime":"2026-03-09T13:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.594967 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.605528 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.618637 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.630664 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.641771 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.654190 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.685253 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.685289 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.685302 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.685318 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.685331 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:46Z","lastTransitionTime":"2026-03-09T13:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.692622 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-lztcd"] Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.693042 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:46 crc kubenswrapper[4723]: E0309 13:00:46.693098 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.706330 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.718280 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.729344 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.739060 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.748722 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.758138 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.766455 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.777742 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 
2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.786796 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.786837 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.786848 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.786877 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.786889 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:46Z","lastTransitionTime":"2026-03-09T13:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.794578 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af0019
55f19f597597eef3ee7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:43Z\\\",\\\"message\\\":\\\"/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486509 6619 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486575 6619 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:00:43.486623 6619 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:00:43.486677 6619 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:00:43.486659 6619 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:00:43.486754 6619 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486601 6619 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 13:00:43.486879 6619 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:00:43.486900 6619 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.487109 6619 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.487828 6619 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"/127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z]\\\\nI0309 13:00:45.329285 6759 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0309 13:00:45.329184 6759 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"
192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.806106 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.817575 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.826080 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs\") pod \"network-metrics-daemon-lztcd\" (UID: \"f09eae28-36d6-4c16-8aab-bbd93934f921\") " pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.826133 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gshcc\" (UniqueName: \"kubernetes.io/projected/f09eae28-36d6-4c16-8aab-bbd93934f921-kube-api-access-gshcc\") pod \"network-metrics-daemon-lztcd\" (UID: \"f09eae28-36d6-4c16-8aab-bbd93934f921\") " pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.830190 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.842731 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.855643 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.868231 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.879895 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.879918 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.879980 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:46 crc kubenswrapper[4723]: E0309 13:00:46.880020 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:46 crc kubenswrapper[4723]: E0309 13:00:46.880074 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:46 crc kubenswrapper[4723]: E0309 13:00:46.880185 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.880832 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: E0309 13:00:46.887257 4723 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 
09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.892212 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.901471 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.914460 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.927425 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs\") pod \"network-metrics-daemon-lztcd\" (UID: \"f09eae28-36d6-4c16-8aab-bbd93934f921\") " pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.927546 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gshcc\" (UniqueName: \"kubernetes.io/projected/f09eae28-36d6-4c16-8aab-bbd93934f921-kube-api-access-gshcc\") pod \"network-metrics-daemon-lztcd\" (UID: \"f09eae28-36d6-4c16-8aab-bbd93934f921\") " pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:46 crc kubenswrapper[4723]: E0309 13:00:46.927628 4723 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 
13:00:46 crc kubenswrapper[4723]: E0309 13:00:46.927724 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs podName:f09eae28-36d6-4c16-8aab-bbd93934f921 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:47.427704902 +0000 UTC m=+121.442172442 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs") pod "network-metrics-daemon-lztcd" (UID: "f09eae28-36d6-4c16-8aab-bbd93934f921") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.932311 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.943753 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.944590 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gshcc\" (UniqueName: \"kubernetes.io/projected/f09eae28-36d6-4c16-8aab-bbd93934f921-kube-api-access-gshcc\") pod \"network-metrics-daemon-lztcd\" (UID: \"f09eae28-36d6-4c16-8aab-bbd93934f921\") " pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.956995 4723 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.966485 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: E0309 13:00:46.973621 4723 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.978556 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\
\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:46 crc kubenswrapper[4723]: I0309 13:00:46.989175 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.000119 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.009360 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.019255 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.029239 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.040290 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.056815 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.073132 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:43Z\\\",\\\"message\\\":\\\"/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486509 6619 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486575 6619 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:00:43.486623 6619 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:00:43.486677 6619 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:00:43.486659 6619 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:00:43.486754 6619 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486601 6619 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 13:00:43.486879 6619 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:00:43.486900 6619 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.487109 6619 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.487828 6619 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"/127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 
2025-08-24T17:21:41Z]\\\\nI0309 13:00:45.329285 6759 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0309 13:00:45.329184 6759 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: E0309 13:00:47.435090 4723 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:00:47 crc kubenswrapper[4723]: E0309 13:00:47.435764 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs podName:f09eae28-36d6-4c16-8aab-bbd93934f921 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:48.435729114 +0000 UTC m=+122.450196694 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs") pod "network-metrics-daemon-lztcd" (UID: "f09eae28-36d6-4c16-8aab-bbd93934f921") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.434852 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs\") pod \"network-metrics-daemon-lztcd\" (UID: \"f09eae28-36d6-4c16-8aab-bbd93934f921\") " pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.464430 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" event={"ID":"dff98527-c16a-48b6-94a8-577172ec6dce","Type":"ContainerStarted","Data":"fc0517301d8c26db37c831773237c0772b14e01392c032f864200d27ec2207ef"} Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.464562 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" event={"ID":"dff98527-c16a-48b6-94a8-577172ec6dce","Type":"ContainerStarted","Data":"cefc2989387f88b8b140c8aed05c20406db5392c88da114b489cbd7c177f98ae"} Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.468379 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovnkube-controller/1.log" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.474531 4723 scope.go:117] "RemoveContainer" containerID="e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0" Mar 09 13:00:47 crc kubenswrapper[4723]: E0309 13:00:47.474831 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.485006 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.497502 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.511891 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.527837 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.542404 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cefc2989387f88b8b140c8aed05c20406db5392c88da114b489cbd7c177f98ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0517301d8c26db37c831773237c0772b14e01392c032f864200d27ec2207ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.553801 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.568961 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.592479 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301aa9cafe3650778050a75658dae5d8b24f6ec90e8aba20d203ffad996c8c64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:43Z\\\",\\\"message\\\":\\\"/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486509 6619 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486575 6619 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:00:43.486623 6619 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:00:43.486677 6619 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:00:43.486659 6619 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:00:43.486754 6619 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.486601 6619 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 13:00:43.486879 6619 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:00:43.486900 6619 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.487109 6619 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:00:43.487828 6619 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"/127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 
2025-08-24T17:21:41Z]\\\\nI0309 13:00:45.329285 6759 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0309 13:00:45.329184 6759 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.608026 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.622315 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.634047 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.646364 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.658746 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.668548 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.677814 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.688778 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.704008 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.715389 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cefc2989387f88b8b140c8aed05c20406db5392c88da114b489cbd7c177f98ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0517301d8c26db37c831773237c0772b14e01392c032f864200d27ec2207ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 
13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.724462 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.737905 4723 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.754153 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af0019
55f19f597597eef3ee7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"/127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z]\\\\nI0309 13:00:45.329285 6759 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0309 13:00:45.329184 6759 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.764625 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.783000 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"re
ason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.794321 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.808227 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.817822 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.825652 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.834319 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.843097 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.855120 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.868230 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:47 crc kubenswrapper[4723]: I0309 13:00:47.876774 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:48 crc kubenswrapper[4723]: I0309 13:00:48.446638 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs\") pod \"network-metrics-daemon-lztcd\" (UID: \"f09eae28-36d6-4c16-8aab-bbd93934f921\") " pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.446789 4723 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.447165 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs podName:f09eae28-36d6-4c16-8aab-bbd93934f921 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:50.447142442 +0000 UTC m=+124.461609982 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs") pod "network-metrics-daemon-lztcd" (UID: "f09eae28-36d6-4c16-8aab-bbd93934f921") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:00:48 crc kubenswrapper[4723]: I0309 13:00:48.649944 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:00:48 crc kubenswrapper[4723]: I0309 13:00:48.650119 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.650169 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:20.650136268 +0000 UTC m=+154.664603848 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.650215 4723 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.650272 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:01:20.650256021 +0000 UTC m=+154.664723591 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:00:48 crc kubenswrapper[4723]: I0309 13:00:48.750936 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:48 crc kubenswrapper[4723]: I0309 13:00:48.751035 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:48 crc kubenswrapper[4723]: I0309 13:00:48.751078 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.751269 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.751293 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.751313 4723 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.751376 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:01:20.751354678 +0000 UTC m=+154.765822258 (durationBeforeRetry 32s). 
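[Editor's note] The UnmountVolume.TearDown failure above is a registration problem of a different kind: CSI drivers announce themselves to the kubelet through a socket in its plugin registry, and until kubevirt.io.hostpath-provisioner re-registers, every TearDownAt attempt fails with "not found in the list of registered CSI drivers". A sketch of what checking that registry might look like, assuming the conventional kubelet default path /var/lib/kubelet/plugins_registry (the path itself is not quoted in this log):

```go
package main

import (
	"fmt"
	"os"
)

// Lists plugin registration sockets under the kubelet's plugin
// registry (conventional default path, assumed here). Until the
// hostpath provisioner's socket reappears, TearDown keeps failing
// with "not found in the list of registered CSI drivers".
func main() {
	entries, err := os.ReadDir("/var/lib/kubelet/plugins_registry")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	if len(entries) == 0 {
		fmt.Println("no CSI drivers registered")
	}
	for _, e := range entries {
		fmt.Println("registered:", e.Name())
	}
}
```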
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.751898 4723 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.751993 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:01:20.751971544 +0000 UTC m=+154.766439124 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.752347 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.752595 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.752727 4723 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.753024 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:01:20.752993419 +0000 UTC m=+154.767460989 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:00:48 crc kubenswrapper[4723]: I0309 13:00:48.880558 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:48 crc kubenswrapper[4723]: I0309 13:00:48.880623 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:48 crc kubenswrapper[4723]: I0309 13:00:48.880586 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:48 crc kubenswrapper[4723]: I0309 13:00:48.881479 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.881645 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.881837 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.882055 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:00:48 crc kubenswrapper[4723]: E0309 13:00:48.882194 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:48 crc kubenswrapper[4723]: I0309 13:00:48.883044 4723 scope.go:117] "RemoveContainer" containerID="093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.482924 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.486284 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9"} Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.486887 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.505336 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cefc2989387f88b8b140c8aed05c20406db5392c88da114b489cbd7c177f98ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0517301d8c26db37c831773237c0772b14e01392c032f864200d27ec2207ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.518110 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.538991 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.569937 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
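[Editor's note] The lastState recorded for these recreated containers (exitCode 137, reason ContainerStatusUnknown) uses the usual shell convention for signal deaths: 128 plus the signal number, so 137 means SIGKILL (signal 9), the runtime's stand-in for a container it could no longer observe when the pod was deleted. A tiny decoder of that convention, nothing cluster-specific:

```go
package main

import "fmt"

// Exit codes above 128 conventionally mean the process died from
// signal (code - 128): 137 -> SIGKILL (9), matching the
// "could not be located when the pod was deleted" statuses above.
func main() {
	code := 137
	if code > 128 {
		fmt.Printf("exit code %d => terminated by signal %d\n", code, code-128)
		return
	}
	fmt.Printf("exit code %d => normal exit status\n", code)
}
```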
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.591876 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.613608 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.637826 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"/127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z]\\\\nI0309 13:00:45.329285 6759 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0309 13:00:45.329184 6759 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.651812 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.665187 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.674749 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.684270 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.692285 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is 
after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.703936 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.717838 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.726936 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:49 crc kubenswrapper[4723]: I0309 13:00:49.737795 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:50 crc kubenswrapper[4723]: I0309 13:00:50.471123 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs\") pod \"network-metrics-daemon-lztcd\" (UID: \"f09eae28-36d6-4c16-8aab-bbd93934f921\") " pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:50 crc kubenswrapper[4723]: E0309 13:00:50.471285 4723 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:00:50 crc kubenswrapper[4723]: E0309 13:00:50.471338 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs podName:f09eae28-36d6-4c16-8aab-bbd93934f921 nodeName:}" failed. No retries permitted until 2026-03-09 13:00:54.471321682 +0000 UTC m=+128.485789222 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs") pod "network-metrics-daemon-lztcd" (UID: "f09eae28-36d6-4c16-8aab-bbd93934f921") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:00:50 crc kubenswrapper[4723]: I0309 13:00:50.880142 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:50 crc kubenswrapper[4723]: I0309 13:00:50.880174 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:50 crc kubenswrapper[4723]: E0309 13:00:50.880265 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:50 crc kubenswrapper[4723]: I0309 13:00:50.880291 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:50 crc kubenswrapper[4723]: I0309 13:00:50.880348 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:50 crc kubenswrapper[4723]: E0309 13:00:50.880509 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:00:50 crc kubenswrapper[4723]: E0309 13:00:50.880652 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:50 crc kubenswrapper[4723]: E0309 13:00:50.880775 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:51 crc kubenswrapper[4723]: E0309 13:00:51.975388 4723 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:00:52 crc kubenswrapper[4723]: I0309 13:00:52.880150 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:52 crc kubenswrapper[4723]: I0309 13:00:52.880257 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:52 crc kubenswrapper[4723]: I0309 13:00:52.880296 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:52 crc kubenswrapper[4723]: E0309 13:00:52.880309 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:52 crc kubenswrapper[4723]: I0309 13:00:52.880401 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:52 crc kubenswrapper[4723]: E0309 13:00:52.880501 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:00:52 crc kubenswrapper[4723]: E0309 13:00:52.880620 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:52 crc kubenswrapper[4723]: E0309 13:00:52.880760 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:54 crc kubenswrapper[4723]: I0309 13:00:54.510922 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs\") pod \"network-metrics-daemon-lztcd\" (UID: \"f09eae28-36d6-4c16-8aab-bbd93934f921\") " pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:54 crc kubenswrapper[4723]: E0309 13:00:54.511097 4723 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:00:54 crc kubenswrapper[4723]: E0309 13:00:54.511174 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs podName:f09eae28-36d6-4c16-8aab-bbd93934f921 nodeName:}" failed. No retries permitted until 2026-03-09 13:01:02.511152399 +0000 UTC m=+136.525619979 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs") pod "network-metrics-daemon-lztcd" (UID: "f09eae28-36d6-4c16-8aab-bbd93934f921") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:00:54 crc kubenswrapper[4723]: I0309 13:00:54.880231 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:54 crc kubenswrapper[4723]: E0309 13:00:54.880686 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:54 crc kubenswrapper[4723]: I0309 13:00:54.881041 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:54 crc kubenswrapper[4723]: E0309 13:00:54.881203 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:54 crc kubenswrapper[4723]: I0309 13:00:54.881437 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:54 crc kubenswrapper[4723]: E0309 13:00:54.881615 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:54 crc kubenswrapper[4723]: I0309 13:00:54.881989 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:54 crc kubenswrapper[4723]: E0309 13:00:54.882168 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.219704 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.220159 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.220366 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.220566 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.220716 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:55Z","lastTransitionTime":"2026-03-09T13:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:00:55 crc kubenswrapper[4723]: E0309 13:00:55.235185 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.240296 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.240500 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.240600 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.240705 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.240796 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:55Z","lastTransitionTime":"2026-03-09T13:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:55 crc kubenswrapper[4723]: E0309 13:00:55.257650 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... image list identical to the 13:00:55.235185 attempt above, elided ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.262973 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.263028 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.263051 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.263083 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.263105 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:55Z","lastTransitionTime":"2026-03-09T13:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:55 crc kubenswrapper[4723]: E0309 13:00:55.279556 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.283942 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.284106 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.284250 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.284389 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.284521 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:55Z","lastTransitionTime":"2026-03-09T13:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:55 crc kubenswrapper[4723]: E0309 13:00:55.299022 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.303442 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.303611 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.303733 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.303890 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.304022 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:00:55Z","lastTransitionTime":"2026-03-09T13:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:00:55 crc kubenswrapper[4723]: E0309 13:00:55.317113 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:55 crc kubenswrapper[4723]: E0309 13:00:55.317432 4723 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:00:55 crc kubenswrapper[4723]: I0309 13:00:55.890636 4723 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 09 13:00:56 crc kubenswrapper[4723]: I0309 13:00:56.880145 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:56 crc kubenswrapper[4723]: E0309 13:00:56.880272 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:56 crc kubenswrapper[4723]: I0309 13:00:56.880438 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:56 crc kubenswrapper[4723]: E0309 13:00:56.880500 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:56 crc kubenswrapper[4723]: I0309 13:00:56.880778 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:56 crc kubenswrapper[4723]: E0309 13:00:56.880830 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:56 crc kubenswrapper[4723]: I0309 13:00:56.880947 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:56 crc kubenswrapper[4723]: E0309 13:00:56.880998 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:00:56 crc kubenswrapper[4723]: I0309 13:00:56.899750 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e3a7da2-df7a-46e6-88cf-14c2b5adbf46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b5ea090dbcca9cb0683626e5137a3e9aea8749a1deb31339e45dc6db3336f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c342ce2d9033896e1d8f3c5428e8b03220c003d3c746103ac17b5cd564663091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0777edbfbe04e20d91d97fb2daa6bc7ef67cbf56cdd1fdb20849350a5f0d4f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:56 crc kubenswrapper[4723]: I0309 13:00:56.916840 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\
\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:56 crc kubenswrapper[4723]: I0309 13:00:56.930233 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:56 crc kubenswrapper[4723]: I0309 13:00:56.946342 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:56 crc kubenswrapper[4723]: I0309 13:00:56.965299 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:56 crc kubenswrapper[4723]: E0309 13:00:56.976264 4723 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:00:56 crc kubenswrapper[4723]: I0309 13:00:56.989395 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:57 crc kubenswrapper[4723]: I0309 13:00:57.002834 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cefc2989387f88b8b140c8aed05c20406db5392c88da114b489cbd7c177f98ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0517301d8c26db37c831773237c0772b14e01392c032f864200d27ec2207ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 
13:00:57 crc kubenswrapper[4723]: I0309 13:00:57.013809 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:57 crc kubenswrapper[4723]: I0309 13:00:57.034572 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:57 crc kubenswrapper[4723]: I0309 13:00:57.063995 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"/127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z]\\\\nI0309 13:00:45.329285 6759 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0309 13:00:45.329184 6759 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:57 crc kubenswrapper[4723]: I0309 13:00:57.083174 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:57 crc kubenswrapper[4723]: I0309 13:00:57.102731 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:57 crc kubenswrapper[4723]: I0309 13:00:57.123607 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:57 crc kubenswrapper[4723]: I0309 13:00:57.149215 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:57 crc kubenswrapper[4723]: I0309 13:00:57.169493 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:57 crc kubenswrapper[4723]: I0309 13:00:57.186522 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:57 crc kubenswrapper[4723]: I0309 13:00:57.201352 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:57Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:58 crc kubenswrapper[4723]: I0309 13:00:58.880022 4723 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:00:58 crc kubenswrapper[4723]: I0309 13:00:58.880082 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:00:58 crc kubenswrapper[4723]: I0309 13:00:58.880137 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:00:58 crc kubenswrapper[4723]: E0309 13:00:58.881084 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:00:58 crc kubenswrapper[4723]: I0309 13:00:58.880247 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:00:58 crc kubenswrapper[4723]: E0309 13:00:58.881521 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:00:58 crc kubenswrapper[4723]: E0309 13:00:58.881669 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:00:58 crc kubenswrapper[4723]: E0309 13:00:58.882412 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.184125 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.203512 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.220034 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.233824 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.250252 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.265681 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.278850 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.294418 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.307121 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.318146 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.330495 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e3a7da2-df7a-46e6-88cf-14c2b5adbf46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b5ea090dbcca9cb0683626e5137a3e9aea8749a1deb31339e45dc6db3336f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c342ce2d9033896e1d8f3c5428e8b03220c003d3c746103ac17b5cd564663091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0777edbfbe04e20d91d97fb2daa6bc7ef67cbf56cdd1fdb20849350a5f0d4f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.343317 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.353503 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cefc2989387f88b8b140c8aed05c20406db5392c88da114b489cbd7c177f98ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0517301d8c26db37c831773237c0772b14e01392c032f864200d27ec2207ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.362975 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.375642 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.392843 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af0019
55f19f597597eef3ee7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"/127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z]\\\\nI0309 13:00:45.329285 6759 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0309 13:00:45.329184 6759 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.404425 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:00:59 crc kubenswrapper[4723]: I0309 13:00:59.418760 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"re
ason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-09T13:00:59Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:00 crc kubenswrapper[4723]: I0309 13:01:00.880976 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:00 crc kubenswrapper[4723]: I0309 13:01:00.881119 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:00 crc kubenswrapper[4723]: I0309 13:01:00.881148 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:00 crc kubenswrapper[4723]: I0309 13:01:00.881221 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:00 crc kubenswrapper[4723]: E0309 13:01:00.881404 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:00 crc kubenswrapper[4723]: E0309 13:01:00.881679 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:00 crc kubenswrapper[4723]: E0309 13:01:00.882425 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:00 crc kubenswrapper[4723]: E0309 13:01:00.882636 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:00 crc kubenswrapper[4723]: I0309 13:01:00.882703 4723 scope.go:117] "RemoveContainer" containerID="e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.529650 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovnkube-controller/1.log" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.532325 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerStarted","Data":"ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f"} Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.532801 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.546808 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.561410 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.573380 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.585540 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.597343 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.610120 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.621730 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.635309 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e3a7da2-df7a-46e6-88cf-14c2b5adbf46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b5ea090dbcca9cb0683626e5137a3e9aea8749a1deb31339e45dc6db3336f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c342ce2d9033896e1d8f3c5428e8b03220c003d3c746103ac17b5cd564663091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0777edbfbe04e20d91d97fb2daa6bc7ef67cbf56cdd1fdb20849350a5f0d4f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.647102 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\
"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.657785 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.670666 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.692271 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.705081 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cefc2989387f88b8b140c8aed05c20406db5392c88da114b489cbd7c177f98ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0517301d8c26db37c831773237c0772b14e01392c032f864200d27ec2207ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.713760 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.726458 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/
bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.742691 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4
e982e9937611c9d1acd2d95f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"/127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z]\\\\nI0309 13:00:45.329285 6759 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0309 13:00:45.329184 6759 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"in
itContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: I0309 13:01:01.754785 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:01Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:01 crc kubenswrapper[4723]: E0309 13:01:01.977839 4723 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.537134 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovnkube-controller/2.log" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.538144 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovnkube-controller/1.log" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.541751 4723 generic.go:334] "Generic (PLEG): container finished" podID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerID="ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f" exitCode=1 Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.541811 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerDied","Data":"ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f"} Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.541892 4723 scope.go:117] "RemoveContainer" containerID="e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.542780 4723 scope.go:117] "RemoveContainer" containerID="ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f" Mar 09 13:01:02 crc kubenswrapper[4723]: E0309 13:01:02.543069 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.557735 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.579840 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.595123 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.601347 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs\") pod \"network-metrics-daemon-lztcd\" (UID: \"f09eae28-36d6-4c16-8aab-bbd93934f921\") " pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:02 crc kubenswrapper[4723]: E0309 13:01:02.601478 4723 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:01:02 crc kubenswrapper[4723]: E0309 13:01:02.601536 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs podName:f09eae28-36d6-4c16-8aab-bbd93934f921 nodeName:}" failed. No retries permitted until 2026-03-09 13:01:18.601519296 +0000 UTC m=+152.615986846 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs") pod "network-metrics-daemon-lztcd" (UID: "f09eae28-36d6-4c16-8aab-bbd93934f921") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.610228 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cefc2989387f88b8b140c8aed05c20406db5392c88da114b489cbd7c177f98ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0517301d8c26db37c831773237c0772b14e01392c032f864200d27ec2207ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.623075 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.645358 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.669001 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54bb87f6edf8806c89beb2fe9d8ae1c12af001955f19f597597eef3ee7ab5a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"message\\\":\\\"/127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:00:45Z is after 2025-08-24T17:21:41Z]\\\\nI0309 13:00:45.329285 6759 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0309 13:00:45.329184 6759 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:01:01Z\\\",\\\"message\\\":\\\"Route (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 13:01:01.778990 7056 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:01:01.779018 7056 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.779050 7056 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.779211 7056 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:01:01.779810 7056 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.785475 7056 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0309 13:01:01.785506 7056 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0309 13:01:01.785571 7056 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:01:01.785614 7056 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:01:01.785734 7056 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163
536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.679127 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.690853 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.701274 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.716028 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.729027 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.739972 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.752560 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.764875 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e3a7da2-df7a-46e6-88cf-14c2b5adbf46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b5ea090dbcca9cb0683626e5137a3e9aea8749a1deb31339e45dc6db3336f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c342ce2d9033896e1d8f3c5428e8b03220c003d3c746103ac17b5cd564663091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0777edbfbe04e20d91d97fb2daa6bc7ef67cbf56cdd1fdb20849350a5f0d4f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.777348 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\
"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.789493 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:02Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.880332 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.880360 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.880396 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:02 crc kubenswrapper[4723]: I0309 13:01:02.880375 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:02 crc kubenswrapper[4723]: E0309 13:01:02.880450 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:02 crc kubenswrapper[4723]: E0309 13:01:02.880550 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:02 crc kubenswrapper[4723]: E0309 13:01:02.880681 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:02 crc kubenswrapper[4723]: E0309 13:01:02.880769 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.547778 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovnkube-controller/2.log" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.551701 4723 scope.go:117] "RemoveContainer" containerID="ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f" Mar 09 13:01:03 crc kubenswrapper[4723]: E0309 13:01:03.552055 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.570380 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z"
Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.586426 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z"
Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.598255 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cefc2989387f88b8b140c8aed05c20406db5392c88da114b489cbd7c177f98ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0517301d8c26db37c831773237c0772b14e01392c032f864200d27ec2207ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.610744 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.630755 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/
bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.656745 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4
e982e9937611c9d1acd2d95f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:01:01Z\\\",\\\"message\\\":\\\"Route (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 13:01:01.778990 7056 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:01:01.779018 7056 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.779050 7056 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.779211 7056 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:01:01.779810 7056 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.785475 7056 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0309 13:01:01.785506 7056 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0309 13:01:01.785571 7056 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:01:01.785614 7056 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:01:01.785734 7056 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:01:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.670364 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.686919 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.704416 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.716136 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.729174 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.744005 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.755266 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.766616 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.779853 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e3a7da2-df7a-46e6-88cf-14c2b5adbf46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b5ea090dbcca9cb0683626e5137a3e9aea8749a1deb31339e45dc6db3336f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c342ce2d9033896e1d8f3c5428e8b03220c003d3c746103ac17b5cd564663091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0777edbfbe04e20d91d97fb2daa6bc7ef67cbf56cdd1fdb20849350a5f0d4f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.791852 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\
"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:03 crc kubenswrapper[4723]: I0309 13:01:03.801214 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:04 crc kubenswrapper[4723]: I0309 13:01:04.880324 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:04 crc kubenswrapper[4723]: I0309 13:01:04.880376 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:04 crc kubenswrapper[4723]: I0309 13:01:04.880356 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:04 crc kubenswrapper[4723]: I0309 13:01:04.880324 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:04 crc kubenswrapper[4723]: E0309 13:01:04.880462 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:04 crc kubenswrapper[4723]: E0309 13:01:04.880538 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:04 crc kubenswrapper[4723]: E0309 13:01:04.880628 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:04 crc kubenswrapper[4723]: E0309 13:01:04.880720 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.486517 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.486568 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.486583 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.486603 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.486613 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:05Z","lastTransitionTime":"2026-03-09T13:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:01:05 crc kubenswrapper[4723]: E0309 13:01:05.498521 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.501355 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.501404 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.501415 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.501431 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.501464 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:05Z","lastTransitionTime":"2026-03-09T13:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:05 crc kubenswrapper[4723]: E0309 13:01:05.517247 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.520823 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.520874 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
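Every patch attempt in this burst fails the same way: the kubelet reaches the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743, but TLS verification fails because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2026-03-09. A minimal check from a shell on the node (illustrative commands, not part of the captured journal; the address is taken from the Post URL in the errors above) dumps the certificate's validity window:

    # Print notBefore/notAfter for the certificate served on the webhook port.
    echo | openssl s_client -connect 127.0.0.1:9743 2>/dev/null | openssl x509 -noout -dates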
event="NodeHasNoDiskPressure" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.520883 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.520895 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.520904 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:05Z","lastTransitionTime":"2026-03-09T13:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:05 crc kubenswrapper[4723]: E0309 13:01:05.532805 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.536993 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.537058 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.537070 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.537086 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.537096 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:05Z","lastTransitionTime":"2026-03-09T13:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:05 crc kubenswrapper[4723]: E0309 13:01:05.549816 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.553638 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.554345 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.554383 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.554410 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:05 crc kubenswrapper[4723]: I0309 13:01:05.554424 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:05Z","lastTransitionTime":"2026-03-09T13:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:05 crc kubenswrapper[4723]: E0309 13:01:05.568125 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:05 crc kubenswrapper[4723]: E0309 13:01:05.568252 4723 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:01:06 crc kubenswrapper[4723]: I0309 13:01:06.880088 4723 util.go:30] "No sandbox for pod can be found. 
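The retry sequence above is bounded: upstream kubelet caps node-status patches at a fixed retry budget (nodeStatusUpdateRetry, 5 attempts per sync in the kubelet source), after which it logs "update node status exceeds retry count" and waits for the next status-update tick, so this whole block recurs for as long as the certificate stays expired. One way to gauge how long the node has been in this state (illustrative; assumes the kubelet logs to the kubelet.service unit, as on this CRC node):

    # Count failed node-status patch attempts since the current boot.
    journalctl -b -u kubelet | grep -c 'Error updating node status, will retry'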
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:06 crc kubenswrapper[4723]: I0309 13:01:06.880152 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:06 crc kubenswrapper[4723]: I0309 13:01:06.880098 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:06 crc kubenswrapper[4723]: E0309 13:01:06.880528 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:06 crc kubenswrapper[4723]: E0309 13:01:06.881259 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:06 crc kubenswrapper[4723]: I0309 13:01:06.881349 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:06 crc kubenswrapper[4723]: E0309 13:01:06.881414 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:06 crc kubenswrapper[4723]: E0309 13:01:06.881507 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:06 crc kubenswrapper[4723]: I0309 13:01:06.905924 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:06 crc kubenswrapper[4723]: I0309 13:01:06.926245 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:06 crc kubenswrapper[4723]: I0309 13:01:06.943816 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:06 crc kubenswrapper[4723]: I0309 13:01:06.961206 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:06 crc kubenswrapper[4723]: I0309 13:01:06.974416 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:06 crc kubenswrapper[4723]: E0309 13:01:06.978311 4723 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:01:06 crc kubenswrapper[4723]: I0309 13:01:06.991438 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:07 crc kubenswrapper[4723]: I0309 13:01:07.006001 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:07 crc kubenswrapper[4723]: I0309 13:01:07.022703 4723 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e3a7da2-df7a-46e6-88cf-14c2b5adbf46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b5ea090dbcca9cb0683626e5137a3e9aea8749a1deb31339e45dc6db3336f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c342ce2d9033896e1d8f3c5428e8b03220c003d3c746103ac17b5cd564663091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0777edbfbe04e20d91d97fb2daa6bc7ef67cbf56cdd1fdb20849350a5f0d4f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:07 crc kubenswrapper[4723]: I0309 13:01:07.040931 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\
\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:07 crc kubenswrapper[4723]: I0309 13:01:07.056343 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:07 crc kubenswrapper[4723]: I0309 13:01:07.071271 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:07 crc kubenswrapper[4723]: I0309 13:01:07.087143 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:07 crc kubenswrapper[4723]: I0309 13:01:07.103435 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:07 crc kubenswrapper[4723]: I0309 13:01:07.117918 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cefc2989387f88b8b140c8aed05c20406db5392c88da114b489cbd7c177f98ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0517301d8c26db37c831773237c0772b14e01392c032f864200d27ec2207ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:07 crc kubenswrapper[4723]: I0309 13:01:07.133221 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:07 crc kubenswrapper[4723]: I0309 13:01:07.157846 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"cont
ainerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:07 crc kubenswrapper[4723]: I0309 13:01:07.178228 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:01:01Z\\\",\\\"message\\\":\\\"Route (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 13:01:01.778990 7056 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:01:01.779018 7056 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.779050 7056 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.779211 7056 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:01:01.779810 7056 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.785475 7056 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0309 13:01:01.785506 7056 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0309 13:01:01.785571 7056 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:01:01.785614 7056 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:01:01.785734 7056 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:01:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:08 crc kubenswrapper[4723]: I0309 13:01:08.880508 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:08 crc kubenswrapper[4723]: I0309 13:01:08.880524 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:08 crc kubenswrapper[4723]: E0309 13:01:08.880637 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:08 crc kubenswrapper[4723]: I0309 13:01:08.880562 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:08 crc kubenswrapper[4723]: I0309 13:01:08.880542 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:08 crc kubenswrapper[4723]: E0309 13:01:08.881157 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:08 crc kubenswrapper[4723]: E0309 13:01:08.880904 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:08 crc kubenswrapper[4723]: E0309 13:01:08.880838 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:10 crc kubenswrapper[4723]: I0309 13:01:10.880574 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:10 crc kubenswrapper[4723]: I0309 13:01:10.880571 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:10 crc kubenswrapper[4723]: E0309 13:01:10.881541 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:10 crc kubenswrapper[4723]: I0309 13:01:10.880721 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:10 crc kubenswrapper[4723]: E0309 13:01:10.882005 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:10 crc kubenswrapper[4723]: I0309 13:01:10.880664 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:10 crc kubenswrapper[4723]: E0309 13:01:10.882311 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:10 crc kubenswrapper[4723]: E0309 13:01:10.882339 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:11 crc kubenswrapper[4723]: E0309 13:01:11.980060 4723 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:01:12 crc kubenswrapper[4723]: I0309 13:01:12.880946 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:12 crc kubenswrapper[4723]: I0309 13:01:12.881152 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:12 crc kubenswrapper[4723]: I0309 13:01:12.881510 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:12 crc kubenswrapper[4723]: I0309 13:01:12.881829 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:12 crc kubenswrapper[4723]: E0309 13:01:12.882065 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:12 crc kubenswrapper[4723]: E0309 13:01:12.881816 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:12 crc kubenswrapper[4723]: E0309 13:01:12.882406 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:12 crc kubenswrapper[4723]: E0309 13:01:12.882556 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:14 crc kubenswrapper[4723]: I0309 13:01:14.880225 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:14 crc kubenswrapper[4723]: I0309 13:01:14.880264 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:14 crc kubenswrapper[4723]: I0309 13:01:14.880519 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:14 crc kubenswrapper[4723]: I0309 13:01:14.880530 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:14 crc kubenswrapper[4723]: E0309 13:01:14.880441 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:14 crc kubenswrapper[4723]: E0309 13:01:14.880712 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:14 crc kubenswrapper[4723]: E0309 13:01:14.880848 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:14 crc kubenswrapper[4723]: E0309 13:01:14.881186 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.890743 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.890801 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.890817 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.890841 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.890897 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:15Z","lastTransitionTime":"2026-03-09T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:15 crc kubenswrapper[4723]: E0309 13:01:15.913140 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.919321 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.919361 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.919374 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.919392 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.919409 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:15Z","lastTransitionTime":"2026-03-09T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:15 crc kubenswrapper[4723]: E0309 13:01:15.939635 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.944418 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.944479 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.944493 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.944519 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.944538 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:15Z","lastTransitionTime":"2026-03-09T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:15 crc kubenswrapper[4723]: E0309 13:01:15.961847 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.966040 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.966071 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.966083 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.966100 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.966109 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:15Z","lastTransitionTime":"2026-03-09T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:15 crc kubenswrapper[4723]: E0309 13:01:15.981881 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.986765 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.986836 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.986889 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.986919 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:15 crc kubenswrapper[4723]: I0309 13:01:15.986942 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:15Z","lastTransitionTime":"2026-03-09T13:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:16 crc kubenswrapper[4723]: E0309 13:01:16.006911 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:16Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:16 crc kubenswrapper[4723]: E0309 13:01:16.007166 4723 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:01:16 crc kubenswrapper[4723]: I0309 13:01:16.880756 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:16 crc kubenswrapper[4723]: I0309 13:01:16.881347 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:16 crc kubenswrapper[4723]: I0309 13:01:16.881405 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:16 crc kubenswrapper[4723]: E0309 13:01:16.881399 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:16 crc kubenswrapper[4723]: I0309 13:01:16.881237 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:16 crc kubenswrapper[4723]: E0309 13:01:16.881573 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:16 crc kubenswrapper[4723]: E0309 13:01:16.881753 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:16 crc kubenswrapper[4723]: E0309 13:01:16.881923 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:16 crc kubenswrapper[4723]: I0309 13:01:16.899880 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:16Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:16 crc kubenswrapper[4723]: I0309 13:01:16.912682 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:16Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:16 crc kubenswrapper[4723]: I0309 13:01:16.930999 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:16Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:16 crc kubenswrapper[4723]: I0309 13:01:16.944667 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:16Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:16 crc kubenswrapper[4723]: I0309 13:01:16.955179 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:16Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:16 crc kubenswrapper[4723]: I0309 13:01:16.967170 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:16Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:16 crc kubenswrapper[4723]: I0309 13:01:16.979295 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:16Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:16 crc kubenswrapper[4723]: E0309 13:01:16.980956 4723 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:01:16 crc kubenswrapper[4723]: I0309 13:01:16.995608 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e3a7da2-df7a-46e6-88cf-14c2b5adbf46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b5ea090dbcca9cb0683626e5137a3e9aea8749a1deb31339e45dc6db3336f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c342ce2d9033896e1d8f3c5428e8b03220c003d3c746103ac17b5cd564663091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0777edbfbe04e20d91d97fb2daa6bc7ef67cbf56cdd1fdb20849350a5f0d4f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"1
92.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:16Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:17 crc kubenswrapper[4723]: I0309 13:01:17.010623 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\
\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:17Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:17 crc kubenswrapper[4723]: I0309 13:01:17.023564 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:17Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:17 crc kubenswrapper[4723]: I0309 13:01:17.037562 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:17Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:17 crc kubenswrapper[4723]: I0309 13:01:17.051713 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:17Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:17 crc kubenswrapper[4723]: I0309 13:01:17.064078 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cefc2989387f88b8b140c8aed05c20406db5392c88da114b489cbd7c177f98ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0517301d8c26db37c831773237c0772b14e01392c032f864200d27ec2207ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:17Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:17 crc kubenswrapper[4723]: I0309 13:01:17.074490 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:17Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:17 crc kubenswrapper[4723]: I0309 13:01:17.091235 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/
bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:17Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:17 crc kubenswrapper[4723]: I0309 13:01:17.120263 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4
e982e9937611c9d1acd2d95f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:01:01Z\\\",\\\"message\\\":\\\"Route (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 13:01:01.778990 7056 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:01:01.779018 7056 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.779050 7056 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.779211 7056 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:01:01.779810 7056 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.785475 7056 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0309 13:01:01.785506 7056 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0309 13:01:01.785571 7056 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:01:01.785614 7056 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:01:01.785734 7056 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:01:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:17Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:17 crc kubenswrapper[4723]: I0309 13:01:17.139021 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:17Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:17 crc kubenswrapper[4723]: I0309 13:01:17.881320 4723 scope.go:117] "RemoveContainer" containerID="ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f" Mar 09 13:01:17 crc kubenswrapper[4723]: E0309 13:01:17.881652 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" Mar 09 13:01:18 crc kubenswrapper[4723]: I0309 13:01:18.675376 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs\") pod \"network-metrics-daemon-lztcd\" (UID: \"f09eae28-36d6-4c16-8aab-bbd93934f921\") " pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:18 crc kubenswrapper[4723]: E0309 13:01:18.675594 4723 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:01:18 crc kubenswrapper[4723]: E0309 13:01:18.675699 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs podName:f09eae28-36d6-4c16-8aab-bbd93934f921 nodeName:}" failed. No retries permitted until 2026-03-09 13:01:50.675674036 +0000 UTC m=+184.690141586 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs") pod "network-metrics-daemon-lztcd" (UID: "f09eae28-36d6-4c16-8aab-bbd93934f921") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:01:18 crc kubenswrapper[4723]: I0309 13:01:18.879934 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:18 crc kubenswrapper[4723]: I0309 13:01:18.879972 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:18 crc kubenswrapper[4723]: I0309 13:01:18.880070 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:18 crc kubenswrapper[4723]: I0309 13:01:18.880189 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:18 crc kubenswrapper[4723]: E0309 13:01:18.880180 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:18 crc kubenswrapper[4723]: E0309 13:01:18.880338 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:18 crc kubenswrapper[4723]: E0309 13:01:18.880457 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:18 crc kubenswrapper[4723]: E0309 13:01:18.880554 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:20 crc kubenswrapper[4723]: I0309 13:01:20.699207 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:20 crc kubenswrapper[4723]: I0309 13:01:20.699349 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.699516 4723 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.699516 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:02:24.699462856 +0000 UTC m=+218.713930446 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.699842 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:02:24.699793084 +0000 UTC m=+218.714260834 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:01:20 crc kubenswrapper[4723]: I0309 13:01:20.800325 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:20 crc kubenswrapper[4723]: I0309 13:01:20.800390 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:20 crc kubenswrapper[4723]: I0309 13:01:20.800438 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.800517 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.800536 4723 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.800557 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.800576 4723 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.800599 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:02:24.800578778 +0000 UTC m=+218.815046318 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.800631 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:02:24.800612659 +0000 UTC m=+218.815080209 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.800793 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.800849 4723 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.800902 4723 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.801035 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:02:24.800997008 +0000 UTC m=+218.815464588 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:01:20 crc kubenswrapper[4723]: I0309 13:01:20.880746 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:20 crc kubenswrapper[4723]: I0309 13:01:20.880790 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:20 crc kubenswrapper[4723]: I0309 13:01:20.880753 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:20 crc kubenswrapper[4723]: I0309 13:01:20.880892 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.880916 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.881037 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.881147 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:20 crc kubenswrapper[4723]: E0309 13:01:20.881421 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.614052 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g92rf_242d3bf9-4462-4562-813a-f3548edc94fd/kube-multus/0.log" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.614211 4723 generic.go:334] "Generic (PLEG): container finished" podID="242d3bf9-4462-4562-813a-f3548edc94fd" containerID="ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5" exitCode=1 Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.614296 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g92rf" event={"ID":"242d3bf9-4462-4562-813a-f3548edc94fd","Type":"ContainerDied","Data":"ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5"} Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.615315 4723 scope.go:117] "RemoveContainer" containerID="ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.634837 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.652917 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.670496 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.685023 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cefc2989387f88b8b140c8aed05c20406db5392c88da114b489cbd7c177f98ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0517301d8c26db37c831773237c0772b14e01392c032f864200d27ec2207ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.703592 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.725121 4723 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.748049 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:01:01Z\\\",\\\"message\\\":\\\"Route (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 13:01:01.778990 7056 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:01:01.779018 7056 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.779050 7056 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.779211 7056 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:01:01.779810 7056 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.785475 7056 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0309 13:01:01.785506 7056 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0309 13:01:01.785571 7056 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:01:01.785614 7056 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:01:01.785734 7056 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:01:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.766041 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.779836 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.794454 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.818333 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is 
after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.839574 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.860254 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.874426 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.886816 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e3a7da2-df7a-46e6-88cf-14c2b5adbf46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b5ea090dbcca9cb0683626e5137a3e9aea8749a1deb31339e45dc6db3336f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c342ce2d9033896e1d8f3c5428e8b03220c003d3c746103ac17b5cd564663091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0777edbfbe04e20d91d97fb2daa6bc7ef67cbf56cdd1fdb20849350a5f0d4f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.901685 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:01:20Z\\\",\\\"message\\\":\\\"2026-03-09T13:00:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d020c429-96e0-45bc-9331-b166f3b71432\\\\n2026-03-09T13:00:35+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d020c429-96e0-45bc-9331-b166f3b71432 to /host/opt/cni/bin/\\\\n2026-03-09T13:00:35Z [verbose] multus-daemon started\\\\n2026-03-09T13:00:35Z [verbose] Readiness Indicator file check\\\\n2026-03-09T13:01:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: I0309 13:01:21.916799 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:21Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:21 crc kubenswrapper[4723]: E0309 13:01:21.982035 4723 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.620609 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g92rf_242d3bf9-4462-4562-813a-f3548edc94fd/kube-multus/0.log" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.620689 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g92rf" event={"ID":"242d3bf9-4462-4562-813a-f3548edc94fd","Type":"ContainerStarted","Data":"9e3f00295ab5c8b08630d59915b6f04285bc0f618ea72db8e5954cd6b4a21bee"} Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.640017 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.665754 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.684320 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.714207 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.734938 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.754333 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.772810 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.793608 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e3a7da2-df7a-46e6-88cf-14c2b5adbf46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b5ea090dbcca9cb0683626e5137a3e9aea8749a1deb31339e45dc6db3336f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c342ce2d9033896e1d8f3c5428e8b03220c003d3c746103ac17b5cd564663091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0777edbfbe04e20d91d97fb2daa6bc7ef67cbf56cdd1fdb20849350a5f0d4f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.816636 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e3f00295ab5c8b08630d59915b6f04285bc0f618ea72db8e5954cd6b4a21bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:01:20Z\\\",\\\"message\\\":\\\"2026-03-09T13:00:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d020c429-96e0-45bc-9331-b166f3b71432\\\\n2026-03-09T13:00:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d020c429-96e0-45bc-9331-b166f3b71432 to /host/opt/cni/bin/\\\\n2026-03-09T13:00:35Z [verbose] multus-daemon started\\\\n2026-03-09T13:00:35Z [verbose] Readiness Indicator file check\\\\n2026-03-09T13:01:20Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.834612 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.851165 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.871993 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.880128 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.880178 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.880242 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:22 crc kubenswrapper[4723]: E0309 13:01:22.880436 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.880556 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:22 crc kubenswrapper[4723]: E0309 13:01:22.880628 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:22 crc kubenswrapper[4723]: E0309 13:01:22.880842 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:22 crc kubenswrapper[4723]: E0309 13:01:22.881099 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.890626 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cefc2989387f88b8b140c8aed05c20406db5392c88da114b489cbd7c177f98ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0517301d8c26db37c831773237c0772b14e01392c032f864200d27ec2207ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.906960 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.956820 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\
\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:22 crc kubenswrapper[4723]: I0309 13:01:22.992471 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994
82919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:01:01Z\\\",\\\"message\\\":\\\"Route (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 13:01:01.778990 7056 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:01:01.779018 7056 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.779050 7056 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.779211 7056 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:01:01.779810 7056 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.785475 7056 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0309 13:01:01.785506 7056 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0309 13:01:01.785571 7056 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:01:01.785614 7056 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:01:01.785734 7056 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:01:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting 
failed container=ovnkube-controller pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:22Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:23 crc kubenswrapper[4723]: I0309 13:01:23.008476 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:23Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:24 crc kubenswrapper[4723]: I0309 13:01:24.879845 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:24 crc kubenswrapper[4723]: I0309 13:01:24.880080 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:24 crc kubenswrapper[4723]: E0309 13:01:24.880427 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:24 crc kubenswrapper[4723]: I0309 13:01:24.880143 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:24 crc kubenswrapper[4723]: I0309 13:01:24.880080 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:24 crc kubenswrapper[4723]: E0309 13:01:24.880604 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:24 crc kubenswrapper[4723]: E0309 13:01:24.880715 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:24 crc kubenswrapper[4723]: E0309 13:01:24.880924 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.403440 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.403501 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.403518 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.403543 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.403564 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:26Z","lastTransitionTime":"2026-03-09T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:26 crc kubenswrapper[4723]: E0309 13:01:26.422039 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.426850 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.426918 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.426933 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.426951 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.426963 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:26Z","lastTransitionTime":"2026-03-09T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:26 crc kubenswrapper[4723]: E0309 13:01:26.441422 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.446248 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.446303 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.446315 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.446333 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.446345 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:26Z","lastTransitionTime":"2026-03-09T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:26 crc kubenswrapper[4723]: E0309 13:01:26.472767 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.477234 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.477301 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.477325 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.477350 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.477366 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:26Z","lastTransitionTime":"2026-03-09T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:26 crc kubenswrapper[4723]: E0309 13:01:26.496371 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...duplicate image list elided (byte-identical to the previous patch attempt above)... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.501051 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.501096 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.501113 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.501126 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.501135 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:26Z","lastTransitionTime":"2026-03-09T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:26 crc kubenswrapper[4723]: E0309 13:01:26.514811 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...duplicate image list elided (byte-identical to the previous patch attempts above)... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7a1a0b61-b39c-449b-8223-316b94b8c26c\\\",\\\"systemUUID\\\":\\\"548816a4-d164-4614-b3ad-c3e4f48e94b9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:26 crc kubenswrapper[4723]: E0309 13:01:26.514991 4723 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.881036 4723 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.881132 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.881131 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:26 crc kubenswrapper[4723]: E0309 13:01:26.881223 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:26 crc kubenswrapper[4723]: E0309 13:01:26.881660 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.881852 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:26 crc kubenswrapper[4723]: E0309 13:01:26.882039 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:26 crc kubenswrapper[4723]: E0309 13:01:26.882222 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.898422 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.903615 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de264c73b21eebce2a476c8f7d57aa48db637aa83bb6f49be76940a38163b487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.926441 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jb44m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e3fdcc4-63d9-4867-b5d8-5f0a5be569a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0d50f12886941ea85f21371c6996c216a385703ebb13cd05080fdad052e271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f858f5685a6a4f79370931151578605b82127bdece328feb87a977984849dfea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5e46a2138b63861e055412a466acbdaf4b70c5e237e403f8adc9947f578db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44f047e8a848be175e312387a8ee219063a3e98acc55516d373974282d0782a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08cf70884349c1cb742fe3eac5b719e7aa4bd5a9d3b4410131a7cdc7429395a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e64811b7194c3ef991a19f11540e675e50e9408389a63b055b7d542dc404291b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92c3d630a65dd849154e18837ad8632ee7e93a27b94c6ebb0f1d61972301c523\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jb44m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.959102 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edb23619-78b6-4d63-aacf-98d7ce86bc5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:01:01Z\\\",\\\"message\\\":\\\"Route (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 13:01:01.778990 7056 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:01:01.779018 7056 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.779050 7056 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.779211 7056 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:01:01.779810 7056 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:01:01.785475 7056 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0309 13:01:01.785506 7056 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0309 13:01:01.785571 7056 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:01:01.785614 7056 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:01:01.785734 7056 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:01:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zngwx_openshift-ovn-kubernetes(edb23619-78b6-4d63-aacf-98d7ce86bc5b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjtfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zngwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.976712 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T12:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 12:59:55.482549 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 12:59:55.482783 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 12:59:55.483649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2499336393/tls.crt::/tmp/serving-cert-2499336393/tls.key\\\\\\\"\\\\nI0309 12:59:56.049731 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 12:59:56.052155 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 12:59:56.052201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 12:59:56.052220 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 12:59:56.052226 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 12:59:56.066627 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 12:59:56.066654 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066658 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 12:59:56.066662 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 12:59:56.066665 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 12:59:56.066668 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 12:59:56.066671 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 12:59:56.067066 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 12:59:56.069624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T12:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:26 crc kubenswrapper[4723]: E0309 13:01:26.982467 4723 kubelet.go:2916] "Container runtime network not 
ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:01:26 crc kubenswrapper[4723]: I0309 13:01:26.997475 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8bd72449608bbfb68165af6302aae01ce97dd9cf13392ccd6366dc7fbb7e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:27 crc kubenswrapper[4723]: I0309 13:01:27.012160 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ca3fb3acb84f38e208051240b02c48f3a17be1ed5b3e309c769ae601bac22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06f3f6a6fc3af6d7c96d5860969110d357f79b45ffba8797c8939c39c299a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:27 crc kubenswrapper[4723]: I0309 13:01:27.026093 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:27 crc kubenswrapper[4723]: I0309 13:01:27.037368 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l2l9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91019298-2c2b-48a9-8813-cd58d2681f71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc4cb656087f1906e4337675f18465d9bb6b9e324c185a76aeac922438e5b5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l2l9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:27 crc kubenswrapper[4723]: I0309 13:01:27.048782 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"983d5ed4-cfc7-402a-b226-29dc071c6e4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://090e849795de6303acc7ba93444e1096a0cbb499371f86f3ce97d74a639f8ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sll9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cfjq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:27 crc kubenswrapper[4723]: I0309 13:01:27.059556 4723 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-lztcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f09eae28-36d6-4c16-8aab-bbd93934f921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gshcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lztcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:27 crc kubenswrapper[4723]: I0309 13:01:27.071005 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e3a7da2-df7a-46e6-88cf-14c2b5adbf46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b5ea090dbcca9cb0683626e5137a3e9aea8749a1deb31339e45dc6db3336f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c342ce2d9033896e1d8f3c5428e8b03220c003d3c746103ac17b5cd564663091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0777edbfbe04e20d91d97fb2daa6bc7ef67cbf56cdd1fdb20849350a5f0d4f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://228079a220bbfc7f4d8dc9529398b36cf51b4093c84900cb89f45128c9049f5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:27 crc kubenswrapper[4723]: I0309 13:01:27.082936 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g92rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"242d3bf9-4462-4562-813a-f3548edc94fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e3f00295ab5c8b08630d59915b6f04285bc0f618ea72db8e5954cd6b4a21bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:01:20Z\\\",\\\"message\\\":\\\"2026-03-09T13:00:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d020c429-96e0-45bc-9331-b166f3b71432\\\\n2026-03-09T13:00:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d020c429-96e0-45bc-9331-b166f3b71432 to /host/opt/cni/bin/\\\\n2026-03-09T13:00:35Z [verbose] multus-daemon started\\\\n2026-03-09T13:00:35Z [verbose] Readiness Indicator file check\\\\n2026-03-09T13:01:20Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:00:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c7r96\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g92rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:27 crc kubenswrapper[4723]: I0309 13:01:27.091766 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7dh57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b334debc-b7ae-4bc1-8e6d-44e4bc86bb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://804f60c896756f05ea5aa24c1a6083c987a657c0753a2a49e401e67f74bee212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nstjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7dh57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:27 crc kubenswrapper[4723]: I0309 13:01:27.102037 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c8c6bc-647d-452a-9bee-7b7ccafd2816\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T12:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ace64875a6e7a6735b141be933b0fe4fb330351eb8b4c9a943e32312d019849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T12:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f54175498340c3fc358df8c6f1eefee06af9b495a4a3fba98c3856b2382ca165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T12:58:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T12:58:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T12:58:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:27 crc kubenswrapper[4723]: I0309 13:01:27.114261 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:27 crc kubenswrapper[4723]: I0309 13:01:27.127524 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:27 crc kubenswrapper[4723]: I0309 13:01:27.138777 4723 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff98527-c16a-48b6-94a8-577172ec6dce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:00:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cefc2989387f88b8b140c8aed05c20406db5392c88da114b489cbd7c177f98ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0517301d8c26db37c831773237c0772b14e01392c032f864200d27ec2207ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gkv4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:00:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mbhtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:01:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:01:28 crc kubenswrapper[4723]: I0309 13:01:28.880662 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:28 crc kubenswrapper[4723]: I0309 13:01:28.880662 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:28 crc kubenswrapper[4723]: I0309 13:01:28.880695 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:28 crc kubenswrapper[4723]: I0309 13:01:28.880748 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:28 crc kubenswrapper[4723]: E0309 13:01:28.881007 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:28 crc kubenswrapper[4723]: E0309 13:01:28.881105 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:28 crc kubenswrapper[4723]: E0309 13:01:28.881216 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:28 crc kubenswrapper[4723]: E0309 13:01:28.881384 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:30 crc kubenswrapper[4723]: I0309 13:01:30.880922 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:30 crc kubenswrapper[4723]: I0309 13:01:30.880960 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:30 crc kubenswrapper[4723]: I0309 13:01:30.881061 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:30 crc kubenswrapper[4723]: E0309 13:01:30.881182 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:30 crc kubenswrapper[4723]: I0309 13:01:30.881202 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:30 crc kubenswrapper[4723]: E0309 13:01:30.881445 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:30 crc kubenswrapper[4723]: E0309 13:01:30.881568 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:30 crc kubenswrapper[4723]: E0309 13:01:30.881686 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:31 crc kubenswrapper[4723]: I0309 13:01:31.881342 4723 scope.go:117] "RemoveContainer" containerID="ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f" Mar 09 13:01:31 crc kubenswrapper[4723]: E0309 13:01:31.983713 4723 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.663635 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovnkube-controller/2.log" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.669211 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerStarted","Data":"9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e"} Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.669810 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.708476 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-l2l9x" podStartSLOduration=96.708447543 podStartE2EDuration="1m36.708447543s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:32.694727573 +0000 UTC m=+166.709195133" watchObservedRunningTime="2026-03-09 13:01:32.708447543 +0000 UTC m=+166.722915103" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.722586 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podStartSLOduration=96.722565514 podStartE2EDuration="1m36.722565514s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:32.708841243 +0000 UTC m=+166.723308793" watchObservedRunningTime="2026-03-09 13:01:32.722565514 +0000 UTC m=+166.737033064" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.742661 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=74.742645836 podStartE2EDuration="1m14.742645836s" podCreationTimestamp="2026-03-09 13:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:32.742459722 +0000 UTC m=+166.756927282" watchObservedRunningTime="2026-03-09 13:01:32.742645836 +0000 UTC 
m=+166.757113386" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.795404 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=6.795384993 podStartE2EDuration="6.795384993s" podCreationTimestamp="2026-03-09 13:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:32.793815073 +0000 UTC m=+166.808282643" watchObservedRunningTime="2026-03-09 13:01:32.795384993 +0000 UTC m=+166.809852553" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.814279 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lztcd"] Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.814453 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:32 crc kubenswrapper[4723]: E0309 13:01:32.814566 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.818639 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=37.818623486 podStartE2EDuration="37.818623486s" podCreationTimestamp="2026-03-09 13:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:32.817957059 +0000 UTC m=+166.832424599" watchObservedRunningTime="2026-03-09 13:01:32.818623486 +0000 UTC m=+166.833091026" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.833221 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-g92rf" podStartSLOduration=96.833203728 podStartE2EDuration="1m36.833203728s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:32.833128206 +0000 UTC m=+166.847595756" watchObservedRunningTime="2026-03-09 13:01:32.833203728 +0000 UTC m=+166.847671258" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.858350 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7dh57" podStartSLOduration=96.85832874 podStartE2EDuration="1m36.85832874s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:32.847334309 +0000 UTC m=+166.861801869" watchObservedRunningTime="2026-03-09 13:01:32.85832874 +0000 UTC m=+166.872796280" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.871972 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=73.871952118 podStartE2EDuration="1m13.871952118s" podCreationTimestamp="2026-03-09 13:00:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:32.858902605 +0000 UTC m=+166.873370155" watchObservedRunningTime="2026-03-09 13:01:32.871952118 +0000 UTC m=+166.886419658" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.881999 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.882092 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:32 crc kubenswrapper[4723]: E0309 13:01:32.882986 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.882318 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:32 crc kubenswrapper[4723]: E0309 13:01:32.883107 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:32 crc kubenswrapper[4723]: E0309 13:01:32.883215 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.897127 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mbhtt" podStartSLOduration=96.8971112 podStartE2EDuration="1m36.8971112s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:32.896418412 +0000 UTC m=+166.910885972" watchObservedRunningTime="2026-03-09 13:01:32.8971112 +0000 UTC m=+166.911578740" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.959239 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" podStartSLOduration=96.959220376 podStartE2EDuration="1m36.959220376s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:32.959031751 +0000 UTC m=+166.973499291" watchObservedRunningTime="2026-03-09 13:01:32.959220376 +0000 UTC m=+166.973687916" Mar 09 13:01:32 crc kubenswrapper[4723]: I0309 13:01:32.959619 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jb44m" podStartSLOduration=96.959611816 podStartE2EDuration="1m36.959611816s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:32.935612463 +0000 UTC m=+166.950079993" watchObservedRunningTime="2026-03-09 13:01:32.959611816 +0000 UTC m=+166.974079356" Mar 09 13:01:33 crc kubenswrapper[4723]: I0309 13:01:33.903374 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 09 13:01:34 crc kubenswrapper[4723]: I0309 13:01:34.879952 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:34 crc kubenswrapper[4723]: I0309 13:01:34.880030 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:34 crc kubenswrapper[4723]: I0309 13:01:34.879986 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:34 crc kubenswrapper[4723]: E0309 13:01:34.880424 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:34 crc kubenswrapper[4723]: E0309 13:01:34.880199 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:34 crc kubenswrapper[4723]: I0309 13:01:34.880555 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:34 crc kubenswrapper[4723]: E0309 13:01:34.880681 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:34 crc kubenswrapper[4723]: E0309 13:01:34.880799 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.673817 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.673923 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.673945 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.673971 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.673991 4723 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:01:36Z","lastTransitionTime":"2026-03-09T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.742166 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp"] Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.742826 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.746698 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.751421 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.751480 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.752908 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.796362 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.796333523 podStartE2EDuration="3.796333523s" podCreationTimestamp="2026-03-09 13:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:36.794087216 +0000 UTC m=+170.808554766" watchObservedRunningTime="2026-03-09 13:01:36.796333523 +0000 UTC m=+170.810801093" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.880727 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.880777 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.883217 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.883240 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:36 crc kubenswrapper[4723]: E0309 13:01:36.883385 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:01:36 crc kubenswrapper[4723]: E0309 13:01:36.883479 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lztcd" podUID="f09eae28-36d6-4c16-8aab-bbd93934f921" Mar 09 13:01:36 crc kubenswrapper[4723]: E0309 13:01:36.883666 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:01:36 crc kubenswrapper[4723]: E0309 13:01:36.883845 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.884606 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.884685 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.884732 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.884765 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.884799 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.956893 4723 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.965421 4723 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.985760 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.986045 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.986157 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.986233 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.986264 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.986654 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.986695 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:36 crc kubenswrapper[4723]: I0309 13:01:36.988020 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:37 crc kubenswrapper[4723]: I0309 13:01:37.005606 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:37 crc kubenswrapper[4723]: I0309 13:01:37.019593 
4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dd92bf7-976c-41f0-b06a-a3a5ec275e3a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rdcfp\" (UID: \"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:37 crc kubenswrapper[4723]: I0309 13:01:37.070570 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" Mar 09 13:01:37 crc kubenswrapper[4723]: I0309 13:01:37.689266 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" event={"ID":"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a","Type":"ContainerStarted","Data":"7d696865ad0b4faab3b7a3bab02836975715c0aa7027fd9bdd664bfe2f454ed0"} Mar 09 13:01:37 crc kubenswrapper[4723]: I0309 13:01:37.689322 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" event={"ID":"4dd92bf7-976c-41f0-b06a-a3a5ec275e3a","Type":"ContainerStarted","Data":"c1d26dc79e2b1daf127be62adfe9185468ab81893a1345f2a97aa83e82500fb0"} Mar 09 13:01:37 crc kubenswrapper[4723]: I0309 13:01:37.702593 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rdcfp" podStartSLOduration=101.702577461 podStartE2EDuration="1m41.702577461s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:37.702367856 +0000 UTC m=+171.716835416" watchObservedRunningTime="2026-03-09 13:01:37.702577461 +0000 UTC m=+171.717045001" Mar 09 13:01:38 crc kubenswrapper[4723]: I0309 13:01:38.880961 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:01:38 crc kubenswrapper[4723]: I0309 13:01:38.881113 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:01:38 crc kubenswrapper[4723]: I0309 13:01:38.881137 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:01:38 crc kubenswrapper[4723]: I0309 13:01:38.881224 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:38 crc kubenswrapper[4723]: I0309 13:01:38.885440 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 09 13:01:38 crc kubenswrapper[4723]: I0309 13:01:38.886362 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 09 13:01:38 crc kubenswrapper[4723]: I0309 13:01:38.887571 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 09 13:01:38 crc kubenswrapper[4723]: I0309 13:01:38.888541 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 09 13:01:38 crc kubenswrapper[4723]: I0309 13:01:38.889312 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 09 13:01:38 crc kubenswrapper[4723]: I0309 13:01:38.889755 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.772361 4723 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.831601 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lwpcl"] Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.832249 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.834303 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vb8n2"] Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.835172 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.839332 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-6gtjl"] Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.839909 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.840789 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dh6qm"] Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.841728 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.895811 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.896491 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.896623 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.896775 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.897075 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.897216 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.897270 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.897294 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.898302 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.898490 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.900965 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.901083 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.901123 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.901557 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.901616 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.901700 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.901797 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.901828 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.901922 4723 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.902024 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.902072 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.902124 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.902224 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.902298 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.902334 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.912110 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.912353 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.913979 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.914349 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.927141 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.927583 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.927726 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.928709 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.942318 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.943902 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.945259 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.954231 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.989924 4723 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl"] Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.990465 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4"] Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.990696 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.990734 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr"] Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.991649 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8w6mb"] Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.991780 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.992000 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c5h5p"] Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.992370 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt"] Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.992550 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.992708 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.992748 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t6z8c"] Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.992911 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.992959 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c5h5p" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.993042 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996565 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bc33ee8-964b-4b03-b564-5c66068629b9-serving-cert\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996601 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996622 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61312a96-b8f6-431c-b24e-0046271cf40f-service-ca-bundle\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: \"61312a96-b8f6-431c-b24e-0046271cf40f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996639 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrc6q\" (UniqueName: \"kubernetes.io/projected/9ae03d73-b21d-4004-a000-e49a547ef19d-kube-api-access-wrc6q\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996655 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61312a96-b8f6-431c-b24e-0046271cf40f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: \"61312a96-b8f6-431c-b24e-0046271cf40f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996669 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9ae03d73-b21d-4004-a000-e49a547ef19d-audit-dir\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996685 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996700 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-config\") pod 
\"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996715 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996731 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996747 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996763 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nffd8\" (UniqueName: \"kubernetes.io/projected/9bc33ee8-964b-4b03-b564-5c66068629b9-kube-api-access-nffd8\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996781 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61312a96-b8f6-431c-b24e-0046271cf40f-config\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: \"61312a96-b8f6-431c-b24e-0046271cf40f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996805 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996820 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996839 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996879 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-serving-cert\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996897 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-oauth-config\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996913 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996934 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-client-ca\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996948 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996963 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5lpg\" (UniqueName: \"kubernetes.io/projected/6775c6a2-49ba-48fb-9f8f-ff26a7155618-kube-api-access-m5lpg\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.996979 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ffvs\" (UniqueName: \"kubernetes.io/projected/61312a96-b8f6-431c-b24e-0046271cf40f-kube-api-access-6ffvs\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: \"61312a96-b8f6-431c-b24e-0046271cf40f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.997002 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/61312a96-b8f6-431c-b24e-0046271cf40f-serving-cert\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: \"61312a96-b8f6-431c-b24e-0046271cf40f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.997017 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.997031 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-trusted-ca-bundle\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.997045 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.997060 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-config\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.997077 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-oauth-serving-cert\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.997099 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-service-ca\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:46 crc kubenswrapper[4723]: I0309 13:01:46.997116 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-audit-policies\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.004629 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.005123 4723 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wkpzg"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.005365 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-b5c74"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.005626 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xzd59"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.006049 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8pxb"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.006258 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.006529 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.006761 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-r5ltl"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.007032 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-b5c74" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.007321 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.007363 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.007454 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r5ltl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.007510 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.007573 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.007621 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.007884 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.007927 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.008143 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.008154 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.008893 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.018578 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s6gh6"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.018887 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lwpcl"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.018907 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.019134 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.019648 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.021016 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.023404 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.023888 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.023958 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kdxpm"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.031277 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vb8n2"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.031456 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-fzrk5"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.034311 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.036352 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.036815 4723 util.go:30] "No sandbox for pod can be found. 
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.038821 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.039376 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.040428 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kdxpm"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.044662 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.044920 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.045183 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.045483 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.045502 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.045541 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.045914 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.046263 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.046882 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.057027 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.057194 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.057228 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.057326 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.058690 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.059058 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.059183 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.059256 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.059693 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.060211 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n5lnt"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.060613 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.062304 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.062757 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4xwcm"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.063814 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.063935 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.064146 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.064188 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-n5lnt" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.064207 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.064432 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.064479 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.064748 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.064834 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.064903 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.064839 4723 util.go:30] "No sandbox for pod can be found. 
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.064972 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.065032 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.065070 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4xwcm"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.065072 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.065330 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.065381 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.065669 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.065675 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.065707 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.065815 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.065939 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.066062 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.066174 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.066287 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.066360 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.066543 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.066566 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.066618 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.066733 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.067447 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.066948 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-6gtjl"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.067495 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dh6qm"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.067514 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.067528 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t6z8c"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.067540 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-b5c74"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.066743 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.066773 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.067191 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.067207 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.067821 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.067233 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.067323 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.068023 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.067356 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.067640 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.068329 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.068364 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.068853 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.068945 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.069108 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.069219 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.069237 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.069691 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.069691 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.070806 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.073114 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c5h5p"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.073159 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8w6mb"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.071108 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.073503 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.073741 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.073808 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.074414 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.074540 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.075695 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.075751 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.075823 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.076186 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.077218 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.083803 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.088152 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.088948 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.090662 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.092462 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-gm7hg"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.093024 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-gm7hg" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.093204 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.093303 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.094036 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.095007 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.095799 4723 util.go:30] "No sandbox for pod can be found. 
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097022 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097176 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wkpzg"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097725 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/337d5692-12d3-4c0a-8187-eb66a2666e95-srv-cert\") pod \"catalog-operator-68c6474976-t25rf\" (UID: \"337d5692-12d3-4c0a-8187-eb66a2666e95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097751 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfclc\" (UniqueName: \"kubernetes.io/projected/9d680903-3aac-4f3d-8e55-5fb9ee1cb46a-kube-api-access-kfclc\") pod \"migrator-59844c95c7-r5ltl\" (UID: \"9d680903-3aac-4f3d-8e55-5fb9ee1cb46a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r5ltl"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097770 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-config\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097789 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61312a96-b8f6-431c-b24e-0046271cf40f-config\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: \"61312a96-b8f6-431c-b24e-0046271cf40f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097807 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097825 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097844 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097893 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f30207aa-a4d2-41bb-8f36-8c6809d96191-auth-proxy-config\") pod \"machine-approver-56656f9798-5wxvl\" (UID: \"f30207aa-a4d2-41bb-8f36-8c6809d96191\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097911 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-serving-cert\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097932 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-oauth-config\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097948 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097963 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/151b2a1f-2df4-49d4-9e55-260eebbb267f-serving-cert\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097983 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/151b2a1f-2df4-49d4-9e55-260eebbb267f-etcd-service-ca\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.097998 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-client-ca\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.098013 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.098044 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f6455f2-bcad-4e11-8ef5-a272b406be88-config\") pod \"console-operator-58897d9998-wkpzg\" (UID: \"1f6455f2-bcad-4e11-8ef5-a272b406be88\") " pod="openshift-console-operator/console-operator-58897d9998-wkpzg"
\"kubernetes.io/configmap/1f6455f2-bcad-4e11-8ef5-a272b406be88-config\") pod \"console-operator-58897d9998-wkpzg\" (UID: \"1f6455f2-bcad-4e11-8ef5-a272b406be88\") " pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.098060 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f6455f2-bcad-4e11-8ef5-a272b406be88-serving-cert\") pod \"console-operator-58897d9998-wkpzg\" (UID: \"1f6455f2-bcad-4e11-8ef5-a272b406be88\") " pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.098078 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5lpg\" (UniqueName: \"kubernetes.io/projected/6775c6a2-49ba-48fb-9f8f-ff26a7155618-kube-api-access-m5lpg\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.098099 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmzvw\" (UniqueName: \"kubernetes.io/projected/beee0ec0-e83b-41df-b1c5-b6dadb908961-kube-api-access-cmzvw\") pod \"openshift-config-operator-7777fb866f-zkxq4\" (UID: \"beee0ec0-e83b-41df-b1c5-b6dadb908961\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.098123 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/151b2a1f-2df4-49d4-9e55-260eebbb267f-etcd-client\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.098145 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dr2q\" (UniqueName: \"kubernetes.io/projected/151b2a1f-2df4-49d4-9e55-260eebbb267f-kube-api-access-4dr2q\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.098171 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f6455f2-bcad-4e11-8ef5-a272b406be88-trusted-ca\") pod \"console-operator-58897d9998-wkpzg\" (UID: \"1f6455f2-bcad-4e11-8ef5-a272b406be88\") " pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.098186 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb-srv-cert\") pod \"olm-operator-6b444d44fb-qp2l4\" (UID: \"3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.100520 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61312a96-b8f6-431c-b24e-0046271cf40f-config\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: 
\"61312a96-b8f6-431c-b24e-0046271cf40f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.101707 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59291c5b-082b-467b-b87b-cf9af3e613b1-config\") pod \"kube-apiserver-operator-766d6c64bb-fbl76\" (UID: \"59291c5b-082b-467b-b87b-cf9af3e613b1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.101781 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ffvs\" (UniqueName: \"kubernetes.io/projected/61312a96-b8f6-431c-b24e-0046271cf40f-kube-api-access-6ffvs\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: \"61312a96-b8f6-431c-b24e-0046271cf40f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.101825 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59291c5b-082b-467b-b87b-cf9af3e613b1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fbl76\" (UID: \"59291c5b-082b-467b-b87b-cf9af3e613b1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.101879 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/337d5692-12d3-4c0a-8187-eb66a2666e95-profile-collector-cert\") pod \"catalog-operator-68c6474976-t25rf\" (UID: \"337d5692-12d3-4c0a-8187-eb66a2666e95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.101908 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a69d09-d3e2-4af9-857a-3229bc05c992-config\") pod \"machine-api-operator-5694c8668f-t6z8c\" (UID: \"d1a69d09-d3e2-4af9-857a-3229bc05c992\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.101941 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f30207aa-a4d2-41bb-8f36-8c6809d96191-machine-approver-tls\") pod \"machine-approver-56656f9798-5wxvl\" (UID: \"f30207aa-a4d2-41bb-8f36-8c6809d96191\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.101976 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/beee0ec0-e83b-41df-b1c5-b6dadb908961-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zkxq4\" (UID: \"beee0ec0-e83b-41df-b1c5-b6dadb908961\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102084 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d1a69d09-d3e2-4af9-857a-3229bc05c992-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t6z8c\" (UID: \"d1a69d09-d3e2-4af9-857a-3229bc05c992\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102132 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m67cz\" (UniqueName: \"kubernetes.io/projected/56e5be14-f33f-4db0-a372-77b3fd4d9510-kube-api-access-m67cz\") pod \"cluster-samples-operator-665b6dd947-9gpjt\" (UID: \"56e5be14-f33f-4db0-a372-77b3fd4d9510\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102166 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdk49\" (UniqueName: \"kubernetes.io/projected/e9a38022-ccc3-43cb-8af2-4252aef56bf8-kube-api-access-vdk49\") pod \"openshift-apiserver-operator-796bbdcf4f-95tgr\" (UID: \"e9a38022-ccc3-43cb-8af2-4252aef56bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102192 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61312a96-b8f6-431c-b24e-0046271cf40f-serving-cert\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: \"61312a96-b8f6-431c-b24e-0046271cf40f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102215 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102239 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30207aa-a4d2-41bb-8f36-8c6809d96191-config\") pod \"machine-approver-56656f9798-5wxvl\" (UID: \"f30207aa-a4d2-41bb-8f36-8c6809d96191\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102263 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d4d2050-65bd-4d00-893c-c7f296d90926-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bjqw4\" (UID: \"2d4d2050-65bd-4d00-893c-c7f296d90926\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102291 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-trusted-ca-bundle\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102311 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102337 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6d61e80-043f-4ece-a6a6-eed6357749f5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8pxb\" (UID: \"c6d61e80-043f-4ece-a6a6-eed6357749f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102363 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/56e5be14-f33f-4db0-a372-77b3fd4d9510-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9gpjt\" (UID: \"56e5be14-f33f-4db0-a372-77b3fd4d9510\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102387 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7333389-f183-4d12-b140-3f332e49eaec-metrics-tls\") pod \"dns-operator-744455d44c-c5h5p\" (UID: \"c7333389-f183-4d12-b140-3f332e49eaec\") " pod="openshift-dns-operator/dns-operator-744455d44c-c5h5p" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102412 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-config\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102430 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-audit\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102455 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qp2l4\" (UID: \"3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102476 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2c7j\" (UniqueName: \"kubernetes.io/projected/3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb-kube-api-access-j2c7j\") pod \"olm-operator-6b444d44fb-qp2l4\" (UID: \"3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102500 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-audit-dir\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102518 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-image-import-ca\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102540 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102561 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9a38022-ccc3-43cb-8af2-4252aef56bf8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-95tgr\" (UID: \"e9a38022-ccc3-43cb-8af2-4252aef56bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102582 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1a69d09-d3e2-4af9-857a-3229bc05c992-images\") pod \"machine-api-operator-5694c8668f-t6z8c\" (UID: \"d1a69d09-d3e2-4af9-857a-3229bc05c992\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102604 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vsh2\" (UniqueName: \"kubernetes.io/projected/f30207aa-a4d2-41bb-8f36-8c6809d96191-kube-api-access-8vsh2\") pod \"machine-approver-56656f9798-5wxvl\" (UID: \"f30207aa-a4d2-41bb-8f36-8c6809d96191\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102624 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-oauth-serving-cert\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102649 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-service-ca\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102669 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beee0ec0-e83b-41df-b1c5-b6dadb908961-serving-cert\") pod \"openshift-config-operator-7777fb866f-zkxq4\" (UID: \"beee0ec0-e83b-41df-b1c5-b6dadb908961\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102705 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-audit-policies\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102726 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjjls\" (UniqueName: \"kubernetes.io/projected/c6d61e80-043f-4ece-a6a6-eed6357749f5-kube-api-access-qjjls\") pod \"marketplace-operator-79b997595-p8pxb\" (UID: \"c6d61e80-043f-4ece-a6a6-eed6357749f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102754 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q582d\" (UniqueName: \"kubernetes.io/projected/2d4d2050-65bd-4d00-893c-c7f296d90926-kube-api-access-q582d\") pod \"cluster-image-registry-operator-dc59b4c8b-bjqw4\" (UID: \"2d4d2050-65bd-4d00-893c-c7f296d90926\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102777 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bc33ee8-964b-4b03-b564-5c66068629b9-serving-cert\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102801 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151b2a1f-2df4-49d4-9e55-260eebbb267f-config\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102823 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-serving-cert\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102844 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d4d2050-65bd-4d00-893c-c7f296d90926-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bjqw4\" (UID: \"2d4d2050-65bd-4d00-893c-c7f296d90926\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102884 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6d61e80-043f-4ece-a6a6-eed6357749f5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8pxb\" (UID: \"c6d61e80-043f-4ece-a6a6-eed6357749f5\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102906 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgb7w\" (UniqueName: \"kubernetes.io/projected/1f6455f2-bcad-4e11-8ef5-a272b406be88-kube-api-access-fgb7w\") pod \"console-operator-58897d9998-wkpzg\" (UID: \"1f6455f2-bcad-4e11-8ef5-a272b406be88\") " pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102931 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102954 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-encryption-config\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102980 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61312a96-b8f6-431c-b24e-0046271cf40f-service-ca-bundle\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: \"61312a96-b8f6-431c-b24e-0046271cf40f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.102998 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrc6q\" (UniqueName: \"kubernetes.io/projected/9ae03d73-b21d-4004-a000-e49a547ef19d-kube-api-access-wrc6q\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103018 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59291c5b-082b-467b-b87b-cf9af3e613b1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fbl76\" (UID: \"59291c5b-082b-467b-b87b-cf9af3e613b1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103040 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9a38022-ccc3-43cb-8af2-4252aef56bf8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-95tgr\" (UID: \"e9a38022-ccc3-43cb-8af2-4252aef56bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103068 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61312a96-b8f6-431c-b24e-0046271cf40f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: \"61312a96-b8f6-431c-b24e-0046271cf40f\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103094 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103112 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-etcd-serving-ca\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103137 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bn88\" (UniqueName: \"kubernetes.io/projected/337d5692-12d3-4c0a-8187-eb66a2666e95-kube-api-access-5bn88\") pod \"catalog-operator-68c6474976-t25rf\" (UID: \"337d5692-12d3-4c0a-8187-eb66a2666e95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103158 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgfd8\" (UniqueName: \"kubernetes.io/projected/e21fc837-8de2-4af5-a375-b14567f47d67-kube-api-access-dgfd8\") pod \"downloads-7954f5f757-b5c74\" (UID: \"e21fc837-8de2-4af5-a375-b14567f47d67\") " pod="openshift-console/downloads-7954f5f757-b5c74" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103202 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d4d2050-65bd-4d00-893c-c7f296d90926-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bjqw4\" (UID: \"2d4d2050-65bd-4d00-893c-c7f296d90926\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103225 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9ae03d73-b21d-4004-a000-e49a547ef19d-audit-dir\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103246 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-config\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103268 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103320 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103344 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nffd8\" (UniqueName: \"kubernetes.io/projected/9bc33ee8-964b-4b03-b564-5c66068629b9-kube-api-access-nffd8\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103367 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103390 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/151b2a1f-2df4-49d4-9e55-260eebbb267f-etcd-ca\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103414 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w58j4\" (UniqueName: \"kubernetes.io/projected/d1a69d09-d3e2-4af9-857a-3229bc05c992-kube-api-access-w58j4\") pod \"machine-api-operator-5694c8668f-t6z8c\" (UID: \"d1a69d09-d3e2-4af9-857a-3229bc05c992\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103433 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddmvl\" (UniqueName: \"kubernetes.io/projected/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-kube-api-access-ddmvl\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103456 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-etcd-client\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103480 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2279\" (UniqueName: \"kubernetes.io/projected/c7333389-f183-4d12-b140-3f332e49eaec-kube-api-access-l2279\") pod \"dns-operator-744455d44c-c5h5p\" (UID: \"c7333389-f183-4d12-b140-3f332e49eaec\") " pod="openshift-dns-operator/dns-operator-744455d44c-c5h5p"
\"kube-api-access-l2279\" (UniqueName: \"kubernetes.io/projected/c7333389-f183-4d12-b140-3f332e49eaec-kube-api-access-l2279\") pod \"dns-operator-744455d44c-c5h5p\" (UID: \"c7333389-f183-4d12-b140-3f332e49eaec\") " pod="openshift-dns-operator/dns-operator-744455d44c-c5h5p" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.103712 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8pxb"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.105014 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.106024 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-trusted-ca-bundle\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.107073 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9ae03d73-b21d-4004-a000-e49a547ef19d-audit-dir\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.107297 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61312a96-b8f6-431c-b24e-0046271cf40f-service-ca-bundle\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: \"61312a96-b8f6-431c-b24e-0046271cf40f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.107363 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.108260 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61312a96-b8f6-431c-b24e-0046271cf40f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: \"61312a96-b8f6-431c-b24e-0046271cf40f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.108983 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-config\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.109008 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 09 
13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.109188 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-config\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.109256 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.109715 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-oauth-serving-cert\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.110211 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-client-ca\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.110293 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.110840 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-service-ca\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.110898 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-audit-policies\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.111381 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-oauth-config\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.112811 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.113320 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/61312a96-b8f6-431c-b24e-0046271cf40f-serving-cert\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: \"61312a96-b8f6-431c-b24e-0046271cf40f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.114000 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.115290 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.116632 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-serving-cert\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.117034 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.118515 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.118602 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.118553 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.118951 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.124779 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.124822 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.124797 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bc33ee8-964b-4b03-b564-5c66068629b9-serving-cert\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.125224 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.125261 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.141361 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.142480 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.143374 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.146667 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.148633 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kdxpm"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.150488 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xzd59"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.151261 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-r5ltl"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.152287 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.153669 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.154953 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8"] Mar 09 13:01:47 
crc kubenswrapper[4723]: I0309 13:01:47.155541 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n5lnt"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.156588 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gqscf"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.158252 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.158461 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.158839 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.159694 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.160700 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.161686 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s6gh6"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.162730 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.163637 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4xwcm"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.164607 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.165777 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.166717 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.167736 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gqscf"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.168699 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4hnlr"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.169644 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4hnlr"] Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.169800 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4hnlr" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.174268 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.204451 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/151b2a1f-2df4-49d4-9e55-260eebbb267f-serving-cert\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.204565 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/151b2a1f-2df4-49d4-9e55-260eebbb267f-etcd-service-ca\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.204652 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ca7c55-d937-4670-b10c-a8aaf4b77c84-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jh2mc\" (UID: \"95ca7c55-d937-4670-b10c-a8aaf4b77c84\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.204734 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f6455f2-bcad-4e11-8ef5-a272b406be88-config\") pod \"console-operator-58897d9998-wkpzg\" (UID: \"1f6455f2-bcad-4e11-8ef5-a272b406be88\") " pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.204943 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f6455f2-bcad-4e11-8ef5-a272b406be88-serving-cert\") pod \"console-operator-58897d9998-wkpzg\" (UID: \"1f6455f2-bcad-4e11-8ef5-a272b406be88\") " pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.205039 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmzvw\" (UniqueName: \"kubernetes.io/projected/beee0ec0-e83b-41df-b1c5-b6dadb908961-kube-api-access-cmzvw\") pod \"openshift-config-operator-7777fb866f-zkxq4\" (UID: \"beee0ec0-e83b-41df-b1c5-b6dadb908961\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.205114 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e7aa68-dd0a-4803-93c1-d3d824033cad-serving-cert\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.205179 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/151b2a1f-2df4-49d4-9e55-260eebbb267f-etcd-client\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.205241 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3321a715-9c5f-4417-bec1-4ba3ccce946c-apiservice-cert\") pod \"packageserver-d55dfcdfc-fhxc9\" (UID: \"3321a715-9c5f-4417-bec1-4ba3ccce946c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.205312 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e0e7aa68-dd0a-4803-93c1-d3d824033cad-etcd-client\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.205380 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dr2q\" (UniqueName: \"kubernetes.io/projected/151b2a1f-2df4-49d4-9e55-260eebbb267f-kube-api-access-4dr2q\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.205558 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f6455f2-bcad-4e11-8ef5-a272b406be88-trusted-ca\") pod \"console-operator-58897d9998-wkpzg\" (UID: \"1f6455f2-bcad-4e11-8ef5-a272b406be88\") " pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.205630 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb-srv-cert\") pod \"olm-operator-6b444d44fb-qp2l4\" (UID: \"3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.205703 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59291c5b-082b-467b-b87b-cf9af3e613b1-config\") pod \"kube-apiserver-operator-766d6c64bb-fbl76\" (UID: \"59291c5b-082b-467b-b87b-cf9af3e613b1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.205803 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59291c5b-082b-467b-b87b-cf9af3e613b1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fbl76\" (UID: \"59291c5b-082b-467b-b87b-cf9af3e613b1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.205956 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/337d5692-12d3-4c0a-8187-eb66a2666e95-profile-collector-cert\") pod \"catalog-operator-68c6474976-t25rf\" (UID: \"337d5692-12d3-4c0a-8187-eb66a2666e95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.206063 4723 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a69d09-d3e2-4af9-857a-3229bc05c992-config\") pod \"machine-api-operator-5694c8668f-t6z8c\" (UID: \"d1a69d09-d3e2-4af9-857a-3229bc05c992\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.206161 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f30207aa-a4d2-41bb-8f36-8c6809d96191-machine-approver-tls\") pod \"machine-approver-56656f9798-5wxvl\" (UID: \"f30207aa-a4d2-41bb-8f36-8c6809d96191\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.206239 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/beee0ec0-e83b-41df-b1c5-b6dadb908961-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zkxq4\" (UID: \"beee0ec0-e83b-41df-b1c5-b6dadb908961\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.206346 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95ca7c55-d937-4670-b10c-a8aaf4b77c84-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jh2mc\" (UID: \"95ca7c55-d937-4670-b10c-a8aaf4b77c84\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.206499 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d1a69d09-d3e2-4af9-857a-3229bc05c992-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t6z8c\" (UID: \"d1a69d09-d3e2-4af9-857a-3229bc05c992\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.206585 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95ca7c55-d937-4670-b10c-a8aaf4b77c84-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jh2mc\" (UID: \"95ca7c55-d937-4670-b10c-a8aaf4b77c84\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.206656 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m67cz\" (UniqueName: \"kubernetes.io/projected/56e5be14-f33f-4db0-a372-77b3fd4d9510-kube-api-access-m67cz\") pod \"cluster-samples-operator-665b6dd947-9gpjt\" (UID: \"56e5be14-f33f-4db0-a372-77b3fd4d9510\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.206729 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdk49\" (UniqueName: \"kubernetes.io/projected/e9a38022-ccc3-43cb-8af2-4252aef56bf8-kube-api-access-vdk49\") pod \"openshift-apiserver-operator-796bbdcf4f-95tgr\" (UID: \"e9a38022-ccc3-43cb-8af2-4252aef56bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.206815 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/151b2a1f-2df4-49d4-9e55-260eebbb267f-etcd-service-ca\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.206830 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30207aa-a4d2-41bb-8f36-8c6809d96191-config\") pod \"machine-approver-56656f9798-5wxvl\" (UID: \"f30207aa-a4d2-41bb-8f36-8c6809d96191\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.206961 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d4d2050-65bd-4d00-893c-c7f296d90926-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bjqw4\" (UID: \"2d4d2050-65bd-4d00-893c-c7f296d90926\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.206738 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59291c5b-082b-467b-b87b-cf9af3e613b1-config\") pod \"kube-apiserver-operator-766d6c64bb-fbl76\" (UID: \"59291c5b-082b-467b-b87b-cf9af3e613b1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.207166 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/beee0ec0-e83b-41df-b1c5-b6dadb908961-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zkxq4\" (UID: \"beee0ec0-e83b-41df-b1c5-b6dadb908961\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.207344 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0383bca0-800c-4f21-a7ee-32e42609b47e-signing-key\") pod \"service-ca-9c57cc56f-4xwcm\" (UID: \"0383bca0-800c-4f21-a7ee-32e42609b47e\") " pod="openshift-service-ca/service-ca-9c57cc56f-4xwcm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.207401 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6d61e80-043f-4ece-a6a6-eed6357749f5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8pxb\" (UID: \"c6d61e80-043f-4ece-a6a6-eed6357749f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.207521 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/56e5be14-f33f-4db0-a372-77b3fd4d9510-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9gpjt\" (UID: \"56e5be14-f33f-4db0-a372-77b3fd4d9510\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.207578 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gswvk\" (UniqueName: 
\"kubernetes.io/projected/e0e7aa68-dd0a-4803-93c1-d3d824033cad-kube-api-access-gswvk\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.205854 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f6455f2-bcad-4e11-8ef5-a272b406be88-config\") pod \"console-operator-58897d9998-wkpzg\" (UID: \"1f6455f2-bcad-4e11-8ef5-a272b406be88\") " pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.207677 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7333389-f183-4d12-b140-3f332e49eaec-metrics-tls\") pod \"dns-operator-744455d44c-c5h5p\" (UID: \"c7333389-f183-4d12-b140-3f332e49eaec\") " pod="openshift-dns-operator/dns-operator-744455d44c-c5h5p" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.207774 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-audit\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.207820 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0383bca0-800c-4f21-a7ee-32e42609b47e-signing-cabundle\") pod \"service-ca-9c57cc56f-4xwcm\" (UID: \"0383bca0-800c-4f21-a7ee-32e42609b47e\") " pod="openshift-service-ca/service-ca-9c57cc56f-4xwcm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.207904 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qp2l4\" (UID: \"3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.207948 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2c7j\" (UniqueName: \"kubernetes.io/projected/3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb-kube-api-access-j2c7j\") pod \"olm-operator-6b444d44fb-qp2l4\" (UID: \"3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.207976 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f6455f2-bcad-4e11-8ef5-a272b406be88-trusted-ca\") pod \"console-operator-58897d9998-wkpzg\" (UID: \"1f6455f2-bcad-4e11-8ef5-a272b406be88\") " pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208009 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-audit-dir\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208055 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-audit-dir\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208076 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-image-import-ca\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208102 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208129 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9a38022-ccc3-43cb-8af2-4252aef56bf8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-95tgr\" (UID: \"e9a38022-ccc3-43cb-8af2-4252aef56bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208151 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1a69d09-d3e2-4af9-857a-3229bc05c992-images\") pod \"machine-api-operator-5694c8668f-t6z8c\" (UID: \"d1a69d09-d3e2-4af9-857a-3229bc05c992\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208175 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vsh2\" (UniqueName: \"kubernetes.io/projected/f30207aa-a4d2-41bb-8f36-8c6809d96191-kube-api-access-8vsh2\") pod \"machine-approver-56656f9798-5wxvl\" (UID: \"f30207aa-a4d2-41bb-8f36-8c6809d96191\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208220 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beee0ec0-e83b-41df-b1c5-b6dadb908961-serving-cert\") pod \"openshift-config-operator-7777fb866f-zkxq4\" (UID: \"beee0ec0-e83b-41df-b1c5-b6dadb908961\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208275 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjjls\" (UniqueName: \"kubernetes.io/projected/c6d61e80-043f-4ece-a6a6-eed6357749f5-kube-api-access-qjjls\") pod \"marketplace-operator-79b997595-p8pxb\" (UID: \"c6d61e80-043f-4ece-a6a6-eed6357749f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208302 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q582d\" (UniqueName: \"kubernetes.io/projected/2d4d2050-65bd-4d00-893c-c7f296d90926-kube-api-access-q582d\") pod 
\"cluster-image-registry-operator-dc59b4c8b-bjqw4\" (UID: \"2d4d2050-65bd-4d00-893c-c7f296d90926\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208336 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3321a715-9c5f-4417-bec1-4ba3ccce946c-webhook-cert\") pod \"packageserver-d55dfcdfc-fhxc9\" (UID: \"3321a715-9c5f-4417-bec1-4ba3ccce946c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208368 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151b2a1f-2df4-49d4-9e55-260eebbb267f-config\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208402 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-serving-cert\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208427 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d4d2050-65bd-4d00-893c-c7f296d90926-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bjqw4\" (UID: \"2d4d2050-65bd-4d00-893c-c7f296d90926\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208451 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6d61e80-043f-4ece-a6a6-eed6357749f5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8pxb\" (UID: \"c6d61e80-043f-4ece-a6a6-eed6357749f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208492 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgb7w\" (UniqueName: \"kubernetes.io/projected/1f6455f2-bcad-4e11-8ef5-a272b406be88-kube-api-access-fgb7w\") pod \"console-operator-58897d9998-wkpzg\" (UID: \"1f6455f2-bcad-4e11-8ef5-a272b406be88\") " pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208518 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-encryption-config\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208549 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59291c5b-082b-467b-b87b-cf9af3e613b1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fbl76\" (UID: \"59291c5b-082b-467b-b87b-cf9af3e613b1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76" Mar 09 13:01:47 crc 
kubenswrapper[4723]: I0309 13:01:47.208574 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9a38022-ccc3-43cb-8af2-4252aef56bf8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-95tgr\" (UID: \"e9a38022-ccc3-43cb-8af2-4252aef56bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208604 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-etcd-serving-ca\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208626 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bn88\" (UniqueName: \"kubernetes.io/projected/337d5692-12d3-4c0a-8187-eb66a2666e95-kube-api-access-5bn88\") pod \"catalog-operator-68c6474976-t25rf\" (UID: \"337d5692-12d3-4c0a-8187-eb66a2666e95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208647 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgfd8\" (UniqueName: \"kubernetes.io/projected/e21fc837-8de2-4af5-a375-b14567f47d67-kube-api-access-dgfd8\") pod \"downloads-7954f5f757-b5c74\" (UID: \"e21fc837-8de2-4af5-a375-b14567f47d67\") " pod="openshift-console/downloads-7954f5f757-b5c74" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208667 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d4d2050-65bd-4d00-893c-c7f296d90926-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bjqw4\" (UID: \"2d4d2050-65bd-4d00-893c-c7f296d90926\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208693 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-node-pullsecrets\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208718 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/09470b31-c2ae-42f8-8490-c446e979042d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p9x2d\" (UID: \"09470b31-c2ae-42f8-8490-c446e979042d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208732 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6d61e80-043f-4ece-a6a6-eed6357749f5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8pxb\" (UID: \"c6d61e80-043f-4ece-a6a6-eed6357749f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208752 4723 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0e7aa68-dd0a-4803-93c1-d3d824033cad-audit-policies\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208806 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/151b2a1f-2df4-49d4-9e55-260eebbb267f-etcd-ca\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208838 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w58j4\" (UniqueName: \"kubernetes.io/projected/d1a69d09-d3e2-4af9-857a-3229bc05c992-kube-api-access-w58j4\") pod \"machine-api-operator-5694c8668f-t6z8c\" (UID: \"d1a69d09-d3e2-4af9-857a-3229bc05c992\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208934 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v96j\" (UniqueName: \"kubernetes.io/projected/3321a715-9c5f-4417-bec1-4ba3ccce946c-kube-api-access-7v96j\") pod \"packageserver-d55dfcdfc-fhxc9\" (UID: \"3321a715-9c5f-4417-bec1-4ba3ccce946c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208962 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcbln\" (UniqueName: \"kubernetes.io/projected/0383bca0-800c-4f21-a7ee-32e42609b47e-kube-api-access-fcbln\") pod \"service-ca-9c57cc56f-4xwcm\" (UID: \"0383bca0-800c-4f21-a7ee-32e42609b47e\") " pod="openshift-service-ca/service-ca-9c57cc56f-4xwcm" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.208988 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddmvl\" (UniqueName: \"kubernetes.io/projected/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-kube-api-access-ddmvl\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209012 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-etcd-client\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209036 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2279\" (UniqueName: \"kubernetes.io/projected/c7333389-f183-4d12-b140-3f332e49eaec-kube-api-access-l2279\") pod \"dns-operator-744455d44c-c5h5p\" (UID: \"c7333389-f183-4d12-b140-3f332e49eaec\") " pod="openshift-dns-operator/dns-operator-744455d44c-c5h5p" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209059 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e0e7aa68-dd0a-4803-93c1-d3d824033cad-encryption-config\") pod 
\"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209084 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/337d5692-12d3-4c0a-8187-eb66a2666e95-srv-cert\") pod \"catalog-operator-68c6474976-t25rf\" (UID: \"337d5692-12d3-4c0a-8187-eb66a2666e95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209106 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3321a715-9c5f-4417-bec1-4ba3ccce946c-tmpfs\") pod \"packageserver-d55dfcdfc-fhxc9\" (UID: \"3321a715-9c5f-4417-bec1-4ba3ccce946c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209129 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfclc\" (UniqueName: \"kubernetes.io/projected/9d680903-3aac-4f3d-8e55-5fb9ee1cb46a-kube-api-access-kfclc\") pod \"migrator-59844c95c7-r5ltl\" (UID: \"9d680903-3aac-4f3d-8e55-5fb9ee1cb46a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r5ltl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209293 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-config\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209343 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e7aa68-dd0a-4803-93c1-d3d824033cad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209374 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0e7aa68-dd0a-4803-93c1-d3d824033cad-audit-dir\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209403 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e0e7aa68-dd0a-4803-93c1-d3d824033cad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209457 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f30207aa-a4d2-41bb-8f36-8c6809d96191-auth-proxy-config\") pod \"machine-approver-56656f9798-5wxvl\" (UID: \"f30207aa-a4d2-41bb-8f36-8c6809d96191\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209482 4723 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbstn\" (UniqueName: \"kubernetes.io/projected/09470b31-c2ae-42f8-8490-c446e979042d-kube-api-access-lbstn\") pod \"package-server-manager-789f6589d5-p9x2d\" (UID: \"09470b31-c2ae-42f8-8490-c446e979042d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209770 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30207aa-a4d2-41bb-8f36-8c6809d96191-config\") pod \"machine-approver-56656f9798-5wxvl\" (UID: \"f30207aa-a4d2-41bb-8f36-8c6809d96191\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209971 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/151b2a1f-2df4-49d4-9e55-260eebbb267f-etcd-ca\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.209972 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-audit\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.210017 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-node-pullsecrets\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.210513 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-etcd-serving-ca\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.210659 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f30207aa-a4d2-41bb-8f36-8c6809d96191-machine-approver-tls\") pod \"machine-approver-56656f9798-5wxvl\" (UID: \"f30207aa-a4d2-41bb-8f36-8c6809d96191\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.210691 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-image-import-ca\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.210713 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 
13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.211116 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d4d2050-65bd-4d00-893c-c7f296d90926-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bjqw4\" (UID: \"2d4d2050-65bd-4d00-893c-c7f296d90926\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.211185 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9a38022-ccc3-43cb-8af2-4252aef56bf8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-95tgr\" (UID: \"e9a38022-ccc3-43cb-8af2-4252aef56bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.211252 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/56e5be14-f33f-4db0-a372-77b3fd4d9510-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9gpjt\" (UID: \"56e5be14-f33f-4db0-a372-77b3fd4d9510\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.211677 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1a69d09-d3e2-4af9-857a-3229bc05c992-images\") pod \"machine-api-operator-5694c8668f-t6z8c\" (UID: \"d1a69d09-d3e2-4af9-857a-3229bc05c992\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.207773 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a69d09-d3e2-4af9-857a-3229bc05c992-config\") pod \"machine-api-operator-5694c8668f-t6z8c\" (UID: \"d1a69d09-d3e2-4af9-857a-3229bc05c992\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.212371 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7333389-f183-4d12-b140-3f332e49eaec-metrics-tls\") pod \"dns-operator-744455d44c-c5h5p\" (UID: \"c7333389-f183-4d12-b140-3f332e49eaec\") " pod="openshift-dns-operator/dns-operator-744455d44c-c5h5p"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.212690 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-config\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.212998 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151b2a1f-2df4-49d4-9e55-260eebbb267f-config\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.213125 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f30207aa-a4d2-41bb-8f36-8c6809d96191-auth-proxy-config\") pod \"machine-approver-56656f9798-5wxvl\" (UID: \"f30207aa-a4d2-41bb-8f36-8c6809d96191\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.213397 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/151b2a1f-2df4-49d4-9e55-260eebbb267f-serving-cert\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.213922 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/151b2a1f-2df4-49d4-9e55-260eebbb267f-etcd-client\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.214102 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.214207 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qp2l4\" (UID: \"3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.214342 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb-srv-cert\") pod \"olm-operator-6b444d44fb-qp2l4\" (UID: \"3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.215359 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-etcd-client\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.215396 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f6455f2-bcad-4e11-8ef5-a272b406be88-serving-cert\") pod \"console-operator-58897d9998-wkpzg\" (UID: \"1f6455f2-bcad-4e11-8ef5-a272b406be88\") " pod="openshift-console-operator/console-operator-58897d9998-wkpzg"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.215669 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9a38022-ccc3-43cb-8af2-4252aef56bf8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-95tgr\" (UID: \"e9a38022-ccc3-43cb-8af2-4252aef56bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.215910 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beee0ec0-e83b-41df-b1c5-b6dadb908961-serving-cert\") pod \"openshift-config-operator-7777fb866f-zkxq4\" (UID: \"beee0ec0-e83b-41df-b1c5-b6dadb908961\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.216243 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d4d2050-65bd-4d00-893c-c7f296d90926-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bjqw4\" (UID: \"2d4d2050-65bd-4d00-893c-c7f296d90926\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.216378 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6d61e80-043f-4ece-a6a6-eed6357749f5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8pxb\" (UID: \"c6d61e80-043f-4ece-a6a6-eed6357749f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.216605 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d1a69d09-d3e2-4af9-857a-3229bc05c992-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t6z8c\" (UID: \"d1a69d09-d3e2-4af9-857a-3229bc05c992\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.216771 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/337d5692-12d3-4c0a-8187-eb66a2666e95-srv-cert\") pod \"catalog-operator-68c6474976-t25rf\" (UID: \"337d5692-12d3-4c0a-8187-eb66a2666e95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.217170 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-encryption-config\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.217259 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59291c5b-082b-467b-b87b-cf9af3e613b1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fbl76\" (UID: \"59291c5b-082b-467b-b87b-cf9af3e613b1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.221348 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-serving-cert\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.221789 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/337d5692-12d3-4c0a-8187-eb66a2666e95-profile-collector-cert\") pod \"catalog-operator-68c6474976-t25rf\" (UID: \"337d5692-12d3-4c0a-8187-eb66a2666e95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.235345 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vmcrt"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.235800 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.236052 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vmcrt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.242020 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vmcrt"]
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.255012 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.274508 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.294925 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.310734 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95ca7c55-d937-4670-b10c-a8aaf4b77c84-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jh2mc\" (UID: \"95ca7c55-d937-4670-b10c-a8aaf4b77c84\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.310791 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95ca7c55-d937-4670-b10c-a8aaf4b77c84-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jh2mc\" (UID: \"95ca7c55-d937-4670-b10c-a8aaf4b77c84\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.310839 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0383bca0-800c-4f21-a7ee-32e42609b47e-signing-key\") pod \"service-ca-9c57cc56f-4xwcm\" (UID: \"0383bca0-800c-4f21-a7ee-32e42609b47e\") " pod="openshift-service-ca/service-ca-9c57cc56f-4xwcm"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.310889 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gswvk\" (UniqueName: \"kubernetes.io/projected/e0e7aa68-dd0a-4803-93c1-d3d824033cad-kube-api-access-gswvk\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.310918 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0383bca0-800c-4f21-a7ee-32e42609b47e-signing-cabundle\") pod \"service-ca-9c57cc56f-4xwcm\" (UID: \"0383bca0-800c-4f21-a7ee-32e42609b47e\") " pod="openshift-service-ca/service-ca-9c57cc56f-4xwcm"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.310990 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3321a715-9c5f-4417-bec1-4ba3ccce946c-webhook-cert\") pod \"packageserver-d55dfcdfc-fhxc9\" (UID: \"3321a715-9c5f-4417-bec1-4ba3ccce946c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.311058 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/09470b31-c2ae-42f8-8490-c446e979042d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p9x2d\" (UID: \"09470b31-c2ae-42f8-8490-c446e979042d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.311091 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0e7aa68-dd0a-4803-93c1-d3d824033cad-audit-policies\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.311142 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v96j\" (UniqueName: \"kubernetes.io/projected/3321a715-9c5f-4417-bec1-4ba3ccce946c-kube-api-access-7v96j\") pod \"packageserver-d55dfcdfc-fhxc9\" (UID: \"3321a715-9c5f-4417-bec1-4ba3ccce946c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.311169 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcbln\" (UniqueName: \"kubernetes.io/projected/0383bca0-800c-4f21-a7ee-32e42609b47e-kube-api-access-fcbln\") pod \"service-ca-9c57cc56f-4xwcm\" (UID: \"0383bca0-800c-4f21-a7ee-32e42609b47e\") " pod="openshift-service-ca/service-ca-9c57cc56f-4xwcm"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.311198 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e0e7aa68-dd0a-4803-93c1-d3d824033cad-encryption-config\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.311219 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3321a715-9c5f-4417-bec1-4ba3ccce946c-tmpfs\") pod \"packageserver-d55dfcdfc-fhxc9\" (UID: \"3321a715-9c5f-4417-bec1-4ba3ccce946c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.311240 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e7aa68-dd0a-4803-93c1-d3d824033cad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.311260 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0e7aa68-dd0a-4803-93c1-d3d824033cad-audit-dir\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.311288 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e0e7aa68-dd0a-4803-93c1-d3d824033cad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.311337 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbstn\" (UniqueName: \"kubernetes.io/projected/09470b31-c2ae-42f8-8490-c446e979042d-kube-api-access-lbstn\") pod \"package-server-manager-789f6589d5-p9x2d\" (UID: \"09470b31-c2ae-42f8-8490-c446e979042d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.311373 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ca7c55-d937-4670-b10c-a8aaf4b77c84-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jh2mc\" (UID: \"95ca7c55-d937-4670-b10c-a8aaf4b77c84\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.311427 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e7aa68-dd0a-4803-93c1-d3d824033cad-serving-cert\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.311457 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3321a715-9c5f-4417-bec1-4ba3ccce946c-apiservice-cert\") pod \"packageserver-d55dfcdfc-fhxc9\" (UID: \"3321a715-9c5f-4417-bec1-4ba3ccce946c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.311484 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e0e7aa68-dd0a-4803-93c1-d3d824033cad-etcd-client\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.312649 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3321a715-9c5f-4417-bec1-4ba3ccce946c-tmpfs\") pod \"packageserver-d55dfcdfc-fhxc9\" (UID: \"3321a715-9c5f-4417-bec1-4ba3ccce946c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.312732 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0e7aa68-dd0a-4803-93c1-d3d824033cad-audit-dir\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.315144 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.334479 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.353991 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.375219 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.394268 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.414775 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.422797 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0e7aa68-dd0a-4803-93c1-d3d824033cad-audit-policies\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.434177 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.443596 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e7aa68-dd0a-4803-93c1-d3d824033cad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.454487 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.469114 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e7aa68-dd0a-4803-93c1-d3d824033cad-serving-cert\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.474322 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.484387 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e0e7aa68-dd0a-4803-93c1-d3d824033cad-encryption-config\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.494250 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.504764 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e0e7aa68-dd0a-4803-93c1-d3d824033cad-etcd-client\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.517282 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.523632 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e0e7aa68-dd0a-4803-93c1-d3d824033cad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.535048 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.554315 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.576082 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.595648 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.615594 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.635384 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.654461 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.674638 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.695426 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.714055 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.735966 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.754967 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.775174 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.794601 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.814005 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.833874 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.853764 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.875172 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.894649 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.908015 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/09470b31-c2ae-42f8-8490-c446e979042d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p9x2d\" (UID: \"09470b31-c2ae-42f8-8490-c446e979042d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.914615 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.934919 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.955094 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.975098 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 09 13:01:47 crc kubenswrapper[4723]: I0309 13:01:47.995024 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.014849 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.035568 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.055596 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.072539 4723 request.go:700] Waited for 1.007316948s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/secrets?fieldSelector=metadata.name%3Dingress-operator-dockercfg-7lnqk&limit=500&resourceVersion=0
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.075306 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.095722 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.115766 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.135775 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.155556 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.175049 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.205513 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.215825 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.234784 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.254495 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.273802 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.288513 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0383bca0-800c-4f21-a7ee-32e42609b47e-signing-key\") pod \"service-ca-9c57cc56f-4xwcm\" (UID: \"0383bca0-800c-4f21-a7ee-32e42609b47e\") " pod="openshift-service-ca/service-ca-9c57cc56f-4xwcm"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.295030 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.303309 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0383bca0-800c-4f21-a7ee-32e42609b47e-signing-cabundle\") pod \"service-ca-9c57cc56f-4xwcm\" (UID: \"0383bca0-800c-4f21-a7ee-32e42609b47e\") " pod="openshift-service-ca/service-ca-9c57cc56f-4xwcm"
Mar 09 13:01:48 crc kubenswrapper[4723]: E0309 13:01:48.312747 4723 secret.go:188] Couldn't get secret openshift-kube-scheduler-operator/kube-scheduler-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 13:01:48 crc kubenswrapper[4723]: E0309 13:01:48.312837 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95ca7c55-d937-4670-b10c-a8aaf4b77c84-serving-cert podName:95ca7c55-d937-4670-b10c-a8aaf4b77c84 nodeName:}" failed. No retries permitted until 2026-03-09 13:01:48.812814979 +0000 UTC m=+182.827282519 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/95ca7c55-d937-4670-b10c-a8aaf4b77c84-serving-cert") pod "openshift-kube-scheduler-operator-5fdd9b5758-jh2mc" (UID: "95ca7c55-d937-4670-b10c-a8aaf4b77c84") : failed to sync secret cache: timed out waiting for the condition
Mar 09 13:01:48 crc kubenswrapper[4723]: E0309 13:01:48.312741 4723 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 13:01:48 crc kubenswrapper[4723]: E0309 13:01:48.313035 4723 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 13:01:48 crc kubenswrapper[4723]: E0309 13:01:48.313060 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3321a715-9c5f-4417-bec1-4ba3ccce946c-webhook-cert podName:3321a715-9c5f-4417-bec1-4ba3ccce946c nodeName:}" failed. No retries permitted until 2026-03-09 13:01:48.813045545 +0000 UTC m=+182.827513085 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/3321a715-9c5f-4417-bec1-4ba3ccce946c-webhook-cert") pod "packageserver-d55dfcdfc-fhxc9" (UID: "3321a715-9c5f-4417-bec1-4ba3ccce946c") : failed to sync secret cache: timed out waiting for the condition
Mar 09 13:01:48 crc kubenswrapper[4723]: E0309 13:01:48.313018 4723 configmap.go:193] Couldn't get configMap openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-config: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:01:48 crc kubenswrapper[4723]: E0309 13:01:48.313080 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3321a715-9c5f-4417-bec1-4ba3ccce946c-apiservice-cert podName:3321a715-9c5f-4417-bec1-4ba3ccce946c nodeName:}" failed. No retries permitted until 2026-03-09 13:01:48.813069265 +0000 UTC m=+182.827536805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/3321a715-9c5f-4417-bec1-4ba3ccce946c-apiservice-cert") pod "packageserver-d55dfcdfc-fhxc9" (UID: "3321a715-9c5f-4417-bec1-4ba3ccce946c") : failed to sync secret cache: timed out waiting for the condition
Mar 09 13:01:48 crc kubenswrapper[4723]: E0309 13:01:48.313372 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/95ca7c55-d937-4670-b10c-a8aaf4b77c84-config podName:95ca7c55-d937-4670-b10c-a8aaf4b77c84 nodeName:}" failed. No retries permitted until 2026-03-09 13:01:48.813324362 +0000 UTC m=+182.827791932 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/95ca7c55-d937-4670-b10c-a8aaf4b77c84-config") pod "openshift-kube-scheduler-operator-5fdd9b5758-jh2mc" (UID: "95ca7c55-d937-4670-b10c-a8aaf4b77c84") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.314773 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.334567 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.354445 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.374135 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.394909 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.413986 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.433728 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.454547 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.475260 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.495088 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.515255 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.535604 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.555344 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.575169 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.595114 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.615274 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.635433 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.654715 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.675309 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.721664 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ffvs\" (UniqueName: \"kubernetes.io/projected/61312a96-b8f6-431c-b24e-0046271cf40f-kube-api-access-6ffvs\") pod \"authentication-operator-69f744f599-lwpcl\" (UID: \"61312a96-b8f6-431c-b24e-0046271cf40f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.732403 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrc6q\" (UniqueName: \"kubernetes.io/projected/9ae03d73-b21d-4004-a000-e49a547ef19d-kube-api-access-wrc6q\") pod \"oauth-openshift-558db77b4-dh6qm\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.750434 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5lpg\" (UniqueName: \"kubernetes.io/projected/6775c6a2-49ba-48fb-9f8f-ff26a7155618-kube-api-access-m5lpg\") pod \"console-f9d7485db-6gtjl\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " pod="openshift-console/console-f9d7485db-6gtjl"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.783702 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nffd8\" (UniqueName: \"kubernetes.io/projected/9bc33ee8-964b-4b03-b564-5c66068629b9-kube-api-access-nffd8\") pod \"controller-manager-879f6c89f-vb8n2\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.789738 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.799450 4723 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.807595 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6gtjl"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.815199 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.817811 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.835183 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.839442 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ca7c55-d937-4670-b10c-a8aaf4b77c84-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jh2mc\" (UID: \"95ca7c55-d937-4670-b10c-a8aaf4b77c84\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.839565 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3321a715-9c5f-4417-bec1-4ba3ccce946c-apiservice-cert\") pod \"packageserver-d55dfcdfc-fhxc9\" (UID: \"3321a715-9c5f-4417-bec1-4ba3ccce946c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.839661 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95ca7c55-d937-4670-b10c-a8aaf4b77c84-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jh2mc\" (UID: \"95ca7c55-d937-4670-b10c-a8aaf4b77c84\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.839794 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3321a715-9c5f-4417-bec1-4ba3ccce946c-webhook-cert\") pod \"packageserver-d55dfcdfc-fhxc9\" (UID: \"3321a715-9c5f-4417-bec1-4ba3ccce946c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.840181 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ca7c55-d937-4670-b10c-a8aaf4b77c84-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jh2mc\" (UID: \"95ca7c55-d937-4670-b10c-a8aaf4b77c84\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.842914 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95ca7c55-d937-4670-b10c-a8aaf4b77c84-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jh2mc\" (UID: \"95ca7c55-d937-4670-b10c-a8aaf4b77c84\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.842968 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3321a715-9c5f-4417-bec1-4ba3ccce946c-apiservice-cert\") pod \"packageserver-d55dfcdfc-fhxc9\" (UID: \"3321a715-9c5f-4417-bec1-4ba3ccce946c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.843925 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3321a715-9c5f-4417-bec1-4ba3ccce946c-webhook-cert\") pod \"packageserver-d55dfcdfc-fhxc9\" (UID: \"3321a715-9c5f-4417-bec1-4ba3ccce946c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.854521 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.874323 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.895663 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.938731 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dr2q\" (UniqueName: \"kubernetes.io/projected/151b2a1f-2df4-49d4-9e55-260eebbb267f-kube-api-access-4dr2q\") pod \"etcd-operator-b45778765-8w6mb\" (UID: \"151b2a1f-2df4-49d4-9e55-260eebbb267f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.963490 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.969320 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmzvw\" (UniqueName: \"kubernetes.io/projected/beee0ec0-e83b-41df-b1c5-b6dadb908961-kube-api-access-cmzvw\") pod \"openshift-config-operator-7777fb866f-zkxq4\" (UID: \"beee0ec0-e83b-41df-b1c5-b6dadb908961\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4"
Mar 09 13:01:48 crc kubenswrapper[4723]: I0309 13:01:48.969408 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.003281 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59291c5b-082b-467b-b87b-cf9af3e613b1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fbl76\" (UID: \"59291c5b-082b-467b-b87b-cf9af3e613b1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.010890 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m67cz\" (UniqueName: \"kubernetes.io/projected/56e5be14-f33f-4db0-a372-77b3fd4d9510-kube-api-access-m67cz\") pod \"cluster-samples-operator-665b6dd947-9gpjt\" (UID: \"56e5be14-f33f-4db0-a372-77b3fd4d9510\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.028387 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdk49\" (UniqueName: \"kubernetes.io/projected/e9a38022-ccc3-43cb-8af2-4252aef56bf8-kube-api-access-vdk49\") pod \"openshift-apiserver-operator-796bbdcf4f-95tgr\" (UID: \"e9a38022-ccc3-43cb-8af2-4252aef56bf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.042447 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dh6qm"]
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.049816 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2c7j\" (UniqueName: \"kubernetes.io/projected/3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb-kube-api-access-j2c7j\") pod \"olm-operator-6b444d44fb-qp2l4\" (UID: \"3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.071083 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bn88\" (UniqueName: \"kubernetes.io/projected/337d5692-12d3-4c0a-8187-eb66a2666e95-kube-api-access-5bn88\") pod \"catalog-operator-68c6474976-t25rf\" (UID: \"337d5692-12d3-4c0a-8187-eb66a2666e95\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.078243 4723 request.go:700] Waited for 1.868541186s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/serviceaccounts/openshift-apiserver-sa/token
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.084582 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.097670 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddmvl\" (UniqueName: \"kubernetes.io/projected/4f18e34a-2f8e-4450-bb3f-7b391bc03e06-kube-api-access-ddmvl\") pod \"apiserver-76f77b778f-xzd59\" (UID: \"4f18e34a-2f8e-4450-bb3f-7b391bc03e06\") " pod="openshift-apiserver/apiserver-76f77b778f-xzd59"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.110113 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w58j4\" (UniqueName: \"kubernetes.io/projected/d1a69d09-d3e2-4af9-857a-3229bc05c992-kube-api-access-w58j4\") pod \"machine-api-operator-5694c8668f-t6z8c\" (UID: \"d1a69d09-d3e2-4af9-857a-3229bc05c992\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.133125 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgb7w\" (UniqueName: \"kubernetes.io/projected/1f6455f2-bcad-4e11-8ef5-a272b406be88-kube-api-access-fgb7w\") pod \"console-operator-58897d9998-wkpzg\" (UID: \"1f6455f2-bcad-4e11-8ef5-a272b406be88\") " pod="openshift-console-operator/console-operator-58897d9998-wkpzg"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.152673 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjjls\" (UniqueName: \"kubernetes.io/projected/c6d61e80-043f-4ece-a6a6-eed6357749f5-kube-api-access-qjjls\") pod \"marketplace-operator-79b997595-p8pxb\" (UID: \"c6d61e80-043f-4ece-a6a6-eed6357749f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.170665 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.171128 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q582d\" (UniqueName: \"kubernetes.io/projected/2d4d2050-65bd-4d00-893c-c7f296d90926-kube-api-access-q582d\") pod \"cluster-image-registry-operator-dc59b4c8b-bjqw4\" (UID: \"2d4d2050-65bd-4d00-893c-c7f296d90926\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.191962 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lwpcl"]
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.204259 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2279\" (UniqueName: \"kubernetes.io/projected/c7333389-f183-4d12-b140-3f332e49eaec-kube-api-access-l2279\") pod \"dns-operator-744455d44c-c5h5p\" (UID: \"c7333389-f183-4d12-b140-3f332e49eaec\") " pod="openshift-dns-operator/dns-operator-744455d44c-c5h5p"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.210421 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d4d2050-65bd-4d00-893c-c7f296d90926-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bjqw4\" (UID: \"2d4d2050-65bd-4d00-893c-c7f296d90926\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4"
Mar 09 13:01:49 crc kubenswrapper[4723]: W0309 13:01:49.217333 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61312a96_b8f6_431c_b24e_0046271cf40f.slice/crio-4238f7dce485c475babc6b7cc0871de44b70a52c15eef7a64193f8825464b246 WatchSource:0}: Error finding container 4238f7dce485c475babc6b7cc0871de44b70a52c15eef7a64193f8825464b246: Status 404 returned error can't find the container with id 4238f7dce485c475babc6b7cc0871de44b70a52c15eef7a64193f8825464b246
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.230054 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8w6mb"]
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.234359 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgfd8\" (UniqueName: \"kubernetes.io/projected/e21fc837-8de2-4af5-a375-b14567f47d67-kube-api-access-dgfd8\") pod \"downloads-7954f5f757-b5c74\" (UID: \"e21fc837-8de2-4af5-a375-b14567f47d67\") " pod="openshift-console/downloads-7954f5f757-b5c74"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.247682 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4"
Mar 09 13:01:49 crc kubenswrapper[4723]: W0309 13:01:49.258931 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod151b2a1f_2df4_49d4_9e55_260eebbb267f.slice/crio-d5b04c76867ed0b2d8b5f6ddfa5f5ff000dd2c393587ddab4b498da90a124895 WatchSource:0}: Error finding container d5b04c76867ed0b2d8b5f6ddfa5f5ff000dd2c393587ddab4b498da90a124895: Status 404 returned error can't find the container with id d5b04c76867ed0b2d8b5f6ddfa5f5ff000dd2c393587ddab4b498da90a124895
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.259006 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.263463 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vsh2\" (UniqueName: \"kubernetes.io/projected/f30207aa-a4d2-41bb-8f36-8c6809d96191-kube-api-access-8vsh2\") pod \"machine-approver-56656f9798-5wxvl\" (UID: \"f30207aa-a4d2-41bb-8f36-8c6809d96191\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.263739 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-6gtjl"]
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.267802 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vb8n2"]
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.270680 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.274666 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.277338 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfclc\" (UniqueName: \"kubernetes.io/projected/9d680903-3aac-4f3d-8e55-5fb9ee1cb46a-kube-api-access-kfclc\") pod \"migrator-59844c95c7-r5ltl\" (UID: \"9d680903-3aac-4f3d-8e55-5fb9ee1cb46a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r5ltl"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.283363 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c5h5p"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.287363 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-b5c74"
Mar 09 13:01:49 crc kubenswrapper[4723]: W0309 13:01:49.290375 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bc33ee8_964b_4b03_b564_5c66068629b9.slice/crio-abdf4adce12e0920215d9fc04894f299b9d5f163b9e055b0af05cb53fe2115db WatchSource:0}: Error finding container abdf4adce12e0920215d9fc04894f299b9d5f163b9e055b0af05cb53fe2115db: Status 404 returned error can't find the container with id abdf4adce12e0920215d9fc04894f299b9d5f163b9e055b0af05cb53fe2115db
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.295278 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.296370 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.299046 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4"]
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.301050 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r5ltl"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.314169 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4"
Mar 09 13:01:49 crc kubenswrapper[4723]: W0309 13:01:49.315233 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f57f760_4139_4eb7_b7e9_5f0bcd7cb3eb.slice/crio-c3de5ff9535ddf8dbf3c05d6cd151a5bef0cb0c21cd5869d1b4b830db2e59487 WatchSource:0}: Error finding container c3de5ff9535ddf8dbf3c05d6cd151a5bef0cb0c21cd5869d1b4b830db2e59487: Status 404 returned error can't find the container with id c3de5ff9535ddf8dbf3c05d6cd151a5bef0cb0c21cd5869d1b4b830db2e59487
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.316491 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.340190 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.353380 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.357229 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.367672 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wkpzg"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.375900 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xzd59"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.384474 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95ca7c55-d937-4670-b10c-a8aaf4b77c84-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jh2mc\" (UID: \"95ca7c55-d937-4670-b10c-a8aaf4b77c84\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.394856 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76"]
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.413363 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gswvk\" (UniqueName: \"kubernetes.io/projected/e0e7aa68-dd0a-4803-93c1-d3d824033cad-kube-api-access-gswvk\") pod \"apiserver-7bbb656c7d-7zkls\" (UID: \"e0e7aa68-dd0a-4803-93c1-d3d824033cad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.420285 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v96j\" (UniqueName: \"kubernetes.io/projected/3321a715-9c5f-4417-bec1-4ba3ccce946c-kube-api-access-7v96j\") pod \"packageserver-d55dfcdfc-fhxc9\" (UID: \"3321a715-9c5f-4417-bec1-4ba3ccce946c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.425708 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.433252 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcbln\" (UniqueName: \"kubernetes.io/projected/0383bca0-800c-4f21-a7ee-32e42609b47e-kube-api-access-fcbln\") pod \"service-ca-9c57cc56f-4xwcm\" (UID: \"0383bca0-800c-4f21-a7ee-32e42609b47e\") " pod="openshift-service-ca/service-ca-9c57cc56f-4xwcm"
Mar 09 13:01:49 crc kubenswrapper[4723]: W0309 13:01:49.447011 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59291c5b_082b_467b_b87b_cf9af3e613b1.slice/crio-04fd656f0a5c57d7e04913ef36705e3f5de3f69fb123a3ba25652541829c18b0 WatchSource:0}: Error finding container 04fd656f0a5c57d7e04913ef36705e3f5de3f69fb123a3ba25652541829c18b0: Status 404 returned error can't find the container with id 04fd656f0a5c57d7e04913ef36705e3f5de3f69fb123a3ba25652541829c18b0
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.455658 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbstn\" (UniqueName: \"kubernetes.io/projected/09470b31-c2ae-42f8-8490-c446e979042d-kube-api-access-lbstn\") pod \"package-server-manager-789f6589d5-p9x2d\" (UID: \"09470b31-c2ae-42f8-8490-c446e979042d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.460030 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.492569 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.493076 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4xwcm"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.499586 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550529 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15fd52d-a005-4417-aaca-84839023e2b4-secret-volume\") pod \"collect-profiles-29551020-rlng8\" (UID: \"c15fd52d-a005-4417-aaca-84839023e2b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550574 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77a70858-7982-4c72-9dad-3fb8a8547361-serving-cert\") pod \"service-ca-operator-777779d784-j9lvd\" (UID: \"77a70858-7982-4c72-9dad-3fb8a8547361\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550610 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bgjp\" (UniqueName: \"kubernetes.io/projected/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-kube-api-access-5bgjp\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550628 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nkl5\" (UniqueName: \"kubernetes.io/projected/2c3f87db-54f2-4c2e-98ca-1718ed598a7d-kube-api-access-4nkl5\") pod \"openshift-controller-manager-operator-756b6f6bc6-74fx9\" (UID: \"2c3f87db-54f2-4c2e-98ca-1718ed598a7d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550681 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd476\" (UniqueName: \"kubernetes.io/projected/330d0ce9-3cc0-427c-acf4-7c14f36add18-kube-api-access-gd476\") pod \"control-plane-machine-set-operator-78cbb6b69f-kdxpm\" (UID: \"330d0ce9-3cc0-427c-acf4-7c14f36add18\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kdxpm"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550698 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4e4bf80a-dd91-49ad-a418-2edfce2d0fac-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gk4bd\" (UID: \"4e4bf80a-dd91-49ad-a418-2edfce2d0fac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd"
Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550712 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc891b08-d815-4b09-94bb-bd0dc6dc01f4-config\") pod
\"kube-controller-manager-operator-78b949d7b-vpt9b\" (UID: \"dc891b08-d815-4b09-94bb-bd0dc6dc01f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550725 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15fd52d-a005-4417-aaca-84839023e2b4-config-volume\") pod \"collect-profiles-29551020-rlng8\" (UID: \"c15fd52d-a005-4417-aaca-84839023e2b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550740 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eaf6376b-8a99-41a9-bbd7-c93567fe1f24-trusted-ca\") pod \"ingress-operator-5b745b69d9-pcp4r\" (UID: \"eaf6376b-8a99-41a9-bbd7-c93567fe1f24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550762 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77a70858-7982-4c72-9dad-3fb8a8547361-config\") pod \"service-ca-operator-777779d784-j9lvd\" (UID: \"77a70858-7982-4c72-9dad-3fb8a8547361\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550819 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-service-ca-bundle\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550834 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eaf6376b-8a99-41a9-bbd7-c93567fe1f24-metrics-tls\") pod \"ingress-operator-5b745b69d9-pcp4r\" (UID: \"eaf6376b-8a99-41a9-bbd7-c93567fe1f24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550859 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e4bf80a-dd91-49ad-a418-2edfce2d0fac-proxy-tls\") pod \"machine-config-controller-84d6567774-gk4bd\" (UID: \"4e4bf80a-dd91-49ad-a418-2edfce2d0fac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550929 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-58jld\" (UID: \"32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550962 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/dc891b08-d815-4b09-94bb-bd0dc6dc01f4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vpt9b\" (UID: \"dc891b08-d815-4b09-94bb-bd0dc6dc01f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.550995 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtx2b\" (UniqueName: \"kubernetes.io/projected/8a793f97-7e99-491f-a21f-e501491f98d0-kube-api-access-mtx2b\") pod \"machine-config-operator-74547568cd-j6zr2\" (UID: \"8a793f97-7e99-491f-a21f-e501491f98d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551011 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e50ffdf-4ab9-4237-b7c9-e9e641711d6c-node-bootstrap-token\") pod \"machine-config-server-gm7hg\" (UID: \"8e50ffdf-4ab9-4237-b7c9-e9e641711d6c\") " pod="openshift-machine-config-operator/machine-config-server-gm7hg" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551025 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-58jld\" (UID: \"32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551079 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551120 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2001358b-ab58-4093-82cb-465bb04941c4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-n5lnt\" (UID: \"2001358b-ab58-4093-82cb-465bb04941c4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n5lnt" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551135 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c3f87db-54f2-4c2e-98ca-1718ed598a7d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-74fx9\" (UID: \"2c3f87db-54f2-4c2e-98ca-1718ed598a7d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551167 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh4m7\" (UniqueName: \"kubernetes.io/projected/8e50ffdf-4ab9-4237-b7c9-e9e641711d6c-kube-api-access-qh4m7\") pod \"machine-config-server-gm7hg\" (UID: \"8e50ffdf-4ab9-4237-b7c9-e9e641711d6c\") " pod="openshift-machine-config-operator/machine-config-server-gm7hg" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 
13:01:49.551192 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-registry-tls\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551208 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-bound-sa-token\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551221 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a793f97-7e99-491f-a21f-e501491f98d0-proxy-tls\") pod \"machine-config-operator-74547568cd-j6zr2\" (UID: \"8a793f97-7e99-491f-a21f-e501491f98d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551236 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbmd8\" (UniqueName: \"kubernetes.io/projected/ea205a39-cbd1-4704-8e93-0b1747a88e8a-kube-api-access-qbmd8\") pod \"route-controller-manager-6576b87f9c-6qpzn\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551250 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-trusted-ca\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551264 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-stats-auth\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551293 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-registry-certificates\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551308 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea205a39-cbd1-4704-8e93-0b1747a88e8a-client-ca\") pod \"route-controller-manager-6576b87f9c-6qpzn\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551388 4723 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-installation-pull-secrets\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551403 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eaf6376b-8a99-41a9-bbd7-c93567fe1f24-bound-sa-token\") pod \"ingress-operator-5b745b69d9-pcp4r\" (UID: \"eaf6376b-8a99-41a9-bbd7-c93567fe1f24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551426 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqf7n\" (UniqueName: \"kubernetes.io/projected/77a70858-7982-4c72-9dad-3fb8a8547361-kube-api-access-cqf7n\") pod \"service-ca-operator-777779d784-j9lvd\" (UID: \"77a70858-7982-4c72-9dad-3fb8a8547361\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551441 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8a793f97-7e99-491f-a21f-e501491f98d0-images\") pod \"machine-config-operator-74547568cd-j6zr2\" (UID: \"8a793f97-7e99-491f-a21f-e501491f98d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551457 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea205a39-cbd1-4704-8e93-0b1747a88e8a-serving-cert\") pod \"route-controller-manager-6576b87f9c-6qpzn\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551471 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w57wp\" (UniqueName: \"kubernetes.io/projected/32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f-kube-api-access-w57wp\") pod \"kube-storage-version-migrator-operator-b67b599dd-58jld\" (UID: \"32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551496 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-default-certificate\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551511 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn7wq\" (UniqueName: \"kubernetes.io/projected/eaf6376b-8a99-41a9-bbd7-c93567fe1f24-kube-api-access-kn7wq\") pod \"ingress-operator-5b745b69d9-pcp4r\" (UID: \"eaf6376b-8a99-41a9-bbd7-c93567fe1f24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" Mar 09 13:01:49 crc 
kubenswrapper[4723]: I0309 13:01:49.551543 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8a793f97-7e99-491f-a21f-e501491f98d0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-j6zr2\" (UID: \"8a793f97-7e99-491f-a21f-e501491f98d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551558 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8e50ffdf-4ab9-4237-b7c9-e9e641711d6c-certs\") pod \"machine-config-server-gm7hg\" (UID: \"8e50ffdf-4ab9-4237-b7c9-e9e641711d6c\") " pod="openshift-machine-config-operator/machine-config-server-gm7hg" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551580 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4fhz\" (UniqueName: \"kubernetes.io/projected/2001358b-ab58-4093-82cb-465bb04941c4-kube-api-access-v4fhz\") pod \"multus-admission-controller-857f4d67dd-n5lnt\" (UID: \"2001358b-ab58-4093-82cb-465bb04941c4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n5lnt" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551596 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-ca-trust-extracted\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551612 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/330d0ce9-3cc0-427c-acf4-7c14f36add18-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kdxpm\" (UID: \"330d0ce9-3cc0-427c-acf4-7c14f36add18\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kdxpm" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551637 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc891b08-d815-4b09-94bb-bd0dc6dc01f4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vpt9b\" (UID: \"dc891b08-d815-4b09-94bb-bd0dc6dc01f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551669 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgg2m\" (UniqueName: \"kubernetes.io/projected/c15fd52d-a005-4417-aaca-84839023e2b4-kube-api-access-xgg2m\") pod \"collect-profiles-29551020-rlng8\" (UID: \"c15fd52d-a005-4417-aaca-84839023e2b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551716 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea205a39-cbd1-4704-8e93-0b1747a88e8a-config\") pod \"route-controller-manager-6576b87f9c-6qpzn\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551734 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd7gm\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-kube-api-access-dd7gm\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551753 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3f87db-54f2-4c2e-98ca-1718ed598a7d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-74fx9\" (UID: \"2c3f87db-54f2-4c2e-98ca-1718ed598a7d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551789 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-metrics-certs\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.551851 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdxnv\" (UniqueName: \"kubernetes.io/projected/4e4bf80a-dd91-49ad-a418-2edfce2d0fac-kube-api-access-hdxnv\") pod \"machine-config-controller-84d6567774-gk4bd\" (UID: \"4e4bf80a-dd91-49ad-a418-2edfce2d0fac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd" Mar 09 13:01:49 crc kubenswrapper[4723]: E0309 13:01:49.555239 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:50.055219629 +0000 UTC m=+184.069687169 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.561271 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.647273 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c5h5p"] Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.653716 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:49 crc kubenswrapper[4723]: E0309 13:01:49.653903 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:50.153843508 +0000 UTC m=+184.168311048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.654010 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd476\" (UniqueName: \"kubernetes.io/projected/330d0ce9-3cc0-427c-acf4-7c14f36add18-kube-api-access-gd476\") pod \"control-plane-machine-set-operator-78cbb6b69f-kdxpm\" (UID: \"330d0ce9-3cc0-427c-acf4-7c14f36add18\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kdxpm" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.654083 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4e4bf80a-dd91-49ad-a418-2edfce2d0fac-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gk4bd\" (UID: \"4e4bf80a-dd91-49ad-a418-2edfce2d0fac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.654129 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-registration-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.654166 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc891b08-d815-4b09-94bb-bd0dc6dc01f4-config\") pod \"kube-controller-manager-operator-78b949d7b-vpt9b\" (UID: \"dc891b08-d815-4b09-94bb-bd0dc6dc01f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.654208 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/c15fd52d-a005-4417-aaca-84839023e2b4-config-volume\") pod \"collect-profiles-29551020-rlng8\" (UID: \"c15fd52d-a005-4417-aaca-84839023e2b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.654230 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eaf6376b-8a99-41a9-bbd7-c93567fe1f24-trusted-ca\") pod \"ingress-operator-5b745b69d9-pcp4r\" (UID: \"eaf6376b-8a99-41a9-bbd7-c93567fe1f24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.654884 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77a70858-7982-4c72-9dad-3fb8a8547361-config\") pod \"service-ca-operator-777779d784-j9lvd\" (UID: \"77a70858-7982-4c72-9dad-3fb8a8547361\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.654972 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bc1b3df-48a8-46f0-a01e-2596af170d85-config-volume\") pod \"dns-default-4hnlr\" (UID: \"2bc1b3df-48a8-46f0-a01e-2596af170d85\") " pod="openshift-dns/dns-default-4hnlr" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.655022 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-service-ca-bundle\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.655673 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15fd52d-a005-4417-aaca-84839023e2b4-config-volume\") pod \"collect-profiles-29551020-rlng8\" (UID: \"c15fd52d-a005-4417-aaca-84839023e2b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.656304 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eaf6376b-8a99-41a9-bbd7-c93567fe1f24-trusted-ca\") pod \"ingress-operator-5b745b69d9-pcp4r\" (UID: \"eaf6376b-8a99-41a9-bbd7-c93567fe1f24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.656741 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77a70858-7982-4c72-9dad-3fb8a8547361-config\") pod \"service-ca-operator-777779d784-j9lvd\" (UID: \"77a70858-7982-4c72-9dad-3fb8a8547361\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.657152 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc891b08-d815-4b09-94bb-bd0dc6dc01f4-config\") pod \"kube-controller-manager-operator-78b949d7b-vpt9b\" (UID: \"dc891b08-d815-4b09-94bb-bd0dc6dc01f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b" Mar 09 13:01:49 crc kubenswrapper[4723]: 
I0309 13:01:49.655126 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eaf6376b-8a99-41a9-bbd7-c93567fe1f24-metrics-tls\") pod \"ingress-operator-5b745b69d9-pcp4r\" (UID: \"eaf6376b-8a99-41a9-bbd7-c93567fe1f24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.657684 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-plugins-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.657706 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-service-ca-bundle\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.657714 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e4bf80a-dd91-49ad-a418-2edfce2d0fac-proxy-tls\") pod \"machine-config-controller-84d6567774-gk4bd\" (UID: \"4e4bf80a-dd91-49ad-a418-2edfce2d0fac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.657770 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhgsm\" (UniqueName: \"kubernetes.io/projected/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-kube-api-access-rhgsm\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.657889 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-58jld\" (UID: \"32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.657921 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-socket-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.657979 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc891b08-d815-4b09-94bb-bd0dc6dc01f4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vpt9b\" (UID: \"dc891b08-d815-4b09-94bb-bd0dc6dc01f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658007 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf8rs\" 
(UniqueName: \"kubernetes.io/projected/2bc1b3df-48a8-46f0-a01e-2596af170d85-kube-api-access-sf8rs\") pod \"dns-default-4hnlr\" (UID: \"2bc1b3df-48a8-46f0-a01e-2596af170d85\") " pod="openshift-dns/dns-default-4hnlr" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658053 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtx2b\" (UniqueName: \"kubernetes.io/projected/8a793f97-7e99-491f-a21f-e501491f98d0-kube-api-access-mtx2b\") pod \"machine-config-operator-74547568cd-j6zr2\" (UID: \"8a793f97-7e99-491f-a21f-e501491f98d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658079 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e50ffdf-4ab9-4237-b7c9-e9e641711d6c-node-bootstrap-token\") pod \"machine-config-server-gm7hg\" (UID: \"8e50ffdf-4ab9-4237-b7c9-e9e641711d6c\") " pod="openshift-machine-config-operator/machine-config-server-gm7hg" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658114 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-58jld\" (UID: \"32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658179 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658267 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2001358b-ab58-4093-82cb-465bb04941c4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-n5lnt\" (UID: \"2001358b-ab58-4093-82cb-465bb04941c4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n5lnt" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658298 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c3f87db-54f2-4c2e-98ca-1718ed598a7d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-74fx9\" (UID: \"2c3f87db-54f2-4c2e-98ca-1718ed598a7d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658336 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh4m7\" (UniqueName: \"kubernetes.io/projected/8e50ffdf-4ab9-4237-b7c9-e9e641711d6c-kube-api-access-qh4m7\") pod \"machine-config-server-gm7hg\" (UID: \"8e50ffdf-4ab9-4237-b7c9-e9e641711d6c\") " pod="openshift-machine-config-operator/machine-config-server-gm7hg" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658399 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e6ad82e9-b2c3-4265-8738-8b944724faa7-cert\") pod \"ingress-canary-vmcrt\" (UID: \"e6ad82e9-b2c3-4265-8738-8b944724faa7\") " pod="openshift-ingress-canary/ingress-canary-vmcrt" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658429 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-registry-tls\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658465 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-bound-sa-token\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658493 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a793f97-7e99-491f-a21f-e501491f98d0-proxy-tls\") pod \"machine-config-operator-74547568cd-j6zr2\" (UID: \"8a793f97-7e99-491f-a21f-e501491f98d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658518 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbmd8\" (UniqueName: \"kubernetes.io/projected/ea205a39-cbd1-4704-8e93-0b1747a88e8a-kube-api-access-qbmd8\") pod \"route-controller-manager-6576b87f9c-6qpzn\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658550 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-trusted-ca\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658573 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-stats-auth\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658694 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-registry-certificates\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658722 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea205a39-cbd1-4704-8e93-0b1747a88e8a-client-ca\") pod \"route-controller-manager-6576b87f9c-6qpzn\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 
13:01:49 crc kubenswrapper[4723]: E0309 13:01:49.658748 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:50.158731502 +0000 UTC m=+184.173199042 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658823 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-mountpoint-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658890 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-installation-pull-secrets\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658921 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eaf6376b-8a99-41a9-bbd7-c93567fe1f24-bound-sa-token\") pod \"ingress-operator-5b745b69d9-pcp4r\" (UID: \"eaf6376b-8a99-41a9-bbd7-c93567fe1f24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.658950 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqf7n\" (UniqueName: \"kubernetes.io/projected/77a70858-7982-4c72-9dad-3fb8a8547361-kube-api-access-cqf7n\") pod \"service-ca-operator-777779d784-j9lvd\" (UID: \"77a70858-7982-4c72-9dad-3fb8a8547361\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659167 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8a793f97-7e99-491f-a21f-e501491f98d0-images\") pod \"machine-config-operator-74547568cd-j6zr2\" (UID: \"8a793f97-7e99-491f-a21f-e501491f98d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659195 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-csi-data-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659237 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ea205a39-cbd1-4704-8e93-0b1747a88e8a-serving-cert\") pod \"route-controller-manager-6576b87f9c-6qpzn\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659261 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w57wp\" (UniqueName: \"kubernetes.io/projected/32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f-kube-api-access-w57wp\") pod \"kube-storage-version-migrator-operator-b67b599dd-58jld\" (UID: \"32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659380 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4e4bf80a-dd91-49ad-a418-2edfce2d0fac-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gk4bd\" (UID: \"4e4bf80a-dd91-49ad-a418-2edfce2d0fac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659456 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-default-certificate\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659486 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn7wq\" (UniqueName: \"kubernetes.io/projected/eaf6376b-8a99-41a9-bbd7-c93567fe1f24-kube-api-access-kn7wq\") pod \"ingress-operator-5b745b69d9-pcp4r\" (UID: \"eaf6376b-8a99-41a9-bbd7-c93567fe1f24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659517 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8a793f97-7e99-491f-a21f-e501491f98d0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-j6zr2\" (UID: \"8a793f97-7e99-491f-a21f-e501491f98d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659538 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8e50ffdf-4ab9-4237-b7c9-e9e641711d6c-certs\") pod \"machine-config-server-gm7hg\" (UID: \"8e50ffdf-4ab9-4237-b7c9-e9e641711d6c\") " pod="openshift-machine-config-operator/machine-config-server-gm7hg" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659562 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4fhz\" (UniqueName: \"kubernetes.io/projected/2001358b-ab58-4093-82cb-465bb04941c4-kube-api-access-v4fhz\") pod \"multus-admission-controller-857f4d67dd-n5lnt\" (UID: \"2001358b-ab58-4093-82cb-465bb04941c4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n5lnt" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659606 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-ca-trust-extracted\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659632 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/330d0ce9-3cc0-427c-acf4-7c14f36add18-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kdxpm\" (UID: \"330d0ce9-3cc0-427c-acf4-7c14f36add18\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kdxpm" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659678 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc891b08-d815-4b09-94bb-bd0dc6dc01f4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vpt9b\" (UID: \"dc891b08-d815-4b09-94bb-bd0dc6dc01f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659720 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgg2m\" (UniqueName: \"kubernetes.io/projected/c15fd52d-a005-4417-aaca-84839023e2b4-kube-api-access-xgg2m\") pod \"collect-profiles-29551020-rlng8\" (UID: \"c15fd52d-a005-4417-aaca-84839023e2b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659745 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea205a39-cbd1-4704-8e93-0b1747a88e8a-config\") pod \"route-controller-manager-6576b87f9c-6qpzn\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659800 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd7gm\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-kube-api-access-dd7gm\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659825 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3f87db-54f2-4c2e-98ca-1718ed598a7d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-74fx9\" (UID: \"2c3f87db-54f2-4c2e-98ca-1718ed598a7d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659850 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea205a39-cbd1-4704-8e93-0b1747a88e8a-client-ca\") pod \"route-controller-manager-6576b87f9c-6qpzn\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659853 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-metrics-certs\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.659983 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdxnv\" (UniqueName: \"kubernetes.io/projected/4e4bf80a-dd91-49ad-a418-2edfce2d0fac-kube-api-access-hdxnv\") pod \"machine-config-controller-84d6567774-gk4bd\" (UID: \"4e4bf80a-dd91-49ad-a418-2edfce2d0fac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.660194 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txxzp\" (UniqueName: \"kubernetes.io/projected/e6ad82e9-b2c3-4265-8738-8b944724faa7-kube-api-access-txxzp\") pod \"ingress-canary-vmcrt\" (UID: \"e6ad82e9-b2c3-4265-8738-8b944724faa7\") " pod="openshift-ingress-canary/ingress-canary-vmcrt" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.660221 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15fd52d-a005-4417-aaca-84839023e2b4-secret-volume\") pod \"collect-profiles-29551020-rlng8\" (UID: \"c15fd52d-a005-4417-aaca-84839023e2b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.660245 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77a70858-7982-4c72-9dad-3fb8a8547361-serving-cert\") pod \"service-ca-operator-777779d784-j9lvd\" (UID: \"77a70858-7982-4c72-9dad-3fb8a8547361\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.660268 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bc1b3df-48a8-46f0-a01e-2596af170d85-metrics-tls\") pod \"dns-default-4hnlr\" (UID: \"2bc1b3df-48a8-46f0-a01e-2596af170d85\") " pod="openshift-dns/dns-default-4hnlr" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.660345 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bgjp\" (UniqueName: \"kubernetes.io/projected/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-kube-api-access-5bgjp\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.660373 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nkl5\" (UniqueName: \"kubernetes.io/projected/2c3f87db-54f2-4c2e-98ca-1718ed598a7d-kube-api-access-4nkl5\") pod \"openshift-controller-manager-operator-756b6f6bc6-74fx9\" (UID: \"2c3f87db-54f2-4c2e-98ca-1718ed598a7d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.661201 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c3f87db-54f2-4c2e-98ca-1718ed598a7d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-74fx9\" (UID: 
\"2c3f87db-54f2-4c2e-98ca-1718ed598a7d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.663935 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eaf6376b-8a99-41a9-bbd7-c93567fe1f24-metrics-tls\") pod \"ingress-operator-5b745b69d9-pcp4r\" (UID: \"eaf6376b-8a99-41a9-bbd7-c93567fe1f24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.664014 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-58jld\" (UID: \"32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.664434 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e50ffdf-4ab9-4237-b7c9-e9e641711d6c-node-bootstrap-token\") pod \"machine-config-server-gm7hg\" (UID: \"8e50ffdf-4ab9-4237-b7c9-e9e641711d6c\") " pod="openshift-machine-config-operator/machine-config-server-gm7hg" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.664666 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-registry-certificates\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.665005 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8a793f97-7e99-491f-a21f-e501491f98d0-images\") pod \"machine-config-operator-74547568cd-j6zr2\" (UID: \"8a793f97-7e99-491f-a21f-e501491f98d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.665421 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8a793f97-7e99-491f-a21f-e501491f98d0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-j6zr2\" (UID: \"8a793f97-7e99-491f-a21f-e501491f98d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.666512 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea205a39-cbd1-4704-8e93-0b1747a88e8a-config\") pod \"route-controller-manager-6576b87f9c-6qpzn\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.667828 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-installation-pull-secrets\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: 
I0309 13:01:49.674390 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e4bf80a-dd91-49ad-a418-2edfce2d0fac-proxy-tls\") pod \"machine-config-controller-84d6567774-gk4bd\" (UID: \"4e4bf80a-dd91-49ad-a418-2edfce2d0fac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.674918 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-58jld\" (UID: \"32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.675505 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-registry-tls\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.677560 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2001358b-ab58-4093-82cb-465bb04941c4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-n5lnt\" (UID: \"2001358b-ab58-4093-82cb-465bb04941c4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n5lnt" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.678076 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a793f97-7e99-491f-a21f-e501491f98d0-proxy-tls\") pod \"machine-config-operator-74547568cd-j6zr2\" (UID: \"8a793f97-7e99-491f-a21f-e501491f98d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.678411 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-ca-trust-extracted\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.680795 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-stats-auth\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.681676 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-trusted-ca\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.681992 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3f87db-54f2-4c2e-98ca-1718ed598a7d-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-74fx9\" (UID: \"2c3f87db-54f2-4c2e-98ca-1718ed598a7d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.682148 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea205a39-cbd1-4704-8e93-0b1747a88e8a-serving-cert\") pod \"route-controller-manager-6576b87f9c-6qpzn\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.682951 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15fd52d-a005-4417-aaca-84839023e2b4-secret-volume\") pod \"collect-profiles-29551020-rlng8\" (UID: \"c15fd52d-a005-4417-aaca-84839023e2b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.683093 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77a70858-7982-4c72-9dad-3fb8a8547361-serving-cert\") pod \"service-ca-operator-777779d784-j9lvd\" (UID: \"77a70858-7982-4c72-9dad-3fb8a8547361\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.683153 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8e50ffdf-4ab9-4237-b7c9-e9e641711d6c-certs\") pod \"machine-config-server-gm7hg\" (UID: \"8e50ffdf-4ab9-4237-b7c9-e9e641711d6c\") " pod="openshift-machine-config-operator/machine-config-server-gm7hg" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.683808 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-default-certificate\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.683930 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/330d0ce9-3cc0-427c-acf4-7c14f36add18-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kdxpm\" (UID: \"330d0ce9-3cc0-427c-acf4-7c14f36add18\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kdxpm" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.684245 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-metrics-certs\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.685831 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc891b08-d815-4b09-94bb-bd0dc6dc01f4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vpt9b\" (UID: \"dc891b08-d815-4b09-94bb-bd0dc6dc01f4\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.700285 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd476\" (UniqueName: \"kubernetes.io/projected/330d0ce9-3cc0-427c-acf4-7c14f36add18-kube-api-access-gd476\") pod \"control-plane-machine-set-operator-78cbb6b69f-kdxpm\" (UID: \"330d0ce9-3cc0-427c-acf4-7c14f36add18\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kdxpm" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.718344 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kdxpm" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.728320 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc891b08-d815-4b09-94bb-bd0dc6dc01f4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vpt9b\" (UID: \"dc891b08-d815-4b09-94bb-bd0dc6dc01f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.746201 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtx2b\" (UniqueName: \"kubernetes.io/projected/8a793f97-7e99-491f-a21f-e501491f98d0-kube-api-access-mtx2b\") pod \"machine-config-operator-74547568cd-j6zr2\" (UID: \"8a793f97-7e99-491f-a21f-e501491f98d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.760022 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr"] Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.761171 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4"] Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.761404 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.761763 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-registration-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.761803 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bc1b3df-48a8-46f0-a01e-2596af170d85-config-volume\") pod \"dns-default-4hnlr\" (UID: \"2bc1b3df-48a8-46f0-a01e-2596af170d85\") " pod="openshift-dns/dns-default-4hnlr" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.761822 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-plugins-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: 
\"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.761838 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhgsm\" (UniqueName: \"kubernetes.io/projected/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-kube-api-access-rhgsm\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.761922 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-socket-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.761944 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf8rs\" (UniqueName: \"kubernetes.io/projected/2bc1b3df-48a8-46f0-a01e-2596af170d85-kube-api-access-sf8rs\") pod \"dns-default-4hnlr\" (UID: \"2bc1b3df-48a8-46f0-a01e-2596af170d85\") " pod="openshift-dns/dns-default-4hnlr" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.761981 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6ad82e9-b2c3-4265-8738-8b944724faa7-cert\") pod \"ingress-canary-vmcrt\" (UID: \"e6ad82e9-b2c3-4265-8738-8b944724faa7\") " pod="openshift-ingress-canary/ingress-canary-vmcrt" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.762020 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-mountpoint-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.762049 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-csi-data-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.762117 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txxzp\" (UniqueName: \"kubernetes.io/projected/e6ad82e9-b2c3-4265-8738-8b944724faa7-kube-api-access-txxzp\") pod \"ingress-canary-vmcrt\" (UID: \"e6ad82e9-b2c3-4265-8738-8b944724faa7\") " pod="openshift-ingress-canary/ingress-canary-vmcrt" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.762132 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bc1b3df-48a8-46f0-a01e-2596af170d85-metrics-tls\") pod \"dns-default-4hnlr\" (UID: \"2bc1b3df-48a8-46f0-a01e-2596af170d85\") " pod="openshift-dns/dns-default-4hnlr" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.763150 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-socket-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " 
pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: E0309 13:01:49.763268 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:50.263245611 +0000 UTC m=+184.277713151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.763309 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-registration-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.764118 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bc1b3df-48a8-46f0-a01e-2596af170d85-config-volume\") pod \"dns-default-4hnlr\" (UID: \"2bc1b3df-48a8-46f0-a01e-2596af170d85\") " pod="openshift-dns/dns-default-4hnlr" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.764191 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-plugins-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.764326 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-mountpoint-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.764717 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-csi-data-dir\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.766229 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh4m7\" (UniqueName: \"kubernetes.io/projected/8e50ffdf-4ab9-4237-b7c9-e9e641711d6c-kube-api-access-qh4m7\") pod \"machine-config-server-gm7hg\" (UID: \"8e50ffdf-4ab9-4237-b7c9-e9e641711d6c\") " pod="openshift-machine-config-operator/machine-config-server-gm7hg" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.770218 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bc1b3df-48a8-46f0-a01e-2596af170d85-metrics-tls\") pod \"dns-default-4hnlr\" (UID: \"2bc1b3df-48a8-46f0-a01e-2596af170d85\") " 
pod="openshift-dns/dns-default-4hnlr" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.772360 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6ad82e9-b2c3-4265-8738-8b944724faa7-cert\") pod \"ingress-canary-vmcrt\" (UID: \"e6ad82e9-b2c3-4265-8738-8b944724faa7\") " pod="openshift-ingress-canary/ingress-canary-vmcrt" Mar 09 13:01:49 crc kubenswrapper[4723]: W0309 13:01:49.776735 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7333389_f183_4d12_b140_3f332e49eaec.slice/crio-217e9ea5c84b0a12f633f23ee5b6f6626e36868f12883b348f41db97661343a6 WatchSource:0}: Error finding container 217e9ea5c84b0a12f633f23ee5b6f6626e36868f12883b348f41db97661343a6: Status 404 returned error can't find the container with id 217e9ea5c84b0a12f633f23ee5b6f6626e36868f12883b348f41db97661343a6 Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.781813 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-bound-sa-token\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.783407 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6gtjl" event={"ID":"6775c6a2-49ba-48fb-9f8f-ff26a7155618","Type":"ContainerStarted","Data":"3528b54bcf3c325faeb80344f900bd80b025b73afeaf6efeef5c37ad41e9beda"} Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.783452 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6gtjl" event={"ID":"6775c6a2-49ba-48fb-9f8f-ff26a7155618","Type":"ContainerStarted","Data":"7627da4ef3ad5637aec325793c6d20eb62f3261ec552f8ae4c7afb1c042e35e9"} Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.793493 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76" event={"ID":"59291c5b-082b-467b-b87b-cf9af3e613b1","Type":"ContainerStarted","Data":"04fd656f0a5c57d7e04913ef36705e3f5de3f69fb123a3ba25652541829c18b0"} Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.799108 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" event={"ID":"9ae03d73-b21d-4004-a000-e49a547ef19d","Type":"ContainerStarted","Data":"97edd0b231105266aa5e79d53a279aecad1c294047940549fef18653ebec0290"} Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.799150 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" event={"ID":"9ae03d73-b21d-4004-a000-e49a547ef19d","Type":"ContainerStarted","Data":"58ac658811e294849a47a07873fb96a56ed1b8c33a137ed49fff1e2c9981cdb2"} Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.799959 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.801521 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgg2m\" (UniqueName: \"kubernetes.io/projected/c15fd52d-a005-4417-aaca-84839023e2b4-kube-api-access-xgg2m\") pod \"collect-profiles-29551020-rlng8\" (UID: \"c15fd52d-a005-4417-aaca-84839023e2b4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.801519 4723 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-dh6qm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.801615 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" podUID="9ae03d73-b21d-4004-a000-e49a547ef19d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.810454 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" event={"ID":"3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb","Type":"ContainerStarted","Data":"a1a12d32405f542b8ac3daccfd27027f2c1ca9601a973a16292bcaf4e0ecd65d"} Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.810499 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" event={"ID":"3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb","Type":"ContainerStarted","Data":"c3de5ff9535ddf8dbf3c05d6cd151a5bef0cb0c21cd5869d1b4b830db2e59487"} Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.811164 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.813843 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eaf6376b-8a99-41a9-bbd7-c93567fe1f24-bound-sa-token\") pod \"ingress-operator-5b745b69d9-pcp4r\" (UID: \"eaf6376b-8a99-41a9-bbd7-c93567fe1f24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.814932 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" event={"ID":"9bc33ee8-964b-4b03-b564-5c66068629b9","Type":"ContainerStarted","Data":"a48c65a1c3f512880d9dc44b2eb45307e3d945306c8a613c05ca483aa288e4a8"} Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.814966 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" event={"ID":"9bc33ee8-964b-4b03-b564-5c66068629b9","Type":"ContainerStarted","Data":"abdf4adce12e0920215d9fc04894f299b9d5f163b9e055b0af05cb53fe2115db"} Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.815194 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.816579 4723 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qp2l4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.816614 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" 
podUID="3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.816903 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl" event={"ID":"f30207aa-a4d2-41bb-8f36-8c6809d96191","Type":"ContainerStarted","Data":"9b4d8058a3aa77b7483aad09e9e7d94aad334b59b449a6a9c64c09b221a873ae"} Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.817821 4723 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-vb8n2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.817846 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" podUID="9bc33ee8-964b-4b03-b564-5c66068629b9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.821691 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" event={"ID":"61312a96-b8f6-431c-b24e-0046271cf40f","Type":"ContainerStarted","Data":"44b4eec54c1a41912c42ad25c6f2da3afbee7664d18e1c63cf74a5d2e99e28aa"} Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.822111 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" event={"ID":"61312a96-b8f6-431c-b24e-0046271cf40f","Type":"ContainerStarted","Data":"4238f7dce485c475babc6b7cc0871de44b70a52c15eef7a64193f8825464b246"} Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.824772 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" event={"ID":"151b2a1f-2df4-49d4-9e55-260eebbb267f","Type":"ContainerStarted","Data":"580b982cb74e30c2cb864ede8ea098b01cba71c4875b7b0e9a4d28ffb1f6742e"} Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.824817 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" event={"ID":"151b2a1f-2df4-49d4-9e55-260eebbb267f","Type":"ContainerStarted","Data":"d5b04c76867ed0b2d8b5f6ddfa5f5ff000dd2c393587ddab4b498da90a124895"} Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.829499 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.831267 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqf7n\" (UniqueName: \"kubernetes.io/projected/77a70858-7982-4c72-9dad-3fb8a8547361-kube-api-access-cqf7n\") pod \"service-ca-operator-777779d784-j9lvd\" (UID: \"77a70858-7982-4c72-9dad-3fb8a8547361\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.848489 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nkl5\" (UniqueName: \"kubernetes.io/projected/2c3f87db-54f2-4c2e-98ca-1718ed598a7d-kube-api-access-4nkl5\") pod \"openshift-controller-manager-operator-756b6f6bc6-74fx9\" (UID: \"2c3f87db-54f2-4c2e-98ca-1718ed598a7d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.850228 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-gm7hg" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.865844 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: E0309 13:01:49.866305 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:50.366289372 +0000 UTC m=+184.380756912 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.872005 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bgjp\" (UniqueName: \"kubernetes.io/projected/f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1-kube-api-access-5bgjp\") pod \"router-default-5444994796-fzrk5\" (UID: \"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1\") " pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.903386 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt"] Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.908576 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbmd8\" (UniqueName: \"kubernetes.io/projected/ea205a39-cbd1-4704-8e93-0b1747a88e8a-kube-api-access-qbmd8\") pod \"route-controller-manager-6576b87f9c-6qpzn\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.956906 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdxnv\" (UniqueName: \"kubernetes.io/projected/4e4bf80a-dd91-49ad-a418-2edfce2d0fac-kube-api-access-hdxnv\") pod \"machine-config-controller-84d6567774-gk4bd\" (UID: \"4e4bf80a-dd91-49ad-a418-2edfce2d0fac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.959064 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn7wq\" (UniqueName: \"kubernetes.io/projected/eaf6376b-8a99-41a9-bbd7-c93567fe1f24-kube-api-access-kn7wq\") pod \"ingress-operator-5b745b69d9-pcp4r\" (UID: \"eaf6376b-8a99-41a9-bbd7-c93567fe1f24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.963330 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-b5c74"] Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.966820 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.969627 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w57wp\" (UniqueName: \"kubernetes.io/projected/32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f-kube-api-access-w57wp\") pod \"kube-storage-version-migrator-operator-b67b599dd-58jld\" (UID: \"32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld" Mar 09 13:01:49 crc kubenswrapper[4723]: E0309 13:01:49.970353 4723 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:50.470330678 +0000 UTC m=+184.484798228 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.979806 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd7gm\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-kube-api-access-dd7gm\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.993356 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf"] Mar 09 13:01:49 crc kubenswrapper[4723]: I0309 13:01:49.994438 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.008661 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.017347 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4fhz\" (UniqueName: \"kubernetes.io/projected/2001358b-ab58-4093-82cb-465bb04941c4-kube-api-access-v4fhz\") pod \"multus-admission-controller-857f4d67dd-n5lnt\" (UID: \"2001358b-ab58-4093-82cb-465bb04941c4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n5lnt" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.039454 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.041821 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.046383 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhgsm\" (UniqueName: \"kubernetes.io/projected/0ae8bb36-2555-4976-89ee-a9b6cb99ec9f-kube-api-access-rhgsm\") pod \"csi-hostpathplugin-gqscf\" (UID: \"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f\") " pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.049944 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-n5lnt" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.056650 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.065704 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.065780 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf8rs\" (UniqueName: \"kubernetes.io/projected/2bc1b3df-48a8-46f0-a01e-2596af170d85-kube-api-access-sf8rs\") pod \"dns-default-4hnlr\" (UID: \"2bc1b3df-48a8-46f0-a01e-2596af170d85\") " pod="openshift-dns/dns-default-4hnlr" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.070979 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.071680 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:50 crc kubenswrapper[4723]: E0309 13:01:50.072076 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:50.572059815 +0000 UTC m=+184.586527355 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.074750 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t6z8c"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.080369 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.085148 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.091491 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txxzp\" (UniqueName: \"kubernetes.io/projected/e6ad82e9-b2c3-4265-8738-8b944724faa7-kube-api-access-txxzp\") pod \"ingress-canary-vmcrt\" (UID: \"e6ad82e9-b2c3-4265-8738-8b944724faa7\") " pod="openshift-ingress-canary/ingress-canary-vmcrt" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.106600 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld" Mar 09 13:01:50 crc kubenswrapper[4723]: W0309 13:01:50.107504 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e50ffdf_4ab9_4237_b7c9_e9e641711d6c.slice/crio-31f1fe5a639bfff82833387d30d5194d2564e66715b464a87d86e560fa330e36 WatchSource:0}: Error finding container 31f1fe5a639bfff82833387d30d5194d2564e66715b464a87d86e560fa330e36: Status 404 returned error can't find the container with id 31f1fe5a639bfff82833387d30d5194d2564e66715b464a87d86e560fa330e36 Mar 09 13:01:50 crc kubenswrapper[4723]: W0309 13:01:50.124105 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d4d2050_65bd_4d00_893c_c7f296d90926.slice/crio-f15b02dc3780a2ad3fc71ca5aae568cf5d6d2930645b15b51138e3841ae95969 WatchSource:0}: Error finding container f15b02dc3780a2ad3fc71ca5aae568cf5d6d2930645b15b51138e3841ae95969: Status 404 returned error can't find the container with id f15b02dc3780a2ad3fc71ca5aae568cf5d6d2930645b15b51138e3841ae95969 Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.172436 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:50 crc kubenswrapper[4723]: E0309 13:01:50.172607 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:50.672571612 +0000 UTC m=+184.687039152 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.173105 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:50 crc kubenswrapper[4723]: E0309 13:01:50.173471 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:50.673457044 +0000 UTC m=+184.687924584 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.192854 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gqscf" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.205104 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4hnlr" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.207334 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vmcrt" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.278587 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:50 crc kubenswrapper[4723]: E0309 13:01:50.278782 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:50.778752383 +0000 UTC m=+184.793219923 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.278915 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:50 crc kubenswrapper[4723]: E0309 13:01:50.279257 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:50.779249455 +0000 UTC m=+184.793716995 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.317782 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xzd59"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.379508 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:50 crc kubenswrapper[4723]: E0309 13:01:50.379845 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:50.879826843 +0000 UTC m=+184.894294373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.382188 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.389079 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-r5ltl"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.423157 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8pxb"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.483979 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:50 crc kubenswrapper[4723]: E0309 13:01:50.484414 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:50.984397493 +0000 UTC m=+184.998865033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.487028 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wkpzg"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.517338 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.586499 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:50 crc kubenswrapper[4723]: E0309 13:01:50.590537 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:51.090495982 +0000 UTC m=+185.104963522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.593317 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:50 crc kubenswrapper[4723]: E0309 13:01:50.593997 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:51.093978991 +0000 UTC m=+185.108446531 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.598392 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4xwcm"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.638193 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kdxpm"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.640152 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.640239 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.663632 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.695587 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.695705 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs\") pod \"network-metrics-daemon-lztcd\" (UID: \"f09eae28-36d6-4c16-8aab-bbd93934f921\") " pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:50 crc kubenswrapper[4723]: E0309 13:01:50.695929 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:51.195911403 +0000 UTC m=+185.210378943 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:50 crc kubenswrapper[4723]: W0309 13:01:50.710839 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f6455f2_bcad_4e11_8ef5_a272b406be88.slice/crio-17e93ac9fec764ee8ebc4a239344fc76d597d83b24fde48cd613dea04d4bc1b0 WatchSource:0}: Error finding container 17e93ac9fec764ee8ebc4a239344fc76d597d83b24fde48cd613dea04d4bc1b0: Status 404 returned error can't find the container with id 17e93ac9fec764ee8ebc4a239344fc76d597d83b24fde48cd613dea04d4bc1b0 Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.716377 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n5lnt"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.717418 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 09 13:01:50 crc kubenswrapper[4723]: W0309 13:01:50.734436 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0383bca0_800c_4f21_a7ee_32e42609b47e.slice/crio-14691816ed3ddef32c19a33007687bc84a78dc331b7417fe1ac4e2c8edb1094e WatchSource:0}: Error finding container 14691816ed3ddef32c19a33007687bc84a78dc331b7417fe1ac4e2c8edb1094e: Status 404 returned error can't find the container with id 14691816ed3ddef32c19a33007687bc84a78dc331b7417fe1ac4e2c8edb1094e Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.746403 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f09eae28-36d6-4c16-8aab-bbd93934f921-metrics-certs\") pod \"network-metrics-daemon-lztcd\" (UID: \"f09eae28-36d6-4c16-8aab-bbd93934f921\") " pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:50 crc kubenswrapper[4723]: W0309 13:01:50.754045 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3321a715_9c5f_4417_bec1_4ba3ccce946c.slice/crio-b37e1770982fee57819d5bbd4b275469381a27249599eff9b0fbec2b4fb9e3e1 WatchSource:0}: Error finding container b37e1770982fee57819d5bbd4b275469381a27249599eff9b0fbec2b4fb9e3e1: Status 404 returned error can't find the container with id b37e1770982fee57819d5bbd4b275469381a27249599eff9b0fbec2b4fb9e3e1 Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.789761 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.789878 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.792139 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2"] Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.796350 4723 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:50 crc kubenswrapper[4723]: E0309 13:01:50.796772 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:51.296753878 +0000 UTC m=+185.311221418 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.864812 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" event={"ID":"3321a715-9c5f-4417-bec1-4ba3ccce946c","Type":"ContainerStarted","Data":"b37e1770982fee57819d5bbd4b275469381a27249599eff9b0fbec2b4fb9e3e1"} Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.905038 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:50 crc kubenswrapper[4723]: E0309 13:01:50.921948 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:51.421923544 +0000 UTC m=+185.436391084 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.922276 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" event={"ID":"c6d61e80-043f-4ece-a6a6-eed6357749f5","Type":"ContainerStarted","Data":"9eeba3cd17532de8486e559017adf9d4e639ac9458c1a69ee6498e95be9e3265"} Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.925988 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-b5c74" event={"ID":"e21fc837-8de2-4af5-a375-b14567f47d67","Type":"ContainerStarted","Data":"6be3f068d251d0cfb81160357888670743b9672f632254b89f4370d2172baa62"} Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.942365 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr" event={"ID":"e9a38022-ccc3-43cb-8af2-4252aef56bf8","Type":"ContainerStarted","Data":"c00704eed8a3b35e9d522ec13b196bc5c7263ceb78133f4d19b45e033b034846"} Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.955529 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c5h5p" event={"ID":"c7333389-f183-4d12-b140-3f332e49eaec","Type":"ContainerStarted","Data":"cbbc4f84e81a826a081223bdb31df7bf3f8ccf5af77941f831a61dde92d9d828"} Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.955575 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c5h5p" event={"ID":"c7333389-f183-4d12-b140-3f332e49eaec","Type":"ContainerStarted","Data":"217e9ea5c84b0a12f633f23ee5b6f6626e36868f12883b348f41db97661343a6"} Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.956553 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" event={"ID":"ea205a39-cbd1-4704-8e93-0b1747a88e8a","Type":"ContainerStarted","Data":"3eedfd4aa15d5a0882827922038af91a0ee51cc4537bcf3b86de9da7fa7b7854"} Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.963998 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.964058 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lztcd" Mar 09 13:01:50 crc kubenswrapper[4723]: I0309 13:01:50.965992 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl" event={"ID":"f30207aa-a4d2-41bb-8f36-8c6809d96191","Type":"ContainerStarted","Data":"e21bdc20a8ac8f778f7dbd3d64307ef5143d7db4813cb1d85f200de62838164d"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.005250 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld"] Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.009235 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:51 crc kubenswrapper[4723]: E0309 13:01:51.009663 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:51.509646724 +0000 UTC m=+185.524114274 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.036083 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r"] Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.053998 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9"] Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.054369 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf" event={"ID":"337d5692-12d3-4c0a-8187-eb66a2666e95","Type":"ContainerStarted","Data":"c48a552513c4a10981d8320d01e1748218788ee047221cbf7ce579f680911d89"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.106690 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c" event={"ID":"d1a69d09-d3e2-4af9-857a-3229bc05c992","Type":"ContainerStarted","Data":"6c9158c4621e912b40e830ea030bf7e5141c195499e5235f205e310a2aed4c2c"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.108457 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-n5lnt" event={"ID":"2001358b-ab58-4093-82cb-465bb04941c4","Type":"ContainerStarted","Data":"e462f639c34055d068eccc1d8f9a6531605e7736e08191a058e2f8b6907396aa"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.110184 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:51 crc kubenswrapper[4723]: E0309 13:01:51.110540 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:51.610517698 +0000 UTC m=+185.624985248 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.118008 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4xwcm" event={"ID":"0383bca0-800c-4f21-a7ee-32e42609b47e","Type":"ContainerStarted","Data":"14691816ed3ddef32c19a33007687bc84a78dc331b7417fe1ac4e2c8edb1094e"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.122330 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xzd59" event={"ID":"4f18e34a-2f8e-4450-bb3f-7b391bc03e06","Type":"ContainerStarted","Data":"6a3fde0e4a9e4039529dd8801d8a91c871143950f21e05b794fe9ee50959aaf3"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.130194 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd"] Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.205725 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-gm7hg" event={"ID":"8e50ffdf-4ab9-4237-b7c9-e9e641711d6c","Type":"ContainerStarted","Data":"31f1fe5a639bfff82833387d30d5194d2564e66715b464a87d86e560fa330e36"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.215726 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:51 crc kubenswrapper[4723]: E0309 13:01:51.216958 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:51.716938685 +0000 UTC m=+185.731406225 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.232681 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt" event={"ID":"56e5be14-f33f-4db0-a372-77b3fd4d9510","Type":"ContainerStarted","Data":"aed3f6bc9561661628dbc4d8e99c2d4c4eb143618db911733389c5f59248ff13"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.247658 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r5ltl" event={"ID":"9d680903-3aac-4f3d-8e55-5fb9ee1cb46a","Type":"ContainerStarted","Data":"af532f1fd4d98c21b951bd14eb5ba55e6c82c09f2d968218e1b9898f9c4218f3"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.249144 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4" event={"ID":"2d4d2050-65bd-4d00-893c-c7f296d90926","Type":"ContainerStarted","Data":"f15b02dc3780a2ad3fc71ca5aae568cf5d6d2930645b15b51138e3841ae95969"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.268166 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" event={"ID":"1f6455f2-bcad-4e11-8ef5-a272b406be88","Type":"ContainerStarted","Data":"17e93ac9fec764ee8ebc4a239344fc76d597d83b24fde48cd613dea04d4bc1b0"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.278908 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fzrk5" event={"ID":"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1","Type":"ContainerStarted","Data":"b154ece0f0bfe149136c257ff0454f854644892c0d089c39123cb6fde48fccb4"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.322778 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:51 crc kubenswrapper[4723]: E0309 13:01:51.324845 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:51.824812809 +0000 UTC m=+185.839280349 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.363820 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kdxpm" event={"ID":"330d0ce9-3cc0-427c-acf4-7c14f36add18","Type":"ContainerStarted","Data":"1f8bd9adb64fd4b9a6c83b503edd7c05741cb93ae9b71fed848436f918b8e696"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.389785 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" event={"ID":"beee0ec0-e83b-41df-b1c5-b6dadb908961","Type":"ContainerStarted","Data":"402bca30dbd9a6d8112c8ba3ef119d71646ca38800dec3346fc02cc5bc2a8552"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.393876 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc" event={"ID":"95ca7c55-d937-4670-b10c-a8aaf4b77c84","Type":"ContainerStarted","Data":"505f147439e9af20e90487ad4931070d315c95c54e04745b6fbfa718b055fd64"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.397935 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" event={"ID":"e0e7aa68-dd0a-4803-93c1-d3d824033cad","Type":"ContainerStarted","Data":"e46971432cb8143aa0db2e8f78498dad09ce9e06148f46c680cc995b65884edc"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.408970 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8"] Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.418373 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76" event={"ID":"59291c5b-082b-467b-b87b-cf9af3e613b1","Type":"ContainerStarted","Data":"c5ba8c33e99d846cd75af7b67e4e260cb13f9a1c25bc82ee9fda86484235865d"} Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.427917 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:51 crc kubenswrapper[4723]: E0309 13:01:51.428288 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:51.928276681 +0000 UTC m=+185.942744221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.474505 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d" event={"ID":"09470b31-c2ae-42f8-8490-c446e979042d","Type":"ContainerStarted","Data":"5647e5b5fa4705ebf3ca9a9e47ff7611e34147d709df5fcc3e1325f4d1a88385"} Mar 09 13:01:51 crc kubenswrapper[4723]: W0309 13:01:51.474695 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c3f87db_54f2_4c2e_98ca_1718ed598a7d.slice/crio-94f94afd394a021261455b81053df45e8146d267e7d9628570a2b2f9482cb04f WatchSource:0}: Error finding container 94f94afd394a021261455b81053df45e8146d267e7d9628570a2b2f9482cb04f: Status 404 returned error can't find the container with id 94f94afd394a021261455b81053df45e8146d267e7d9628570a2b2f9482cb04f Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.500737 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.503775 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.505835 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gqscf"] Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.514075 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4hnlr"] Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.518147 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.532983 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:51 crc kubenswrapper[4723]: E0309 13:01:51.545423 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:52.045398341 +0000 UTC m=+186.059865881 (durationBeforeRetry 500ms). 
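
Each failed volume operation above is parked and re-queued rather than retried immediately: the nestedpendingoperations error records a time before which the operation may not run again, and the logged durationBeforeRetry of 500ms is why the same MountVolume/UnmountVolume pair reappears roughly twice per second. A minimal sketch of that gating pattern follows; the type and names are hypothetical illustrations of the behaviour visible in the log, not the kubelet's actual implementation.

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // retryGate is an illustrative stand-in for the gating seen above: after
    // a failure, the operation may not run again until now + backoff, and an
    // attempt inside that window is rejected with a "no retries permitted
    // until ..." style error instead of being executed.
    type retryGate struct {
        notBefore time.Time
        backoff   time.Duration
    }

    func (g *retryGate) attempt(op func() error) error {
        if time.Now().Before(g.notBefore) {
            return fmt.Errorf("no retries permitted until %s", g.notBefore.UTC())
        }
        if err := op(); err != nil {
            g.notBefore = time.Now().Add(g.backoff) // the logged durationBeforeRetry
            return err
        }
        return nil
    }

    func main() {
        g := &retryGate{backoff: 500 * time.Millisecond}
        mount := func() error {
            return errors.New("driver name kubevirt.io.hostpath-provisioner not found")
        }
        for i := 0; i < 4; i++ {
            fmt.Println(g.attempt(mount))
            time.Sleep(300 * time.Millisecond)
        }
    }

Polling faster than the gate (300ms against 500ms here) makes alternating attempts fail in the gate rather than in the operation, which matches the interleaving of reconciler retries and nestedpendingoperations rejections in the entries above.
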
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.576822 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-6gtjl" podStartSLOduration=115.576798013 podStartE2EDuration="1m55.576798013s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:51.574219397 +0000 UTC m=+185.588686937" watchObservedRunningTime="2026-03-09 13:01:51.576798013 +0000 UTC m=+185.591265553" Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.650565 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:51 crc kubenswrapper[4723]: E0309 13:01:51.650962 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:52.150949866 +0000 UTC m=+186.165417406 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:51 crc kubenswrapper[4723]: W0309 13:01:51.666764 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc15fd52d_a005_4417_aaca_84839023e2b4.slice/crio-3d0d1dfcb12382288c5a2b73e36848d29c08f3f0bf6edbd95b93903ffcac1363 WatchSource:0}: Error finding container 3d0d1dfcb12382288c5a2b73e36848d29c08f3f0bf6edbd95b93903ffcac1363: Status 404 returned error can't find the container with id 3d0d1dfcb12382288c5a2b73e36848d29c08f3f0bf6edbd95b93903ffcac1363 Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.681955 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" podStartSLOduration=115.681934287 podStartE2EDuration="1m55.681934287s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:51.639484763 +0000 UTC m=+185.653952303" watchObservedRunningTime="2026-03-09 13:01:51.681934287 +0000 UTC m=+185.696401827" Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.708162 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-8w6mb" podStartSLOduration=115.708142976 podStartE2EDuration="1m55.708142976s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:51.707335136 +0000 UTC m=+185.721802676" watchObservedRunningTime="2026-03-09 13:01:51.708142976 +0000 UTC m=+185.722610516" Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.708570 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" podStartSLOduration=115.708565947 podStartE2EDuration="1m55.708565947s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:51.680626484 +0000 UTC m=+185.695094024" watchObservedRunningTime="2026-03-09 13:01:51.708565947 +0000 UTC m=+185.723033487" Mar 09 13:01:51 crc kubenswrapper[4723]: W0309 13:01:51.727118 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ae8bb36_2555_4976_89ee_a9b6cb99ec9f.slice/crio-94c0fe0409bd6313c0254f2b8ba876e2d59d84b4f4855fe5337e93e06ba7b79d WatchSource:0}: Error finding container 94c0fe0409bd6313c0254f2b8ba876e2d59d84b4f4855fe5337e93e06ba7b79d: Status 404 returned error can't find the container with id 94c0fe0409bd6313c0254f2b8ba876e2d59d84b4f4855fe5337e93e06ba7b79d Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.751929 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:51 crc kubenswrapper[4723]: E0309 13:01:51.752281 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:52.252263583 +0000 UTC m=+186.266731123 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.752991 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fbl76" podStartSLOduration=115.752982171 podStartE2EDuration="1m55.752982171s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:51.752581731 +0000 UTC m=+185.767049271" watchObservedRunningTime="2026-03-09 13:01:51.752982171 +0000 UTC m=+185.767449711" Mar 09 13:01:51 crc kubenswrapper[4723]: W0309 13:01:51.758098 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bc1b3df_48a8_46f0_a01e_2596af170d85.slice/crio-de2c976436a4793ba32c2ccbd8ae8169bd542951aaa9de25c8b8cbf232a5e77a WatchSource:0}: Error finding container de2c976436a4793ba32c2ccbd8ae8169bd542951aaa9de25c8b8cbf232a5e77a: Status 404 returned error can't find the container with id de2c976436a4793ba32c2ccbd8ae8169bd542951aaa9de25c8b8cbf232a5e77a Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.789136 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" podStartSLOduration=115.789114654 podStartE2EDuration="1m55.789114654s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:51.785767388 +0000 UTC m=+185.800234948" watchObservedRunningTime="2026-03-09 13:01:51.789114654 +0000 UTC m=+185.803582194" Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.858645 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:51 crc kubenswrapper[4723]: E0309 13:01:51.860909 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 13:01:52.360889866 +0000 UTC m=+186.375357406 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.914738 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" podStartSLOduration=115.914716031 podStartE2EDuration="1m55.914716031s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:51.819985052 +0000 UTC m=+185.834452592" watchObservedRunningTime="2026-03-09 13:01:51.914716031 +0000 UTC m=+185.929183571" Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.955980 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vmcrt"] Mar 09 13:01:51 crc kubenswrapper[4723]: I0309 13:01:51.964000 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:51 crc kubenswrapper[4723]: E0309 13:01:51.964439 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:52.46441915 +0000 UTC m=+186.478886690 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:51 crc kubenswrapper[4723]: W0309 13:01:51.996502 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6ad82e9_b2c3_4265_8738_8b944724faa7.slice/crio-6fba53dc0022a54d38688137e78534e451f514a451b6705f767d040cbe296c02 WatchSource:0}: Error finding container 6fba53dc0022a54d38688137e78534e451f514a451b6705f767d040cbe296c02: Status 404 returned error can't find the container with id 6fba53dc0022a54d38688137e78534e451f514a451b6705f767d040cbe296c02 Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.065582 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:52 crc kubenswrapper[4723]: E0309 13:01:52.067011 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:52.566994899 +0000 UTC m=+186.581462439 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.166538 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:52 crc kubenswrapper[4723]: E0309 13:01:52.166919 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:52.666899199 +0000 UTC m=+186.681366739 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.268583 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:52 crc kubenswrapper[4723]: E0309 13:01:52.269016 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:52.768999496 +0000 UTC m=+186.783467036 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.275684 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lztcd"] Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.369460 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:52 crc kubenswrapper[4723]: E0309 13:01:52.370003 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:52.869984354 +0000 UTC m=+186.884451894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.471501 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:52 crc kubenswrapper[4723]: E0309 13:01:52.471880 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:52.971847795 +0000 UTC m=+186.986315335 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.542660 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt" event={"ID":"56e5be14-f33f-4db0-a372-77b3fd4d9510","Type":"ContainerStarted","Data":"313a4d2a16d2f81080394d798e3d317e48bf56b9810723f7d344d7825ff9ccf2"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.544724 4723 generic.go:334] "Generic (PLEG): container finished" podID="4f18e34a-2f8e-4450-bb3f-7b391bc03e06" containerID="bff6ea4dd740c5e52d7907d8961cac5c07ad3725962450bc954366dcaae43520" exitCode=0 Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.544783 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xzd59" event={"ID":"4f18e34a-2f8e-4450-bb3f-7b391bc03e06","Type":"ContainerDied","Data":"bff6ea4dd740c5e52d7907d8961cac5c07ad3725962450bc954366dcaae43520"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.568904 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf" event={"ID":"337d5692-12d3-4c0a-8187-eb66a2666e95","Type":"ContainerStarted","Data":"626afabcba6c4a3017875ce4877f25ce104feb64f20f4aa7d031f6aaca0c7733"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.569701 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.572370 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd" event={"ID":"77a70858-7982-4c72-9dad-3fb8a8547361","Type":"ContainerStarted","Data":"1437fc40932dbf896dbeeb28f3536805be8ccad3c04466c2fc09a3fe2ffd092e"} Mar 09 
13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.573277 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:52 crc kubenswrapper[4723]: E0309 13:01:52.573987 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:53.073962002 +0000 UTC m=+187.088429542 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.583673 4723 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-t25rf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.583736 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf" podUID="337d5692-12d3-4c0a-8187-eb66a2666e95" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.598585 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c" event={"ID":"d1a69d09-d3e2-4af9-857a-3229bc05c992","Type":"ContainerStarted","Data":"678d667a403517132fb07638c08b767d9906ec9f43bf48db2d88e30e70499f6f"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.612954 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" event={"ID":"c6d61e80-043f-4ece-a6a6-eed6357749f5","Type":"ContainerStarted","Data":"8c8e2b68a7e5d4c3484d0f64943ccbf1d2fb0a14d4c8d1f1ae6f774121072bc7"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.614495 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.617097 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf" podStartSLOduration=116.617078463 podStartE2EDuration="1m56.617078463s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:52.612362163 +0000 UTC m=+186.626829703" watchObservedRunningTime="2026-03-09 13:01:52.617078463 +0000 UTC m=+186.631546003" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.625061 4723 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9" event={"ID":"2c3f87db-54f2-4c2e-98ca-1718ed598a7d","Type":"ContainerStarted","Data":"94f94afd394a021261455b81053df45e8146d267e7d9628570a2b2f9482cb04f"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.627582 4723 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-p8pxb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.627635 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" podUID="c6d61e80-043f-4ece-a6a6-eed6357749f5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.628827 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4" event={"ID":"2d4d2050-65bd-4d00-893c-c7f296d90926","Type":"ContainerStarted","Data":"0f8f330cdb3a684ce30777e14edd99f436c675ac1c648a3791793009078d9399"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.644909 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl" event={"ID":"f30207aa-a4d2-41bb-8f36-8c6809d96191","Type":"ContainerStarted","Data":"4a2b6257ace5ac531e7ecf04f7cef66fc2cb4bdcd9e82a4cb98332d45d1822d1"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.648758 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kdxpm" event={"ID":"330d0ce9-3cc0-427c-acf4-7c14f36add18","Type":"ContainerStarted","Data":"801553cdade7656017814e713ab964a2ff423264a64d4401de8cf744e096e40d"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.655970 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4hnlr" event={"ID":"2bc1b3df-48a8-46f0-a01e-2596af170d85","Type":"ContainerStarted","Data":"de2c976436a4793ba32c2ccbd8ae8169bd542951aaa9de25c8b8cbf232a5e77a"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.657681 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" event={"ID":"ea205a39-cbd1-4704-8e93-0b1747a88e8a","Type":"ContainerStarted","Data":"4d8cb376ca0ff69bc914ff7c0b9d81f9346a01857efa9816c23ea2725a28c503"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.661554 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.672309 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bjqw4" podStartSLOduration=116.672292563 podStartE2EDuration="1m56.672292563s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:52.671486532 +0000 UTC m=+186.685954072" watchObservedRunningTime="2026-03-09 13:01:52.672292563 
+0000 UTC m=+186.686760103" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.672831 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" podStartSLOduration=116.672825837 podStartE2EDuration="1m56.672825837s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:52.640236854 +0000 UTC m=+186.654704394" watchObservedRunningTime="2026-03-09 13:01:52.672825837 +0000 UTC m=+186.687293377" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.676180 4723 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-6qpzn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.676230 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" podUID="ea205a39-cbd1-4704-8e93-0b1747a88e8a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.677255 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:52 crc kubenswrapper[4723]: E0309 13:01:52.680423 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:53.18040768 +0000 UTC m=+187.194875300 (durationBeforeRetry 500ms). 
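
The readiness failures just logged for catalog-operator, marketplace-operator and route-controller-manager are the usual startup race rather than new faults: probing begins as soon as the container starts, and until the process binds 8443 or 8080 every GET ends in "connect: connection refused". The sketch below reduces such an httpGet probe to its essence; the URL is copied from the catalog-operator entry above, and skipping TLS verification mirrors how the kubelet treats HTTPS probes that carry no CA bundle.

    package main

    import (
        "crypto/tls"
        "fmt"
        "net/http"
        "time"
    )

    // ready reduces an httpGet readiness probe to its essence: any transport
    // error (such as "connect: connection refused" before the server has
    // bound its port) or a non-2xx status counts as a probe failure.
    func ready(url string) error {
        client := &http.Client{
            Timeout: time.Second,
            // Kubelet does not verify serving certs for httpGet probes, so a
            // self-signed 8443 endpoint still probes cleanly once it is up.
            Transport: &http.Transport{TLSClientConfig: &tls.Config{InsecureSkipVerify: true}},
        }
        resp, err := client.Get(url)
        if err != nil {
            return fmt.Errorf("probe failure: %w", err)
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 300 {
            return fmt.Errorf("probe failure: status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        // Endpoint copied from the catalog-operator probe failure above.
        fmt.Println(ready("https://10.217.0.20:8443/healthz"))
    }
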
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.695317 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd" event={"ID":"4e4bf80a-dd91-49ad-a418-2edfce2d0fac","Type":"ContainerStarted","Data":"e35616828d7ca9a623cfd830bbad268984fe10c7e5e20a3853fcc68e674d460c"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.704808 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" podStartSLOduration=116.704793923 podStartE2EDuration="1m56.704793923s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:52.696728157 +0000 UTC m=+186.711195697" watchObservedRunningTime="2026-03-09 13:01:52.704793923 +0000 UTC m=+186.719261463" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.705247 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gqscf" event={"ID":"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f","Type":"ContainerStarted","Data":"94c0fe0409bd6313c0254f2b8ba876e2d59d84b4f4855fe5337e93e06ba7b79d"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.723508 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4xwcm" event={"ID":"0383bca0-800c-4f21-a7ee-32e42609b47e","Type":"ContainerStarted","Data":"a619377723c998cce3ac81c1efc318de620381a246e9b0436ad3d89b2b535287"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.729453 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" event={"ID":"eaf6376b-8a99-41a9-bbd7-c93567fe1f24","Type":"ContainerStarted","Data":"97cd5c585054548b9e7999014781ae73ec20b2a9578d17c90ab1486d86af76b3"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.746756 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" event={"ID":"8a793f97-7e99-491f-a21f-e501491f98d0","Type":"ContainerStarted","Data":"e376daa899e30f818a2ce51e43c0f93561b7582cd6d1dc4b999f06999368e702"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.760853 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wxvl" podStartSLOduration=116.760833433 podStartE2EDuration="1m56.760833433s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:52.738797971 +0000 UTC m=+186.753265501" watchObservedRunningTime="2026-03-09 13:01:52.760833433 +0000 UTC m=+186.775300973" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.769450 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld" event={"ID":"32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f","Type":"ContainerStarted","Data":"b54cce97b4beb1a26f11deda93801715c053db6cde43d70fd455e023df5a6973"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.771873 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b" event={"ID":"dc891b08-d815-4b09-94bb-bd0dc6dc01f4","Type":"ContainerStarted","Data":"1d4d6ce719532224215e4bc5406036c2076e013488ab292868e1c5e06d153c90"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.778694 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:52 crc kubenswrapper[4723]: E0309 13:01:52.779933 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:53.279914501 +0000 UTC m=+187.294382041 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.796852 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kdxpm" podStartSLOduration=116.796830232 podStartE2EDuration="1m56.796830232s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:52.767044862 +0000 UTC m=+186.781512402" watchObservedRunningTime="2026-03-09 13:01:52.796830232 +0000 UTC m=+186.811297772" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.798823 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4xwcm" podStartSLOduration=116.798816593 podStartE2EDuration="1m56.798816593s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:52.795708684 +0000 UTC m=+186.810176244" watchObservedRunningTime="2026-03-09 13:01:52.798816593 +0000 UTC m=+186.813284133" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.820591 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld" podStartSLOduration=116.820573749 podStartE2EDuration="1m56.820573749s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 13:01:52.812279757 +0000 UTC m=+186.826747317" watchObservedRunningTime="2026-03-09 13:01:52.820573749 +0000 UTC m=+186.835041289" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.836560 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r5ltl" event={"ID":"9d680903-3aac-4f3d-8e55-5fb9ee1cb46a","Type":"ContainerStarted","Data":"cb4b815c69fef2c3106a62ce5778a1a9c91f78df36983333f9f407a97ca74014"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.855511 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c5h5p" event={"ID":"c7333389-f183-4d12-b140-3f332e49eaec","Type":"ContainerStarted","Data":"bc0117e732899342a0e760ac9b27b1b9e52c83df06e43b561fe700f8bc6da707"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.875555 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-n5lnt" event={"ID":"2001358b-ab58-4093-82cb-465bb04941c4","Type":"ContainerStarted","Data":"ba12004a6600c5a9b8a1c852f38c35f50b0046cb95580bff145748736aa12fc6"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.879391 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:52 crc kubenswrapper[4723]: E0309 13:01:52.880550 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:53.3805374 +0000 UTC m=+187.395004940 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.892008 4723 patch_prober.go:28] interesting pod/downloads-7954f5f757-b5c74 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.892148 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b5c74" podUID="e21fc837-8de2-4af5-a375-b14567f47d67" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.896634 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-b5c74" event={"ID":"e21fc837-8de2-4af5-a375-b14567f47d67","Type":"ContainerStarted","Data":"557925fb2f1608c7024d280104b7c48ee90f68f299f6912019bd8ea2d13de48f"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.896841 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-b5c74" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.896852 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vmcrt" event={"ID":"e6ad82e9-b2c3-4265-8738-8b944724faa7","Type":"ContainerStarted","Data":"6fba53dc0022a54d38688137e78534e451f514a451b6705f767d040cbe296c02"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.910460 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-c5h5p" podStartSLOduration=116.910435973 podStartE2EDuration="1m56.910435973s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:52.897166844 +0000 UTC m=+186.911634404" watchObservedRunningTime="2026-03-09 13:01:52.910435973 +0000 UTC m=+186.924903513" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.911260 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" event={"ID":"c15fd52d-a005-4417-aaca-84839023e2b4","Type":"ContainerStarted","Data":"3d0d1dfcb12382288c5a2b73e36848d29c08f3f0bf6edbd95b93903ffcac1363"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.947078 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-b5c74" podStartSLOduration=116.947058938 podStartE2EDuration="1m56.947058938s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:52.944226826 +0000 UTC m=+186.958694366" watchObservedRunningTime="2026-03-09 13:01:52.947058938 +0000 UTC m=+186.961526478" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.973846 
4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" podStartSLOduration=112.973822632 podStartE2EDuration="1m52.973822632s" podCreationTimestamp="2026-03-09 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:52.970064156 +0000 UTC m=+186.984531716" watchObservedRunningTime="2026-03-09 13:01:52.973822632 +0000 UTC m=+186.988290172" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.975735 4723 generic.go:334] "Generic (PLEG): container finished" podID="beee0ec0-e83b-41df-b1c5-b6dadb908961" containerID="667959de6bf29fc2323a7f2322197b91b734afb4cc2421f38d765303c2e2e0b6" exitCode=0 Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.975883 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" event={"ID":"beee0ec0-e83b-41df-b1c5-b6dadb908961","Type":"ContainerDied","Data":"667959de6bf29fc2323a7f2322197b91b734afb4cc2421f38d765303c2e2e0b6"} Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.977436 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" Mar 09 13:01:52 crc kubenswrapper[4723]: I0309 13:01:52.980620 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:52 crc kubenswrapper[4723]: E0309 13:01:52.981776 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:53.481756744 +0000 UTC m=+187.496224284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.015727 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc" event={"ID":"95ca7c55-d937-4670-b10c-a8aaf4b77c84","Type":"ContainerStarted","Data":"624d94cc5b3114b68b4d52845429fe64cf91092f1dce3328cb8bb192c85de747"} Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.021487 4723 generic.go:334] "Generic (PLEG): container finished" podID="e0e7aa68-dd0a-4803-93c1-d3d824033cad" containerID="b262035f4c2a30f556f6b1ae5ed049fe272125499fa4a0e4e713cceccb06cdf2" exitCode=0 Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.021554 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" event={"ID":"e0e7aa68-dd0a-4803-93c1-d3d824033cad","Type":"ContainerDied","Data":"b262035f4c2a30f556f6b1ae5ed049fe272125499fa4a0e4e713cceccb06cdf2"} Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.048219 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lztcd" event={"ID":"f09eae28-36d6-4c16-8aab-bbd93934f921","Type":"ContainerStarted","Data":"de01803d752f28923e98aa68d90b2386da3bb42d071088797aa637101a567ec1"} Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.065004 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" podStartSLOduration=117.064981969 podStartE2EDuration="1m57.064981969s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:53.01373138 +0000 UTC m=+187.028198920" watchObservedRunningTime="2026-03-09 13:01:53.064981969 +0000 UTC m=+187.079449519" Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.086727 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:53 crc kubenswrapper[4723]: E0309 13:01:53.087053 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:53.587040172 +0000 UTC m=+187.601507712 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.111604 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" event={"ID":"3321a715-9c5f-4417-bec1-4ba3ccce946c","Type":"ContainerStarted","Data":"46a6f990613d3789d7d30c91ec7ff1b64d39a3d0c008963524b8fc8f38728b1d"} Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.112567 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.137266 4723 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fhxc9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body= Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.137330 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" podUID="3321a715-9c5f-4417-bec1-4ba3ccce946c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.138948 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jh2mc" podStartSLOduration=117.138932247 podStartE2EDuration="1m57.138932247s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:53.066666532 +0000 UTC m=+187.081134072" watchObservedRunningTime="2026-03-09 13:01:53.138932247 +0000 UTC m=+187.153399777" Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.169508 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d" event={"ID":"09470b31-c2ae-42f8-8490-c446e979042d","Type":"ContainerStarted","Data":"2a4a60e023b541085733c8da1f7b59126ba7b75532dc9f56a2f5ac4079a5954a"} Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.170189 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d" Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.174785 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" podStartSLOduration=117.174772082 podStartE2EDuration="1m57.174772082s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:53.174527686 +0000 UTC m=+187.188995236" watchObservedRunningTime="2026-03-09 13:01:53.174772082 +0000 UTC m=+187.189239622" Mar 09 
13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.179723 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-gm7hg" event={"ID":"8e50ffdf-4ab9-4237-b7c9-e9e641711d6c","Type":"ContainerStarted","Data":"4ed48a3133e312cc7f177d462e851354ed7d9ec5a28c3f7b064a4a32ba01d45d"} Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.191785 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:53 crc kubenswrapper[4723]: E0309 13:01:53.192440 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:53.692423453 +0000 UTC m=+187.706890993 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.207297 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr" event={"ID":"e9a38022-ccc3-43cb-8af2-4252aef56bf8","Type":"ContainerStarted","Data":"2013f3081d379640640cf614c711ba5e8ecd80c4d34de6fb05f31fb52fe61772"} Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.218966 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d" podStartSLOduration=117.21895001 podStartE2EDuration="1m57.21895001s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:53.217501013 +0000 UTC m=+187.231968553" watchObservedRunningTime="2026-03-09 13:01:53.21895001 +0000 UTC m=+187.233417550" Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.256369 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-gm7hg" podStartSLOduration=7.256352585 podStartE2EDuration="7.256352585s" podCreationTimestamp="2026-03-09 13:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:53.255223546 +0000 UTC m=+187.269691086" watchObservedRunningTime="2026-03-09 13:01:53.256352585 +0000 UTC m=+187.270820125" Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.293570 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:53 crc kubenswrapper[4723]: E0309 13:01:53.295112 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:53.795096274 +0000 UTC m=+187.809563814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.375618 4723 ???:1] "http: TLS handshake error from 192.168.126.11:35818: no serving certificate available for the kubelet" Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.401251 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:53 crc kubenswrapper[4723]: E0309 13:01:53.401731 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:53.901712706 +0000 UTC m=+187.916180246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.469350 4723 ???:1] "http: TLS handshake error from 192.168.126.11:35828: no serving certificate available for the kubelet" Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.504022 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:53 crc kubenswrapper[4723]: E0309 13:01:53.504374 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:54.004361417 +0000 UTC m=+188.018828957 (durationBeforeRetry 500ms). 
Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.606388 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:01:53 crc kubenswrapper[4723]: E0309 13:01:53.606759 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:54.106738401 +0000 UTC m=+188.121205941 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.651170 4723 ???:1] "http: TLS handshake error from 192.168.126.11:35830: no serving certificate available for the kubelet"
Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.707883 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:53 crc kubenswrapper[4723]: E0309 13:01:53.708186 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:54.208173271 +0000 UTC m=+188.222640811 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.768886 4723 ???:1] "http: TLS handshake error from 192.168.126.11:35844: no serving certificate available for the kubelet"
Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.811885 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:01:53 crc kubenswrapper[4723]: E0309 13:01:53.812128 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:54.312111815 +0000 UTC m=+188.326579355 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.871802 4723 ???:1] "http: TLS handshake error from 192.168.126.11:35852: no serving certificate available for the kubelet"
Mar 09 13:01:53 crc kubenswrapper[4723]: I0309 13:01:53.920665 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:53 crc kubenswrapper[4723]: E0309 13:01:53.921114 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:54.421094627 +0000 UTC m=+188.435562167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.022141 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:01:54 crc kubenswrapper[4723]: E0309 13:01:54.022567 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:54.522547307 +0000 UTC m=+188.537014837 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.048813 4723 ???:1] "http: TLS handshake error from 192.168.126.11:35864: no serving certificate available for the kubelet"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.123352 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:54 crc kubenswrapper[4723]: E0309 13:01:54.123755 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:54.623735871 +0000 UTC m=+188.638203411 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.224431 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:01:54 crc kubenswrapper[4723]: E0309 13:01:54.224845 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:54.724827522 +0000 UTC m=+188.739295062 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.227938 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd" event={"ID":"4e4bf80a-dd91-49ad-a418-2edfce2d0fac","Type":"ContainerStarted","Data":"ea38b563e9ce1295be6fcd74a19e5ebfb2b8e3ace4afd76dc932099ccada096f"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.227974 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd" event={"ID":"4e4bf80a-dd91-49ad-a418-2edfce2d0fac","Type":"ContainerStarted","Data":"300f77f16e7da70bf8baddb4cc22a229a9c07bee327d1224122aa58da7442f3a"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.231547 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" event={"ID":"c15fd52d-a005-4417-aaca-84839023e2b4","Type":"ContainerStarted","Data":"071de674372c6d212f8340b6e56e0ccc976bb9a08fd325a215bdfba2286169c9"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.235090 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" event={"ID":"1f6455f2-bcad-4e11-8ef5-a272b406be88","Type":"ContainerStarted","Data":"b6cf57e91e838d2637897bd9893d85047af5a69c0747e4b77c808af59b8169e0"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.235824 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wkpzg"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.239708 4723 patch_prober.go:28] interesting pod/console-operator-58897d9998-wkpzg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body=
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.239738 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" podUID="1f6455f2-bcad-4e11-8ef5-a272b406be88" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.240129 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9" event={"ID":"2c3f87db-54f2-4c2e-98ca-1718ed598a7d","Type":"ContainerStarted","Data":"e525c52e2400bb09bdeca1546999291c1444d5f9d965df161170cd57d48618d7"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.242324 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" event={"ID":"e0e7aa68-dd0a-4803-93c1-d3d824033cad","Type":"ContainerStarted","Data":"2c72b63d5fa9d34b11b28684c64c50e85c94b6135981c5e210a1008abfeee3d3"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.259880 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" event={"ID":"beee0ec0-e83b-41df-b1c5-b6dadb908961","Type":"ContainerStarted","Data":"89a30d5b744f9a256482aa65435082ebc6efe191877294a9b15133d2d29003aa"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.262837 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r5ltl" event={"ID":"9d680903-3aac-4f3d-8e55-5fb9ee1cb46a","Type":"ContainerStarted","Data":"a2fd45330abfa2b45406b461ceb2a5bdd45141dd9e6b088bddc8dbfcff635695"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.263417 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-95tgr" podStartSLOduration=119.263403387 podStartE2EDuration="1m59.263403387s" podCreationTimestamp="2026-03-09 12:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:53.281158638 +0000 UTC m=+187.295626178" watchObservedRunningTime="2026-03-09 13:01:54.263403387 +0000 UTC m=+188.277870927"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.264010 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bd" podStartSLOduration=118.264005952 podStartE2EDuration="1m58.264005952s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:54.262091123 +0000 UTC m=+188.276558663" watchObservedRunningTime="2026-03-09 13:01:54.264005952 +0000 UTC m=+188.278473492"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.265847 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d" event={"ID":"09470b31-c2ae-42f8-8490-c446e979042d","Type":"ContainerStarted","Data":"30cbe7591fb1b4340ea9282b8d1006eb723624757c456b282004be14204830d7"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.287177 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" event={"ID":"eaf6376b-8a99-41a9-bbd7-c93567fe1f24","Type":"ContainerStarted","Data":"c9f5c8a5b5675a18ec4038a09d9d98ee7651f9675849d1890411117b195576cc"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.287223 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" event={"ID":"eaf6376b-8a99-41a9-bbd7-c93567fe1f24","Type":"ContainerStarted","Data":"33bb06214bc452cb6d8bcae4dfb763be882b7dd61bd2c47b25d7aeb0132d0db1"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.290936 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-58jld" event={"ID":"32ff745d-fa0d-44cf-8ab8-bd25d48f3c1f","Type":"ContainerStarted","Data":"6413612c8f1f9bb68a13e388c47e19083b3b775c092b0b8757dbaeee9be5c3a0"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.303217 4723 ???:1] "http: TLS handshake error from 192.168.126.11:35866: no serving certificate available for the kubelet"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.316681 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt" event={"ID":"56e5be14-f33f-4db0-a372-77b3fd4d9510","Type":"ContainerStarted","Data":"a329724defd4e98d7b989a0497f4ba3c903da0301ff3deb8752811b7bb07cf25"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.327585 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:54 crc kubenswrapper[4723]: E0309 13:01:54.328046 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:54.828030277 +0000 UTC m=+188.842497817 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.328951 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vmcrt" event={"ID":"e6ad82e9-b2c3-4265-8738-8b944724faa7","Type":"ContainerStarted","Data":"b5c8d2749c9acd44e19fe9269b78ddd1cd893e63032b30d15f6f8d43180fc38b"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.336799 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" podStartSLOduration=118.33677862 podStartE2EDuration="1m58.33677862s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:54.312056809 +0000 UTC m=+188.326524349" watchObservedRunningTime="2026-03-09 13:01:54.33677862 +0000 UTC m=+188.351246160"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.337764 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4hnlr" event={"ID":"2bc1b3df-48a8-46f0-a01e-2596af170d85","Type":"ContainerStarted","Data":"f6335e59386969067204c4532b21f9a8512a89449743c4a4b3aaba6c8e704be8"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.338430 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4hnlr"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.338588 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" podStartSLOduration=118.338578836 podStartE2EDuration="1m58.338578836s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:54.33757383 +0000 UTC m=+188.352041370" watchObservedRunningTime="2026-03-09 13:01:54.338578836 +0000 UTC m=+188.353046376"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.355095 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd" event={"ID":"77a70858-7982-4c72-9dad-3fb8a8547361","Type":"ContainerStarted","Data":"da755b245ebb2f7647bfcdb148ad1788ee1afe2d214302c1042df7956e27e270"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.371796 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c" event={"ID":"d1a69d09-d3e2-4af9-857a-3229bc05c992","Type":"ContainerStarted","Data":"0fc5b207fe51fc9e9c34a9a0b7a6d0d4672af446fcc74eff39dc8f8dbe9cdfd9"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.379069 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" event={"ID":"8a793f97-7e99-491f-a21f-e501491f98d0","Type":"ContainerStarted","Data":"ce25aae0299765f7e540e140789527afcd1afbfe375b559ef22bbf3bdcc8ed72"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.379120 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" event={"ID":"8a793f97-7e99-491f-a21f-e501491f98d0","Type":"ContainerStarted","Data":"63353a7f4dc708a359fb0bd4a3cb4eff10de797e5cdcbf73b26c619d8f487ebf"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.395767 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lztcd" event={"ID":"f09eae28-36d6-4c16-8aab-bbd93934f921","Type":"ContainerStarted","Data":"7a4cd0822990668fa2acc6b1d194bab1434045c70012854c0c44662f5c9af0b0"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.403728 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fzrk5" event={"ID":"f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1","Type":"ContainerStarted","Data":"958a4132ba0d2d127621309887d22d3ccc18e851f9c5de85cbad9986c138ad30"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.407372 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b" event={"ID":"dc891b08-d815-4b09-94bb-bd0dc6dc01f4","Type":"ContainerStarted","Data":"bd07715680e3aafd806209b78e91cfabde6f30584759d8f2d67f9e91e580a158"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.428409 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xzd59" event={"ID":"4f18e34a-2f8e-4450-bb3f-7b391bc03e06","Type":"ContainerStarted","Data":"237be1b413a29e8049da2906a056f3509145308c909ad8df15513629d0f9bd43"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.428760 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.429192 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.429698 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:01:54 crc kubenswrapper[4723]: E0309 13:01:54.431955 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:54.931928709 +0000 UTC m=+188.946396289 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.435244 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:54 crc kubenswrapper[4723]: E0309 13:01:54.435754 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:54.935738937 +0000 UTC m=+188.950206477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.452171 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gqscf" event={"ID":"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f","Type":"ContainerStarted","Data":"c49e7366c495c99737f3fa69605b1c1e0ed752fe3216765adb97bff1c2b0ca34"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.454603 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74fx9" podStartSLOduration=118.454579708 podStartE2EDuration="1m58.454579708s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:54.37828002 +0000 UTC m=+188.392747560" watchObservedRunningTime="2026-03-09 13:01:54.454579708 +0000 UTC m=+188.469047248"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.454811 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pcp4r" podStartSLOduration=118.454806984 podStartE2EDuration="1m58.454806984s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:54.444835649 +0000 UTC m=+188.459303189" watchObservedRunningTime="2026-03-09 13:01:54.454806984 +0000 UTC m=+188.469274514"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.478266 4723 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-7zkls container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.24:8443/livez\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body=
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.478331 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" podUID="e0e7aa68-dd0a-4803-93c1-d3d824033cad" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.24:8443/livez\": dial tcp 10.217.0.24:8443: connect: connection refused"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.497706 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4x6zm"]
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.498766 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4x6zm"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.503213 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.503595 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4hnlr" podStartSLOduration=7.503560298 podStartE2EDuration="7.503560298s" podCreationTimestamp="2026-03-09 13:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:54.49383387 +0000 UTC m=+188.508301410" watchObservedRunningTime="2026-03-09 13:01:54.503560298 +0000 UTC m=+188.518027848"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.511024 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4x6zm"]
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.514526 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-n5lnt" event={"ID":"2001358b-ab58-4093-82cb-465bb04941c4","Type":"ContainerStarted","Data":"fd60dfd8b506ae05ae49259a1e406008145c1a7d5635f22cb4a3a99fd213f4f5"}
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.519231 4723 patch_prober.go:28] interesting pod/downloads-7954f5f757-b5c74 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.527556 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b5c74" podUID="e21fc837-8de2-4af5-a375-b14567f47d67" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.521490 4723 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-p8pxb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body=
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.527649 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" podUID="c6d61e80-043f-4ece-a6a6-eed6357749f5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.531441 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.541907 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.542006 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.542130 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv9fj\" (UniqueName: \"kubernetes.io/projected/a7d103aa-232e-4705-a061-8ad7025339cf-kube-api-access-vv9fj\") pod \"community-operators-4x6zm\" (UID: \"a7d103aa-232e-4705-a061-8ad7025339cf\") " pod="openshift-marketplace/community-operators-4x6zm"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.542572 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d103aa-232e-4705-a061-8ad7025339cf-utilities\") pod \"community-operators-4x6zm\" (UID: \"a7d103aa-232e-4705-a061-8ad7025339cf\") " pod="openshift-marketplace/community-operators-4x6zm"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.542610 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d103aa-232e-4705-a061-8ad7025339cf-catalog-content\") pod \"community-operators-4x6zm\" (UID: \"a7d103aa-232e-4705-a061-8ad7025339cf\") " pod="openshift-marketplace/community-operators-4x6zm"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.543329 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r5ltl" podStartSLOduration=118.543319614 podStartE2EDuration="1m58.543319614s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:54.541322452 +0000 UTC m=+188.555789992" watchObservedRunningTime="2026-03-09 13:01:54.543319614 +0000 UTC m=+188.557787154"
Mar 09 13:01:54 crc kubenswrapper[4723]: E0309 13:01:54.547030 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:55.047005798 +0000 UTC m=+189.061473338 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.574548 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vmcrt" podStartSLOduration=7.57453201 podStartE2EDuration="7.57453201s" podCreationTimestamp="2026-03-09 13:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:54.571146704 +0000 UTC m=+188.585614244" watchObservedRunningTime="2026-03-09 13:01:54.57453201 +0000 UTC m=+188.588999550"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.648543 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d103aa-232e-4705-a061-8ad7025339cf-utilities\") pod \"community-operators-4x6zm\" (UID: \"a7d103aa-232e-4705-a061-8ad7025339cf\") " pod="openshift-marketplace/community-operators-4x6zm"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.648599 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d103aa-232e-4705-a061-8ad7025339cf-catalog-content\") pod \"community-operators-4x6zm\" (UID: \"a7d103aa-232e-4705-a061-8ad7025339cf\") " pod="openshift-marketplace/community-operators-4x6zm"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.648649 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.648681 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv9fj\" (UniqueName: \"kubernetes.io/projected/a7d103aa-232e-4705-a061-8ad7025339cf-kube-api-access-vv9fj\") pod \"community-operators-4x6zm\" (UID: \"a7d103aa-232e-4705-a061-8ad7025339cf\") " pod="openshift-marketplace/community-operators-4x6zm"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.649402 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d103aa-232e-4705-a061-8ad7025339cf-utilities\") pod \"community-operators-4x6zm\" (UID: \"a7d103aa-232e-4705-a061-8ad7025339cf\") " pod="openshift-marketplace/community-operators-4x6zm"
Mar 09 13:01:54 crc kubenswrapper[4723]: E0309 13:01:54.649694 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:55.149680179 +0000 UTC m=+189.164147719 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.649835 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d103aa-232e-4705-a061-8ad7025339cf-catalog-content\") pod \"community-operators-4x6zm\" (UID: \"a7d103aa-232e-4705-a061-8ad7025339cf\") " pod="openshift-marketplace/community-operators-4x6zm"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.651120 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-j9lvd" podStartSLOduration=118.651106966 podStartE2EDuration="1m58.651106966s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:54.650220593 +0000 UTC m=+188.664688133" watchObservedRunningTime="2026-03-09 13:01:54.651106966 +0000 UTC m=+188.665574516"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.685788 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dqh66"]
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.687063 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqh66"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.689194 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.714569 4723 ???:1] "http: TLS handshake error from 192.168.126.11:35878: no serving certificate available for the kubelet"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.717680 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv9fj\" (UniqueName: \"kubernetes.io/projected/a7d103aa-232e-4705-a061-8ad7025339cf-kube-api-access-vv9fj\") pod \"community-operators-4x6zm\" (UID: \"a7d103aa-232e-4705-a061-8ad7025339cf\") " pod="openshift-marketplace/community-operators-4x6zm"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.720451 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-t6z8c" podStartSLOduration=118.720410294 podStartE2EDuration="1m58.720410294s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:54.715260312 +0000 UTC m=+188.729727852" watchObservedRunningTime="2026-03-09 13:01:54.720410294 +0000 UTC m=+188.734877834"
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.721608 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqh66"]
Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.750241 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName:
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.750445 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-utilities\") pod \"certified-operators-dqh66\" (UID: \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\") " pod="openshift-marketplace/certified-operators-dqh66" Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.750515 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgmzt\" (UniqueName: \"kubernetes.io/projected/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-kube-api-access-rgmzt\") pod \"certified-operators-dqh66\" (UID: \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\") " pod="openshift-marketplace/certified-operators-dqh66" Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.750537 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-catalog-content\") pod \"certified-operators-dqh66\" (UID: \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\") " pod="openshift-marketplace/certified-operators-dqh66" Mar 09 13:01:54 crc kubenswrapper[4723]: E0309 13:01:54.750642 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:55.250626485 +0000 UTC m=+189.265094025 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.753972 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9gpjt" podStartSLOduration=118.75393984 podStartE2EDuration="1m58.75393984s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:54.743094573 +0000 UTC m=+188.757562113" watchObservedRunningTime="2026-03-09 13:01:54.75393984 +0000 UTC m=+188.768407400" Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.776189 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j6zr2" podStartSLOduration=118.776171208 podStartE2EDuration="1m58.776171208s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:54.774019733 +0000 UTC m=+188.788487283" watchObservedRunningTime="2026-03-09 13:01:54.776171208 +0000 UTC m=+188.790638748" Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.855354 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-utilities\") pod \"certified-operators-dqh66\" (UID: \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\") " pod="openshift-marketplace/certified-operators-dqh66" Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.855440 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.855457 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgmzt\" (UniqueName: \"kubernetes.io/projected/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-kube-api-access-rgmzt\") pod \"certified-operators-dqh66\" (UID: \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\") " pod="openshift-marketplace/certified-operators-dqh66" Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.855478 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-catalog-content\") pod \"certified-operators-dqh66\" (UID: \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\") " pod="openshift-marketplace/certified-operators-dqh66" Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.856241 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-catalog-content\") pod \"certified-operators-dqh66\" (UID: \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\") " pod="openshift-marketplace/certified-operators-dqh66" Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.856453 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-utilities\") pod \"certified-operators-dqh66\" (UID: \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\") " pod="openshift-marketplace/certified-operators-dqh66" Mar 09 13:01:54 crc kubenswrapper[4723]: E0309 13:01:54.856719 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:55.356707914 +0000 UTC m=+189.371175454 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.872165 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4x6zm" Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.904671 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgmzt\" (UniqueName: \"kubernetes.io/projected/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-kube-api-access-rgmzt\") pod \"certified-operators-dqh66\" (UID: \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\") " pod="openshift-marketplace/certified-operators-dqh66" Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.939521 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lztcd" podStartSLOduration=118.939504798 podStartE2EDuration="1m58.939504798s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:54.890465736 +0000 UTC m=+188.904933276" watchObservedRunningTime="2026-03-09 13:01:54.939504798 +0000 UTC m=+188.953972338" Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.941839 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jq2cv"] Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.944418 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.957310 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:54 crc kubenswrapper[4723]: I0309 13:01:54.970807 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jq2cv"] Mar 09 13:01:54 crc kubenswrapper[4723]: E0309 13:01:54.974234 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:55.474193134 +0000 UTC m=+189.488660674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.023669 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqh66" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.048286 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.075796 4723 patch_prober.go:28] interesting pod/router-default-5444994796-fzrk5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:01:55 crc kubenswrapper[4723]: [-]has-synced failed: reason withheld Mar 09 13:01:55 crc kubenswrapper[4723]: [+]process-running ok Mar 09 13:01:55 crc kubenswrapper[4723]: healthz check failed Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.075854 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fzrk5" podUID="f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.076146 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-fzrk5" podStartSLOduration=119.076131176 podStartE2EDuration="1m59.076131176s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:55.07352097 +0000 UTC m=+189.087988510" watchObservedRunningTime="2026-03-09 13:01:55.076131176 +0000 UTC m=+189.090598716" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.076725 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fd2j\" (UniqueName: 
\"kubernetes.io/projected/84890bd9-0d95-48f4-89d3-6619e5e5525a-kube-api-access-6fd2j\") pod \"community-operators-jq2cv\" (UID: \"84890bd9-0d95-48f4-89d3-6619e5e5525a\") " pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.076752 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.076777 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84890bd9-0d95-48f4-89d3-6619e5e5525a-catalog-content\") pod \"community-operators-jq2cv\" (UID: \"84890bd9-0d95-48f4-89d3-6619e5e5525a\") " pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.076798 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84890bd9-0d95-48f4-89d3-6619e5e5525a-utilities\") pod \"community-operators-jq2cv\" (UID: \"84890bd9-0d95-48f4-89d3-6619e5e5525a\") " pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:01:55 crc kubenswrapper[4723]: E0309 13:01:55.077153 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:55.577143862 +0000 UTC m=+189.591611402 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.094239 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9qb2k"] Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.095177 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.116332 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qb2k"] Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.185413 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.186061 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fd2j\" (UniqueName: \"kubernetes.io/projected/84890bd9-0d95-48f4-89d3-6619e5e5525a-kube-api-access-6fd2j\") pod \"community-operators-jq2cv\" (UID: \"84890bd9-0d95-48f4-89d3-6619e5e5525a\") " pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.186099 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9195db-65d7-4777-8869-948a26e41933-catalog-content\") pod \"certified-operators-9qb2k\" (UID: \"fa9195db-65d7-4777-8869-948a26e41933\") " pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.186140 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9nm8\" (UniqueName: \"kubernetes.io/projected/fa9195db-65d7-4777-8869-948a26e41933-kube-api-access-t9nm8\") pod \"certified-operators-9qb2k\" (UID: \"fa9195db-65d7-4777-8869-948a26e41933\") " pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.186173 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84890bd9-0d95-48f4-89d3-6619e5e5525a-catalog-content\") pod \"community-operators-jq2cv\" (UID: \"84890bd9-0d95-48f4-89d3-6619e5e5525a\") " pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.186199 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84890bd9-0d95-48f4-89d3-6619e5e5525a-utilities\") pod \"community-operators-jq2cv\" (UID: \"84890bd9-0d95-48f4-89d3-6619e5e5525a\") " pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.186237 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9195db-65d7-4777-8869-948a26e41933-utilities\") pod \"certified-operators-9qb2k\" (UID: \"fa9195db-65d7-4777-8869-948a26e41933\") " pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:01:55 crc kubenswrapper[4723]: E0309 13:01:55.186374 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:55.68635138 +0000 UTC m=+189.700818930 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.187025 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84890bd9-0d95-48f4-89d3-6619e5e5525a-catalog-content\") pod \"community-operators-jq2cv\" (UID: \"84890bd9-0d95-48f4-89d3-6619e5e5525a\") " pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.187211 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84890bd9-0d95-48f4-89d3-6619e5e5525a-utilities\") pod \"community-operators-jq2cv\" (UID: \"84890bd9-0d95-48f4-89d3-6619e5e5525a\") " pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.219340 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-n5lnt" podStartSLOduration=119.219323522 podStartE2EDuration="1m59.219323522s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:55.218747947 +0000 UTC m=+189.233215487" watchObservedRunningTime="2026-03-09 13:01:55.219323522 +0000 UTC m=+189.233791062" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.220030 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xzd59" podStartSLOduration=119.22002529 podStartE2EDuration="1m59.22002529s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:55.194336254 +0000 UTC m=+189.208803794" watchObservedRunningTime="2026-03-09 13:01:55.22002529 +0000 UTC m=+189.234492830" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.227036 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fd2j\" (UniqueName: \"kubernetes.io/projected/84890bd9-0d95-48f4-89d3-6619e5e5525a-kube-api-access-6fd2j\") pod \"community-operators-jq2cv\" (UID: \"84890bd9-0d95-48f4-89d3-6619e5e5525a\") " pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.276182 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.287457 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9195db-65d7-4777-8869-948a26e41933-catalog-content\") pod \"certified-operators-9qb2k\" (UID: \"fa9195db-65d7-4777-8869-948a26e41933\") " pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.287496 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.287516 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9nm8\" (UniqueName: \"kubernetes.io/projected/fa9195db-65d7-4777-8869-948a26e41933-kube-api-access-t9nm8\") pod \"certified-operators-9qb2k\" (UID: \"fa9195db-65d7-4777-8869-948a26e41933\") " pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.287555 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9195db-65d7-4777-8869-948a26e41933-utilities\") pod \"certified-operators-9qb2k\" (UID: \"fa9195db-65d7-4777-8869-948a26e41933\") " pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.287996 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9195db-65d7-4777-8869-948a26e41933-utilities\") pod \"certified-operators-9qb2k\" (UID: \"fa9195db-65d7-4777-8869-948a26e41933\") " pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.288208 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9195db-65d7-4777-8869-948a26e41933-catalog-content\") pod \"certified-operators-9qb2k\" (UID: \"fa9195db-65d7-4777-8869-948a26e41933\") " pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:01:55 crc kubenswrapper[4723]: E0309 13:01:55.288438 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:55.788427056 +0000 UTC m=+189.802894596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.291658 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vpt9b" podStartSLOduration=119.291636388 podStartE2EDuration="1m59.291636388s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:55.251197636 +0000 UTC m=+189.265665176" watchObservedRunningTime="2026-03-09 13:01:55.291636388 +0000 UTC m=+189.306103928" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.317534 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9nm8\" (UniqueName: \"kubernetes.io/projected/fa9195db-65d7-4777-8869-948a26e41933-kube-api-access-t9nm8\") pod \"certified-operators-9qb2k\" (UID: \"fa9195db-65d7-4777-8869-948a26e41933\") " pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.391397 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.391598 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:55 crc kubenswrapper[4723]: E0309 13:01:55.391681 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:55.891658102 +0000 UTC m=+189.906125642 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.392017 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:55 crc kubenswrapper[4723]: E0309 13:01:55.392553 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:55.892539505 +0000 UTC m=+189.907007045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.406235 4723 ???:1] "http: TLS handshake error from 192.168.126.11:35892: no serving certificate available for the kubelet" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.497826 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:55 crc kubenswrapper[4723]: E0309 13:01:55.498188 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:55.998170382 +0000 UTC m=+190.012637922 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.526398 4723 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fhxc9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.526458 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" podUID="3321a715-9c5f-4417-bec1-4ba3ccce946c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.526539 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.564027 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4x6zm"] Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.581723 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lztcd" event={"ID":"f09eae28-36d6-4c16-8aab-bbd93934f921","Type":"ContainerStarted","Data":"34a5e55ed0288ddb70164466717bb0f3b0e49128805e923b20e8d6392c1bb841"} Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.599640 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:55 crc kubenswrapper[4723]: E0309 13:01:55.599941 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:56.09992877 +0000 UTC m=+190.114396310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.615604 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xzd59" event={"ID":"4f18e34a-2f8e-4450-bb3f-7b391bc03e06","Type":"ContainerStarted","Data":"504f0efb85ad870bc9bd79850fd12e63c3af6a29cba1aa94ca65858571e3b2c1"} Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.642088 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqh66"] Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.648136 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gqscf" event={"ID":"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f","Type":"ContainerStarted","Data":"7788390d56523c5a40898907841ea3d99ede1566ae54608a325d20875df1de5c"} Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.686908 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4hnlr" event={"ID":"2bc1b3df-48a8-46f0-a01e-2596af170d85","Type":"ContainerStarted","Data":"46e43d4d6ec648c35533749bbee7c92e338078d2bd22a3e55b3e788a0aa40f32"} Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.701619 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:55 crc kubenswrapper[4723]: E0309 13:01:55.701777 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:56.201755739 +0000 UTC m=+190.216223279 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.702073 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:55 crc kubenswrapper[4723]: E0309 13:01:55.703932 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 13:01:56.203913235 +0000 UTC m=+190.218380775 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.704272 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.757729 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.773227 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.810512 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:55 crc kubenswrapper[4723]: E0309 13:01:55.817527 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:56.317502545 +0000 UTC m=+190.331970085 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.912776 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:55 crc kubenswrapper[4723]: E0309 13:01:55.913350 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:56.413336302 +0000 UTC m=+190.427803842 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:55 crc kubenswrapper[4723]: I0309 13:01:55.985518 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jq2cv"] Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.013593 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.013807 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:56.513789476 +0000 UTC m=+190.528257016 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.014221 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.014647 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:56.514625228 +0000 UTC m=+190.529092768 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.046543 4723 patch_prober.go:28] interesting pod/router-default-5444994796-fzrk5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:01:56 crc kubenswrapper[4723]: [-]has-synced failed: reason withheld Mar 09 13:01:56 crc kubenswrapper[4723]: [+]process-running ok Mar 09 13:01:56 crc kubenswrapper[4723]: healthz check failed Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.046622 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fzrk5" podUID="f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.123445 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.123559 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:56.623528908 +0000 UTC m=+190.637996448 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.124598 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.125484 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:56.625465208 +0000 UTC m=+190.639932748 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.147112 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qb2k"] Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.180313 4723 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.226553 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.226744 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:56.726712203 +0000 UTC m=+190.741179743 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.226830 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.227277 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:56.727261747 +0000 UTC m=+190.741729287 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.328329 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.328661 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:56.828627155 +0000 UTC m=+190.843094685 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.328993 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.329354 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:56.829344463 +0000 UTC m=+190.843812003 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.430061 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.430423 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:56.930403603 +0000 UTC m=+190.944871133 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.463750 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pvf4v"]
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.464785 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvf4v"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.466932 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.482518 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvf4v"]
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.531319 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-utilities\") pod \"redhat-marketplace-pvf4v\" (UID: \"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\") " pod="openshift-marketplace/redhat-marketplace-pvf4v"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.531375 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-catalog-content\") pod \"redhat-marketplace-pvf4v\" (UID: \"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\") " pod="openshift-marketplace/redhat-marketplace-pvf4v"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.531405 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxs2b\" (UniqueName: \"kubernetes.io/projected/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-kube-api-access-cxs2b\") pod \"redhat-marketplace-pvf4v\" (UID: \"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\") " pod="openshift-marketplace/redhat-marketplace-pvf4v"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.531482 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.531816 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:57.031798822 +0000 UTC m=+191.046266362 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.618986 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vb8n2"]
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.619621 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" podUID="9bc33ee8-964b-4b03-b564-5c66068629b9" containerName="controller-manager" containerID="cri-o://a48c65a1c3f512880d9dc44b2eb45307e3d945306c8a613c05ca483aa288e4a8" gracePeriod=30
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.632577 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.632783 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:57.132748079 +0000 UTC m=+191.147215619 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.632848 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-utilities\") pod \"redhat-marketplace-pvf4v\" (UID: \"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\") " pod="openshift-marketplace/redhat-marketplace-pvf4v"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.632913 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-catalog-content\") pod \"redhat-marketplace-pvf4v\" (UID: \"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\") " pod="openshift-marketplace/redhat-marketplace-pvf4v"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.632949 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxs2b\" (UniqueName: \"kubernetes.io/projected/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-kube-api-access-cxs2b\") pod \"redhat-marketplace-pvf4v\" (UID: \"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\") " pod="openshift-marketplace/redhat-marketplace-pvf4v"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.633007 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.633335 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:57.133321844 +0000 UTC m=+191.147789384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.633920 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-catalog-content\") pod \"redhat-marketplace-pvf4v\" (UID: \"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\") " pod="openshift-marketplace/redhat-marketplace-pvf4v"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.634225 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-utilities\") pod \"redhat-marketplace-pvf4v\" (UID: \"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\") " pod="openshift-marketplace/redhat-marketplace-pvf4v"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.649263 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn"]
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.655081 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxs2b\" (UniqueName: \"kubernetes.io/projected/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-kube-api-access-cxs2b\") pod \"redhat-marketplace-pvf4v\" (UID: \"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\") " pod="openshift-marketplace/redhat-marketplace-pvf4v"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.693537 4723 generic.go:334] "Generic (PLEG): container finished" podID="a7d103aa-232e-4705-a061-8ad7025339cf" containerID="1e9fbc13b683916412c35d60c517f1a2ec6392cc8226e266a1ca4fcb24256272" exitCode=0
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.693623 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4x6zm" event={"ID":"a7d103aa-232e-4705-a061-8ad7025339cf","Type":"ContainerDied","Data":"1e9fbc13b683916412c35d60c517f1a2ec6392cc8226e266a1ca4fcb24256272"}
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.693659 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4x6zm" event={"ID":"a7d103aa-232e-4705-a061-8ad7025339cf","Type":"ContainerStarted","Data":"267d32d38f4ac36ab26ff8a9fd90384b80a1153d16dc73fb0012983d27013595"}
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.695815 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.696694 4723 generic.go:334] "Generic (PLEG): container finished" podID="fa9195db-65d7-4777-8869-948a26e41933" containerID="fbe4dc38e4f9f1245c6c0a12d941471c716b36a377fea5bf529d5d3518b2c420" exitCode=0
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.696744 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qb2k" event={"ID":"fa9195db-65d7-4777-8869-948a26e41933","Type":"ContainerDied","Data":"fbe4dc38e4f9f1245c6c0a12d941471c716b36a377fea5bf529d5d3518b2c420"}
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.696765 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qb2k" event={"ID":"fa9195db-65d7-4777-8869-948a26e41933","Type":"ContainerStarted","Data":"84ec7db969b2fb40634cda3071d8f872cbac94caaaf2a068eb91ebb10bb9fbd6"}
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.700538 4723 generic.go:334] "Generic (PLEG): container finished" podID="c15fd52d-a005-4417-aaca-84839023e2b4" containerID="071de674372c6d212f8340b6e56e0ccc976bb9a08fd325a215bdfba2286169c9" exitCode=0
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.700629 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" event={"ID":"c15fd52d-a005-4417-aaca-84839023e2b4","Type":"ContainerDied","Data":"071de674372c6d212f8340b6e56e0ccc976bb9a08fd325a215bdfba2286169c9"}
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.705078 4723 generic.go:334] "Generic (PLEG): container finished" podID="b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" containerID="263b436902a04df4cb817bfe6083969a396c37ce40294eacd3745a9fc76d9942" exitCode=0
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.705448 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqh66" event={"ID":"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709","Type":"ContainerDied","Data":"263b436902a04df4cb817bfe6083969a396c37ce40294eacd3745a9fc76d9942"}
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.705487 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqh66" event={"ID":"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709","Type":"ContainerStarted","Data":"24094a17a6fbfe474fc5d590964344bc77b407f77e662c38c3fe583ec34d7876"}
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.707822 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gqscf" event={"ID":"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f","Type":"ContainerStarted","Data":"2ef06f5d68e755d9681477b5f655ec23a3f489535755156be60248270d4c8d73"}
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.707878 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gqscf" event={"ID":"0ae8bb36-2555-4976-89ee-a9b6cb99ec9f","Type":"ContainerStarted","Data":"688d35586d926880d41dac24a1a2a012ffbf264f6a2a56e370bf0c0dab3cb773"}
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.710249 4723 generic.go:334] "Generic (PLEG): container finished" podID="84890bd9-0d95-48f4-89d3-6619e5e5525a" containerID="7a776bd9c4199b485783dad74583e7b1b1934d99c1d2155c4cbbc2866b078b4e" exitCode=0
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.711201 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jq2cv" event={"ID":"84890bd9-0d95-48f4-89d3-6619e5e5525a","Type":"ContainerDied","Data":"7a776bd9c4199b485783dad74583e7b1b1934d99c1d2155c4cbbc2866b078b4e"}
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.711225 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jq2cv" event={"ID":"84890bd9-0d95-48f4-89d3-6619e5e5525a","Type":"ContainerStarted","Data":"18882635a5ebc0eb89e65369741b36ec94c51f70df60d1767ecb5cfd13eda526"}
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.712789 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" podUID="ea205a39-cbd1-4704-8e93-0b1747a88e8a" containerName="route-controller-manager" containerID="cri-o://4d8cb376ca0ff69bc914ff7c0b9d81f9346a01857efa9816c23ea2725a28c503" gracePeriod=30
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.734133 4723 ???:1] "http: TLS handshake error from 192.168.126.11:48828: no serving certificate available for the kubelet"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.734570 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.735022 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:57.2349974 +0000 UTC m=+191.249464950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.754000 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-gqscf" podStartSLOduration=10.753981135 podStartE2EDuration="10.753981135s" podCreationTimestamp="2026-03-09 13:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:56.749568092 +0000 UTC m=+190.764035642" watchObservedRunningTime="2026-03-09 13:01:56.753981135 +0000 UTC m=+190.768448675"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.783404 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvf4v"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.836345 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.845812 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:57.345790609 +0000 UTC m=+191.360258219 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.907481 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.913820 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.917131 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lsmcg"]
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.919418 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lsmcg"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.935101 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.935388 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.940429 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.940600 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:57.440575399 +0000 UTC m=+191.455042939 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.940788 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:56 crc kubenswrapper[4723]: E0309 13:01:56.941144 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:01:57.441133843 +0000 UTC m=+191.455601393 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s6gh6" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.955585 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lsmcg"]
Mar 09 13:01:56 crc kubenswrapper[4723]: I0309 13:01:56.960120 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.043582 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.043791 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-utilities\") pod \"redhat-marketplace-lsmcg\" (UID: \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\") " pod="openshift-marketplace/redhat-marketplace-lsmcg"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.043847 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqp2g\" (UniqueName: \"kubernetes.io/projected/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-kube-api-access-rqp2g\") pod \"redhat-marketplace-lsmcg\" (UID: \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\") " pod="openshift-marketplace/redhat-marketplace-lsmcg"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.043904 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90fa396b-9473-4da4-af0c-c08fd8f78e3a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"90fa396b-9473-4da4-af0c-c08fd8f78e3a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.043936 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-catalog-content\") pod \"redhat-marketplace-lsmcg\" (UID: \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\") " pod="openshift-marketplace/redhat-marketplace-lsmcg"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.043960 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90fa396b-9473-4da4-af0c-c08fd8f78e3a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"90fa396b-9473-4da4-af0c-c08fd8f78e3a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:01:57 crc kubenswrapper[4723]: E0309 13:01:57.044058 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:01:57.54404118 +0000 UTC m=+191.558508720 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.047466 4723 patch_prober.go:28] interesting pod/router-default-5444994796-fzrk5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 13:01:57 crc kubenswrapper[4723]: [-]has-synced failed: reason withheld
Mar 09 13:01:57 crc kubenswrapper[4723]: [+]process-running ok
Mar 09 13:01:57 crc kubenswrapper[4723]: healthz check failed
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.047518 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fzrk5" podUID="f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.079196 4723 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-09T13:01:56.180579325Z","Handler":null,"Name":""}
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.083758 4723 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.083806 4723 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.146491 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90fa396b-9473-4da4-af0c-c08fd8f78e3a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"90fa396b-9473-4da4-af0c-c08fd8f78e3a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.146556 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.146582 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-utilities\") pod \"redhat-marketplace-lsmcg\" (UID: \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\") " pod="openshift-marketplace/redhat-marketplace-lsmcg"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.146628 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqp2g\" (UniqueName: \"kubernetes.io/projected/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-kube-api-access-rqp2g\") pod \"redhat-marketplace-lsmcg\" (UID: \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\") " pod="openshift-marketplace/redhat-marketplace-lsmcg"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.146666 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90fa396b-9473-4da4-af0c-c08fd8f78e3a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"90fa396b-9473-4da4-af0c-c08fd8f78e3a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.146701 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-catalog-content\") pod \"redhat-marketplace-lsmcg\" (UID: \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\") " pod="openshift-marketplace/redhat-marketplace-lsmcg"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.147455 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-catalog-content\") pod \"redhat-marketplace-lsmcg\" (UID: \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\") " pod="openshift-marketplace/redhat-marketplace-lsmcg"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.148263 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-utilities\") pod \"redhat-marketplace-lsmcg\" (UID: \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\") " pod="openshift-marketplace/redhat-marketplace-lsmcg"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.148438 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90fa396b-9473-4da4-af0c-c08fd8f78e3a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"90fa396b-9473-4da4-af0c-c08fd8f78e3a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.156898 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.156935 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.172016 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90fa396b-9473-4da4-af0c-c08fd8f78e3a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"90fa396b-9473-4da4-af0c-c08fd8f78e3a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.174258 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqp2g\" (UniqueName: \"kubernetes.io/projected/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-kube-api-access-rqp2g\") pod \"redhat-marketplace-lsmcg\" (UID: \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\") " pod="openshift-marketplace/redhat-marketplace-lsmcg"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.191382 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s6gh6\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.204956 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.244345 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvf4v"]
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.248427 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.265805 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.270663 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.275230 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.295625 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.323411 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lsmcg"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.349517 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-proxy-ca-bundles\") pod \"9bc33ee8-964b-4b03-b564-5c66068629b9\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") "
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.349571 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-client-ca\") pod \"9bc33ee8-964b-4b03-b564-5c66068629b9\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") "
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.349612 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea205a39-cbd1-4704-8e93-0b1747a88e8a-serving-cert\") pod \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") "
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.349636 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nffd8\" (UniqueName: \"kubernetes.io/projected/9bc33ee8-964b-4b03-b564-5c66068629b9-kube-api-access-nffd8\") pod \"9bc33ee8-964b-4b03-b564-5c66068629b9\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") "
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.349695 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea205a39-cbd1-4704-8e93-0b1747a88e8a-client-ca\") pod \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") "
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.349743 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbmd8\" (UniqueName: \"kubernetes.io/projected/ea205a39-cbd1-4704-8e93-0b1747a88e8a-kube-api-access-qbmd8\") pod \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") "
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.349767 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-config\") pod \"9bc33ee8-964b-4b03-b564-5c66068629b9\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") "
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.349787 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea205a39-cbd1-4704-8e93-0b1747a88e8a-config\") pod \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\" (UID: \"ea205a39-cbd1-4704-8e93-0b1747a88e8a\") "
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.349816 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bc33ee8-964b-4b03-b564-5c66068629b9-serving-cert\") pod \"9bc33ee8-964b-4b03-b564-5c66068629b9\" (UID: \"9bc33ee8-964b-4b03-b564-5c66068629b9\") "
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.350711 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea205a39-cbd1-4704-8e93-0b1747a88e8a-client-ca" (OuterVolumeSpecName: "client-ca") pod "ea205a39-cbd1-4704-8e93-0b1747a88e8a" (UID: "ea205a39-cbd1-4704-8e93-0b1747a88e8a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.351188 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea205a39-cbd1-4704-8e93-0b1747a88e8a-config" (OuterVolumeSpecName: "config") pod "ea205a39-cbd1-4704-8e93-0b1747a88e8a" (UID: "ea205a39-cbd1-4704-8e93-0b1747a88e8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.354440 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea205a39-cbd1-4704-8e93-0b1747a88e8a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ea205a39-cbd1-4704-8e93-0b1747a88e8a" (UID: "ea205a39-cbd1-4704-8e93-0b1747a88e8a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.355913 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bc33ee8-964b-4b03-b564-5c66068629b9-kube-api-access-nffd8" (OuterVolumeSpecName: "kube-api-access-nffd8") pod "9bc33ee8-964b-4b03-b564-5c66068629b9" (UID: "9bc33ee8-964b-4b03-b564-5c66068629b9"). InnerVolumeSpecName "kube-api-access-nffd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.355982 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea205a39-cbd1-4704-8e93-0b1747a88e8a-kube-api-access-qbmd8" (OuterVolumeSpecName: "kube-api-access-qbmd8") pod "ea205a39-cbd1-4704-8e93-0b1747a88e8a" (UID: "ea205a39-cbd1-4704-8e93-0b1747a88e8a"). InnerVolumeSpecName "kube-api-access-qbmd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.355998 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-client-ca" (OuterVolumeSpecName: "client-ca") pod "9bc33ee8-964b-4b03-b564-5c66068629b9" (UID: "9bc33ee8-964b-4b03-b564-5c66068629b9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.356026 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-config" (OuterVolumeSpecName: "config") pod "9bc33ee8-964b-4b03-b564-5c66068629b9" (UID: "9bc33ee8-964b-4b03-b564-5c66068629b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.356264 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9bc33ee8-964b-4b03-b564-5c66068629b9" (UID: "9bc33ee8-964b-4b03-b564-5c66068629b9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.356524 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc33ee8-964b-4b03-b564-5c66068629b9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9bc33ee8-964b-4b03-b564-5c66068629b9" (UID: "9bc33ee8-964b-4b03-b564-5c66068629b9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.451663 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea205a39-cbd1-4704-8e93-0b1747a88e8a-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.452099 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bc33ee8-964b-4b03-b564-5c66068629b9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.452113 4723 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.452127 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.452139 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea205a39-cbd1-4704-8e93-0b1747a88e8a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.452152 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nffd8\" (UniqueName: \"kubernetes.io/projected/9bc33ee8-964b-4b03-b564-5c66068629b9-kube-api-access-nffd8\") on node \"crc\" DevicePath \"\""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.452162 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea205a39-cbd1-4704-8e93-0b1747a88e8a-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.452173 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbmd8\" (UniqueName: \"kubernetes.io/projected/ea205a39-cbd1-4704-8e93-0b1747a88e8a-kube-api-access-qbmd8\") on node \"crc\" DevicePath \"\""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.452185 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bc33ee8-964b-4b03-b564-5c66068629b9-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.568129 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s6gh6"]
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.658313 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lsmcg"]
Mar 09 13:01:57 crc kubenswrapper[4723]: W0309 13:01:57.690124 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bc2fb38_5759_4ce6_9c1d_84a6537050e9.slice/crio-cdb375ec401951a0010fbf23f12587fcd32c42dc0ad76f1c2ad0f40c8798b95a WatchSource:0}: Error finding container cdb375ec401951a0010fbf23f12587fcd32c42dc0ad76f1c2ad0f40c8798b95a: Status 404 returned error can't find the container with id cdb375ec401951a0010fbf23f12587fcd32c42dc0ad76f1c2ad0f40c8798b95a
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.711108 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2lhjt"]
Mar 09 13:01:57 crc kubenswrapper[4723]: E0309 13:01:57.711419 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea205a39-cbd1-4704-8e93-0b1747a88e8a" containerName="route-controller-manager"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.711441 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea205a39-cbd1-4704-8e93-0b1747a88e8a" containerName="route-controller-manager"
Mar 09 13:01:57 crc kubenswrapper[4723]: E0309 13:01:57.711452 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc33ee8-964b-4b03-b564-5c66068629b9" containerName="controller-manager"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.711462 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc33ee8-964b-4b03-b564-5c66068629b9" containerName="controller-manager"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.711601 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc33ee8-964b-4b03-b564-5c66068629b9" containerName="controller-manager"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.711627 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea205a39-cbd1-4704-8e93-0b1747a88e8a" containerName="route-controller-manager"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.712927 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lhjt"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.717696 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.721440 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lsmcg" event={"ID":"6bc2fb38-5759-4ce6-9c1d-84a6537050e9","Type":"ContainerStarted","Data":"cdb375ec401951a0010fbf23f12587fcd32c42dc0ad76f1c2ad0f40c8798b95a"}
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.727095 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2lhjt"]
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.727148 4723 generic.go:334] "Generic (PLEG): container finished" podID="9bc33ee8-964b-4b03-b564-5c66068629b9" containerID="a48c65a1c3f512880d9dc44b2eb45307e3d945306c8a613c05ca483aa288e4a8" exitCode=0
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.727163 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" event={"ID":"9bc33ee8-964b-4b03-b564-5c66068629b9","Type":"ContainerDied","Data":"a48c65a1c3f512880d9dc44b2eb45307e3d945306c8a613c05ca483aa288e4a8"}
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.727190 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2" event={"ID":"9bc33ee8-964b-4b03-b564-5c66068629b9","Type":"ContainerDied","Data":"abdf4adce12e0920215d9fc04894f299b9d5f163b9e055b0af05cb53fe2115db"}
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.727207 4723 scope.go:117] "RemoveContainer" containerID="a48c65a1c3f512880d9dc44b2eb45307e3d945306c8a613c05ca483aa288e4a8"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.727298 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vb8n2"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.746077 4723 generic.go:334] "Generic (PLEG): container finished" podID="ea205a39-cbd1-4704-8e93-0b1747a88e8a" containerID="4d8cb376ca0ff69bc914ff7c0b9d81f9346a01857efa9816c23ea2725a28c503" exitCode=0
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.746188 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.746292 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" event={"ID":"ea205a39-cbd1-4704-8e93-0b1747a88e8a","Type":"ContainerDied","Data":"4d8cb376ca0ff69bc914ff7c0b9d81f9346a01857efa9816c23ea2725a28c503"}
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.746331 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn" event={"ID":"ea205a39-cbd1-4704-8e93-0b1747a88e8a","Type":"ContainerDied","Data":"3eedfd4aa15d5a0882827922038af91a0ee51cc4537bcf3b86de9da7fa7b7854"}
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.748853 4723 generic.go:334] "Generic (PLEG): container finished" podID="8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" containerID="0919fe583c9c158b8aef61c78369eca1ffe1b146fd4349657fbe61e5955df06a" exitCode=0
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.748938 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvf4v" event={"ID":"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2","Type":"ContainerDied","Data":"0919fe583c9c158b8aef61c78369eca1ffe1b146fd4349657fbe61e5955df06a"}
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.748968 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvf4v" event={"ID":"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2","Type":"ContainerStarted","Data":"6f1271a1ca8ffe2a3c61f046dc0babc22d8c025eb24407ba32a635aefeaf5fb0"}
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.758249 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5adbe8b6-fabd-4e21-8507-84df16004837-utilities\") pod \"redhat-operators-2lhjt\" (UID: \"5adbe8b6-fabd-4e21-8507-84df16004837\") " pod="openshift-marketplace/redhat-operators-2lhjt"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.758332 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5adbe8b6-fabd-4e21-8507-84df16004837-catalog-content\") pod \"redhat-operators-2lhjt\" (UID: \"5adbe8b6-fabd-4e21-8507-84df16004837\") " pod="openshift-marketplace/redhat-operators-2lhjt"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.758424 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5bxr\" (UniqueName: \"kubernetes.io/projected/5adbe8b6-fabd-4e21-8507-84df16004837-kube-api-access-m5bxr\") pod \"redhat-operators-2lhjt\" (UID: \"5adbe8b6-fabd-4e21-8507-84df16004837\") " pod="openshift-marketplace/redhat-operators-2lhjt"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.764190 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" event={"ID":"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef","Type":"ContainerStarted","Data":"a447c0c706b067fe924b41af78e9a1b9a2863adcee5f0dfe8e3c35e433051e7b"}
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.820776 4723 scope.go:117] "RemoveContainer" containerID="a48c65a1c3f512880d9dc44b2eb45307e3d945306c8a613c05ca483aa288e4a8"
Mar 09 13:01:57 crc kubenswrapper[4723]: E0309 13:01:57.825738 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a48c65a1c3f512880d9dc44b2eb45307e3d945306c8a613c05ca483aa288e4a8\": container with ID starting with a48c65a1c3f512880d9dc44b2eb45307e3d945306c8a613c05ca483aa288e4a8 not found: ID does not exist" containerID="a48c65a1c3f512880d9dc44b2eb45307e3d945306c8a613c05ca483aa288e4a8"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.825788 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a48c65a1c3f512880d9dc44b2eb45307e3d945306c8a613c05ca483aa288e4a8"} err="failed to get container status \"a48c65a1c3f512880d9dc44b2eb45307e3d945306c8a613c05ca483aa288e4a8\": rpc error: code = NotFound desc = could not find container \"a48c65a1c3f512880d9dc44b2eb45307e3d945306c8a613c05ca483aa288e4a8\": container with ID starting with a48c65a1c3f512880d9dc44b2eb45307e3d945306c8a613c05ca483aa288e4a8 not found: ID does not exist"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.825816 4723 scope.go:117] "RemoveContainer" containerID="4d8cb376ca0ff69bc914ff7c0b9d81f9346a01857efa9816c23ea2725a28c503"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.830833 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.862079 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5adbe8b6-fabd-4e21-8507-84df16004837-catalog-content\") pod \"redhat-operators-2lhjt\" (UID: \"5adbe8b6-fabd-4e21-8507-84df16004837\") " pod="openshift-marketplace/redhat-operators-2lhjt"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.862252 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5bxr\" (UniqueName: \"kubernetes.io/projected/5adbe8b6-fabd-4e21-8507-84df16004837-kube-api-access-m5bxr\") pod \"redhat-operators-2lhjt\" (UID: \"5adbe8b6-fabd-4e21-8507-84df16004837\") " pod="openshift-marketplace/redhat-operators-2lhjt"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.862395 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5adbe8b6-fabd-4e21-8507-84df16004837-utilities\") pod \"redhat-operators-2lhjt\" (UID: \"5adbe8b6-fabd-4e21-8507-84df16004837\") " pod="openshift-marketplace/redhat-operators-2lhjt"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.880761 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5adbe8b6-fabd-4e21-8507-84df16004837-catalog-content\") pod \"redhat-operators-2lhjt\" (UID: \"5adbe8b6-fabd-4e21-8507-84df16004837\") " pod="openshift-marketplace/redhat-operators-2lhjt"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.888220 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5adbe8b6-fabd-4e21-8507-84df16004837-utilities\") pod \"redhat-operators-2lhjt\" (UID: \"5adbe8b6-fabd-4e21-8507-84df16004837\") " pod="openshift-marketplace/redhat-operators-2lhjt"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.893868 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn"]
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.894068 4723 scope.go:117] "RemoveContainer" containerID="4d8cb376ca0ff69bc914ff7c0b9d81f9346a01857efa9816c23ea2725a28c503"
Mar 09 13:01:57 crc kubenswrapper[4723]: E0309 13:01:57.903786 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d8cb376ca0ff69bc914ff7c0b9d81f9346a01857efa9816c23ea2725a28c503\": container with ID starting with 4d8cb376ca0ff69bc914ff7c0b9d81f9346a01857efa9816c23ea2725a28c503 not found: ID does not exist" containerID="4d8cb376ca0ff69bc914ff7c0b9d81f9346a01857efa9816c23ea2725a28c503"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.903844 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d8cb376ca0ff69bc914ff7c0b9d81f9346a01857efa9816c23ea2725a28c503"} err="failed to get container status \"4d8cb376ca0ff69bc914ff7c0b9d81f9346a01857efa9816c23ea2725a28c503\": rpc error: code = NotFound desc = could not find container \"4d8cb376ca0ff69bc914ff7c0b9d81f9346a01857efa9816c23ea2725a28c503\": container with ID starting with 4d8cb376ca0ff69bc914ff7c0b9d81f9346a01857efa9816c23ea2725a28c503 not found: ID does not exist"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.907227 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qpzn"]
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.911145 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5bxr\" (UniqueName: \"kubernetes.io/projected/5adbe8b6-fabd-4e21-8507-84df16004837-kube-api-access-m5bxr\") pod \"redhat-operators-2lhjt\" (UID: \"5adbe8b6-fabd-4e21-8507-84df16004837\") " pod="openshift-marketplace/redhat-operators-2lhjt"
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.913416 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vb8n2"]
Mar 09 13:01:57 crc kubenswrapper[4723]: I0309 13:01:57.926186 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vb8n2"]
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.060945 4723 patch_prober.go:28] interesting pod/router-default-5444994796-fzrk5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 13:01:58 crc kubenswrapper[4723]: [-]has-synced failed: reason withheld
Mar 09 13:01:58 crc kubenswrapper[4723]: [+]process-running ok
Mar 09 13:01:58 crc kubenswrapper[4723]: healthz check failed
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.060998 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fzrk5" podUID="f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.062387 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kcvkd"]
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.064072 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kcvkd"
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.074267 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kcvkd"]
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.119252 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8"
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.161511 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lhjt"
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.167176 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15fd52d-a005-4417-aaca-84839023e2b4-secret-volume\") pod \"c15fd52d-a005-4417-aaca-84839023e2b4\" (UID: \"c15fd52d-a005-4417-aaca-84839023e2b4\") "
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.167372 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15fd52d-a005-4417-aaca-84839023e2b4-config-volume\") pod \"c15fd52d-a005-4417-aaca-84839023e2b4\" (UID: \"c15fd52d-a005-4417-aaca-84839023e2b4\") "
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.167462 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgg2m\" (UniqueName: \"kubernetes.io/projected/c15fd52d-a005-4417-aaca-84839023e2b4-kube-api-access-xgg2m\") pod \"c15fd52d-a005-4417-aaca-84839023e2b4\" (UID: \"c15fd52d-a005-4417-aaca-84839023e2b4\") "
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.167654 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d85gj\" (UniqueName: \"kubernetes.io/projected/980fde08-18f8-4e22-93a1-3846f9e367ad-kube-api-access-d85gj\") pod \"redhat-operators-kcvkd\" (UID: \"980fde08-18f8-4e22-93a1-3846f9e367ad\") " pod="openshift-marketplace/redhat-operators-kcvkd"
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.167718 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/980fde08-18f8-4e22-93a1-3846f9e367ad-utilities\") pod \"redhat-operators-kcvkd\" (UID: \"980fde08-18f8-4e22-93a1-3846f9e367ad\") " pod="openshift-marketplace/redhat-operators-kcvkd"
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.167745 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/980fde08-18f8-4e22-93a1-3846f9e367ad-catalog-content\") pod \"redhat-operators-kcvkd\" (UID: \"980fde08-18f8-4e22-93a1-3846f9e367ad\") " pod="openshift-marketplace/redhat-operators-kcvkd"
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.169978 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c15fd52d-a005-4417-aaca-84839023e2b4-config-volume" (OuterVolumeSpecName: "config-volume") pod "c15fd52d-a005-4417-aaca-84839023e2b4" (UID: "c15fd52d-a005-4417-aaca-84839023e2b4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.176058 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c15fd52d-a005-4417-aaca-84839023e2b4-kube-api-access-xgg2m" (OuterVolumeSpecName: "kube-api-access-xgg2m") pod "c15fd52d-a005-4417-aaca-84839023e2b4" (UID: "c15fd52d-a005-4417-aaca-84839023e2b4"). InnerVolumeSpecName "kube-api-access-xgg2m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.176629 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c15fd52d-a005-4417-aaca-84839023e2b4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c15fd52d-a005-4417-aaca-84839023e2b4" (UID: "c15fd52d-a005-4417-aaca-84839023e2b4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.269480 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d85gj\" (UniqueName: \"kubernetes.io/projected/980fde08-18f8-4e22-93a1-3846f9e367ad-kube-api-access-d85gj\") pod \"redhat-operators-kcvkd\" (UID: \"980fde08-18f8-4e22-93a1-3846f9e367ad\") " pod="openshift-marketplace/redhat-operators-kcvkd"
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.269558 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/980fde08-18f8-4e22-93a1-3846f9e367ad-utilities\") pod \"redhat-operators-kcvkd\" (UID: \"980fde08-18f8-4e22-93a1-3846f9e367ad\") " pod="openshift-marketplace/redhat-operators-kcvkd"
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.269577 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/980fde08-18f8-4e22-93a1-3846f9e367ad-catalog-content\") pod \"redhat-operators-kcvkd\" (UID: \"980fde08-18f8-4e22-93a1-3846f9e367ad\") " pod="openshift-marketplace/redhat-operators-kcvkd"
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.269709 4723 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15fd52d-a005-4417-aaca-84839023e2b4-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.269720 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgg2m\" (UniqueName: \"kubernetes.io/projected/c15fd52d-a005-4417-aaca-84839023e2b4-kube-api-access-xgg2m\") on node \"crc\" DevicePath \"\""
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.269731 4723 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15fd52d-a005-4417-aaca-84839023e2b4-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.270206 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/980fde08-18f8-4e22-93a1-3846f9e367ad-catalog-content\") pod \"redhat-operators-kcvkd\" (UID: \"980fde08-18f8-4e22-93a1-3846f9e367ad\") " pod="openshift-marketplace/redhat-operators-kcvkd"
Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.270455 4723 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/980fde08-18f8-4e22-93a1-3846f9e367ad-utilities\") pod \"redhat-operators-kcvkd\" (UID: \"980fde08-18f8-4e22-93a1-3846f9e367ad\") " pod="openshift-marketplace/redhat-operators-kcvkd" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.292058 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d85gj\" (UniqueName: \"kubernetes.io/projected/980fde08-18f8-4e22-93a1-3846f9e367ad-kube-api-access-d85gj\") pod \"redhat-operators-kcvkd\" (UID: \"980fde08-18f8-4e22-93a1-3846f9e367ad\") " pod="openshift-marketplace/redhat-operators-kcvkd" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.397694 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kcvkd" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.422217 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2lhjt"] Mar 09 13:01:58 crc kubenswrapper[4723]: W0309 13:01:58.505931 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5adbe8b6_fabd_4e21_8507_84df16004837.slice/crio-e65e0c5d60346ad68a3042a270ed19b462677a140d0fb88df19a328f6f77e82b WatchSource:0}: Error finding container e65e0c5d60346ad68a3042a270ed19b462677a140d0fb88df19a328f6f77e82b: Status 404 returned error can't find the container with id e65e0c5d60346ad68a3042a270ed19b462677a140d0fb88df19a328f6f77e82b Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.518275 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54c96f5997-tm95d"] Mar 09 13:01:58 crc kubenswrapper[4723]: E0309 13:01:58.518910 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c15fd52d-a005-4417-aaca-84839023e2b4" containerName="collect-profiles" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.519061 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="c15fd52d-a005-4417-aaca-84839023e2b4" containerName="collect-profiles" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.519768 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="c15fd52d-a005-4417-aaca-84839023e2b4" containerName="collect-profiles" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.520553 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.529545 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.529819 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z"] Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.530018 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.530145 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.530321 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.530805 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.531224 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.531674 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.534497 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.534830 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.535107 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.535271 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.535543 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.535761 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.537795 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.538595 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z"] Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.542714 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54c96f5997-tm95d"] Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.573044 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7db77024-0f3c-4752-afa6-2f344ac04325-serving-cert\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.573107 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6159204-045f-40e4-9447-7e04d6e4ef49-client-ca\") pod \"route-controller-manager-6d8d56cfd7-9xf8z\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.573144 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6159204-045f-40e4-9447-7e04d6e4ef49-serving-cert\") pod \"route-controller-manager-6d8d56cfd7-9xf8z\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.573198 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-config\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.573396 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wbbr\" (UniqueName: \"kubernetes.io/projected/7db77024-0f3c-4752-afa6-2f344ac04325-kube-api-access-7wbbr\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.573496 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6159204-045f-40e4-9447-7e04d6e4ef49-config\") pod \"route-controller-manager-6d8d56cfd7-9xf8z\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.573594 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-client-ca\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.573649 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-proxy-ca-bundles\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.573675 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lbzn\" (UniqueName: 
\"kubernetes.io/projected/b6159204-045f-40e4-9447-7e04d6e4ef49-kube-api-access-8lbzn\") pod \"route-controller-manager-6d8d56cfd7-9xf8z\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.674638 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-proxy-ca-bundles\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.674682 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lbzn\" (UniqueName: \"kubernetes.io/projected/b6159204-045f-40e4-9447-7e04d6e4ef49-kube-api-access-8lbzn\") pod \"route-controller-manager-6d8d56cfd7-9xf8z\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.676220 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7db77024-0f3c-4752-afa6-2f344ac04325-serving-cert\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.676276 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6159204-045f-40e4-9447-7e04d6e4ef49-client-ca\") pod \"route-controller-manager-6d8d56cfd7-9xf8z\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.676294 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6159204-045f-40e4-9447-7e04d6e4ef49-serving-cert\") pod \"route-controller-manager-6d8d56cfd7-9xf8z\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.676312 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-config\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.676338 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wbbr\" (UniqueName: \"kubernetes.io/projected/7db77024-0f3c-4752-afa6-2f344ac04325-kube-api-access-7wbbr\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.677093 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6159204-045f-40e4-9447-7e04d6e4ef49-config\") pod 
\"route-controller-manager-6d8d56cfd7-9xf8z\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.677144 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-client-ca\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.677214 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6159204-045f-40e4-9447-7e04d6e4ef49-client-ca\") pod \"route-controller-manager-6d8d56cfd7-9xf8z\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.677675 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-config\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.678041 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6159204-045f-40e4-9447-7e04d6e4ef49-config\") pod \"route-controller-manager-6d8d56cfd7-9xf8z\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.679209 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-proxy-ca-bundles\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.683527 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6159204-045f-40e4-9447-7e04d6e4ef49-serving-cert\") pod \"route-controller-manager-6d8d56cfd7-9xf8z\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.684409 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7db77024-0f3c-4752-afa6-2f344ac04325-serving-cert\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.685776 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-client-ca\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.693248 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wbbr\" (UniqueName: \"kubernetes.io/projected/7db77024-0f3c-4752-afa6-2f344ac04325-kube-api-access-7wbbr\") pod \"controller-manager-54c96f5997-tm95d\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.699431 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lbzn\" (UniqueName: \"kubernetes.io/projected/b6159204-045f-40e4-9447-7e04d6e4ef49-kube-api-access-8lbzn\") pod \"route-controller-manager-6d8d56cfd7-9xf8z\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.716818 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.717818 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.720429 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.720681 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.741983 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.779248 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53b85866-5b33-4bf8-ab1a-dbbb761ac06f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"53b85866-5b33-4bf8-ab1a-dbbb761ac06f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.780114 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53b85866-5b33-4bf8-ab1a-dbbb761ac06f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"53b85866-5b33-4bf8-ab1a-dbbb761ac06f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.799485 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kcvkd"] Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.805006 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"90fa396b-9473-4da4-af0c-c08fd8f78e3a","Type":"ContainerStarted","Data":"2bb170badf2e1d419a00ca87a626a8e76c166c3b2f2ec1d358f13f838a82dccd"} Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.805061 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"90fa396b-9473-4da4-af0c-c08fd8f78e3a","Type":"ContainerStarted","Data":"616b36c8bca13ebbfe0e466113e4296c748c8dbb89d7ba07ea85285e0d4c362e"} Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.807840 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.807889 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.811882 4723 patch_prober.go:28] interesting pod/console-f9d7485db-6gtjl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.811983 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-6gtjl" podUID="6775c6a2-49ba-48fb-9f8f-ff26a7155618" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.825599 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" event={"ID":"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef","Type":"ContainerStarted","Data":"6997df8f497c4ff917189e52e29cc2842ade8e6e2ca3ac5caf201f9cffa6296d"} Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.825683 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.825999 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.825973775 podStartE2EDuration="2.825973775s" podCreationTimestamp="2026-03-09 13:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:58.819295645 +0000 UTC m=+192.833763185" watchObservedRunningTime="2026-03-09 13:01:58.825973775 +0000 UTC m=+192.840441305" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.833533 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" event={"ID":"c15fd52d-a005-4417-aaca-84839023e2b4","Type":"ContainerDied","Data":"3d0d1dfcb12382288c5a2b73e36848d29c08f3f0bf6edbd95b93903ffcac1363"} Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.833592 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d0d1dfcb12382288c5a2b73e36848d29c08f3f0bf6edbd95b93903ffcac1363" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.833667 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.851644 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" podStartSLOduration=122.85161959 podStartE2EDuration="2m2.85161959s" podCreationTimestamp="2026-03-09 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:01:58.844625582 +0000 UTC m=+192.859093142" watchObservedRunningTime="2026-03-09 13:01:58.85161959 +0000 UTC m=+192.866087130" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.852266 4723 generic.go:334] "Generic (PLEG): container finished" podID="6bc2fb38-5759-4ce6-9c1d-84a6537050e9" containerID="644ceeb0c8726b6e543836ebcd44d1c5d7cdcf25069fbfd1fe05bc0f3340d617" exitCode=0 Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.853258 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lsmcg" event={"ID":"6bc2fb38-5759-4ce6-9c1d-84a6537050e9","Type":"ContainerDied","Data":"644ceeb0c8726b6e543836ebcd44d1c5d7cdcf25069fbfd1fe05bc0f3340d617"} Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.860471 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.868822 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.906774 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53b85866-5b33-4bf8-ab1a-dbbb761ac06f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"53b85866-5b33-4bf8-ab1a-dbbb761ac06f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.906827 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53b85866-5b33-4bf8-ab1a-dbbb761ac06f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"53b85866-5b33-4bf8-ab1a-dbbb761ac06f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.909817 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53b85866-5b33-4bf8-ab1a-dbbb761ac06f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"53b85866-5b33-4bf8-ab1a-dbbb761ac06f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.930408 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53b85866-5b33-4bf8-ab1a-dbbb761ac06f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"53b85866-5b33-4bf8-ab1a-dbbb761ac06f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.937076 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 09 13:01:58 crc 
kubenswrapper[4723]: I0309 13:01:58.937785 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bc33ee8-964b-4b03-b564-5c66068629b9" path="/var/lib/kubelet/pods/9bc33ee8-964b-4b03-b564-5c66068629b9/volumes" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.944604 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea205a39-cbd1-4704-8e93-0b1747a88e8a" path="/var/lib/kubelet/pods/ea205a39-cbd1-4704-8e93-0b1747a88e8a/volumes" Mar 09 13:01:58 crc kubenswrapper[4723]: I0309 13:01:58.947918 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lhjt" event={"ID":"5adbe8b6-fabd-4e21-8507-84df16004837","Type":"ContainerStarted","Data":"e65e0c5d60346ad68a3042a270ed19b462677a140d0fb88df19a328f6f77e82b"} Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.046650 4723 patch_prober.go:28] interesting pod/router-default-5444994796-fzrk5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:01:59 crc kubenswrapper[4723]: [-]has-synced failed: reason withheld Mar 09 13:01:59 crc kubenswrapper[4723]: [+]process-running ok Mar 09 13:01:59 crc kubenswrapper[4723]: healthz check failed Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.046783 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fzrk5" podUID="f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.062236 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.231783 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54c96f5997-tm95d"] Mar 09 13:01:59 crc kubenswrapper[4723]: W0309 13:01:59.248671 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7db77024_0f3c_4752_afa6_2f344ac04325.slice/crio-b5012704aec4670c80a725267058ea981ad6812033d103db02dc87a22f4174af WatchSource:0}: Error finding container b5012704aec4670c80a725267058ea981ad6812033d103db02dc87a22f4174af: Status 404 returned error can't find the container with id b5012704aec4670c80a725267058ea981ad6812033d103db02dc87a22f4174af Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.288779 4723 patch_prober.go:28] interesting pod/downloads-7954f5f757-b5c74 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.288837 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b5c74" podUID="e21fc837-8de2-4af5-a375-b14567f47d67" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.288930 4723 patch_prober.go:28] interesting pod/downloads-7954f5f757-b5c74 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: 
connection refused" start-of-body= Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.288985 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-b5c74" podUID="e21fc837-8de2-4af5-a375-b14567f47d67" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.294431 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z"] Mar 09 13:01:59 crc kubenswrapper[4723]: W0309 13:01:59.323397 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6159204_045f_40e4_9447_7e04d6e4ef49.slice/crio-f69b20d80557b436fd37e05c8b48aa89207ee2b1c6949e1c0245f4309ad06fa4 WatchSource:0}: Error finding container f69b20d80557b436fd37e05c8b48aa89207ee2b1c6949e1c0245f4309ad06fa4: Status 404 returned error can't find the container with id f69b20d80557b436fd37e05c8b48aa89207ee2b1c6949e1c0245f4309ad06fa4 Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.330774 4723 ???:1] "http: TLS handshake error from 192.168.126.11:48840: no serving certificate available for the kubelet" Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.376569 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.376757 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.383406 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xzd59" Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.435031 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.441744 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7zkls" Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.672105 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.966772 4723 generic.go:334] "Generic (PLEG): container finished" podID="90fa396b-9473-4da4-af0c-c08fd8f78e3a" containerID="2bb170badf2e1d419a00ca87a626a8e76c166c3b2f2ec1d358f13f838a82dccd" exitCode=0 Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.967416 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"90fa396b-9473-4da4-af0c-c08fd8f78e3a","Type":"ContainerDied","Data":"2bb170badf2e1d419a00ca87a626a8e76c166c3b2f2ec1d358f13f838a82dccd"} Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.973400 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" event={"ID":"7db77024-0f3c-4752-afa6-2f344ac04325","Type":"ContainerStarted","Data":"72ebdb37f1babfda133fa762ef21355d82a5c0d86c2d842d844e547eaa9f4914"} Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.974189 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" event={"ID":"7db77024-0f3c-4752-afa6-2f344ac04325","Type":"ContainerStarted","Data":"b5012704aec4670c80a725267058ea981ad6812033d103db02dc87a22f4174af"} Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.975087 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:01:59 crc kubenswrapper[4723]: I0309 13:01:59.991891 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"53b85866-5b33-4bf8-ab1a-dbbb761ac06f","Type":"ContainerStarted","Data":"c71b34d8ab32669513fbeeb61fe4be60fb2f39ed5a67b9a6ec7595a73588f468"} Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.002524 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.009546 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" podStartSLOduration=4.009535494 podStartE2EDuration="4.009535494s" podCreationTimestamp="2026-03-09 13:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:02:00.009348589 +0000 UTC m=+194.023816149" watchObservedRunningTime="2026-03-09 13:02:00.009535494 +0000 UTC m=+194.024003034" Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.021199 4723 generic.go:334] "Generic (PLEG): container finished" podID="5adbe8b6-fabd-4e21-8507-84df16004837" containerID="aad7f6d12e19f3f187c33be6859a22310f784401b9f27bf5cf65a192902f32cb" exitCode=0 Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.021231 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lhjt" event={"ID":"5adbe8b6-fabd-4e21-8507-84df16004837","Type":"ContainerDied","Data":"aad7f6d12e19f3f187c33be6859a22310f784401b9f27bf5cf65a192902f32cb"} Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.045628 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.047939 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" event={"ID":"b6159204-045f-40e4-9447-7e04d6e4ef49","Type":"ContainerStarted","Data":"ecf1da64b0b9bc4174a17a47210349df8556dac74e0181a769c736875e007f46"} Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.047985 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" event={"ID":"b6159204-045f-40e4-9447-7e04d6e4ef49","Type":"ContainerStarted","Data":"f69b20d80557b436fd37e05c8b48aa89207ee2b1c6949e1c0245f4309ad06fa4"} Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.049122 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.051912 4723 patch_prober.go:28] interesting pod/router-default-5444994796-fzrk5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
Mar 09 13:02:00 crc kubenswrapper[4723]: [-]has-synced failed: reason withheld
Mar 09 13:02:00 crc kubenswrapper[4723]: [+]process-running ok
Mar 09 13:02:00 crc kubenswrapper[4723]: healthz check failed
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.051955 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fzrk5" podUID="f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.061756 4723 generic.go:334] "Generic (PLEG): container finished" podID="980fde08-18f8-4e22-93a1-3846f9e367ad" containerID="4dfd81283017ab558113998e30e2d1364cdca40a7573010d280a40087dec8f95" exitCode=0
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.061981 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcvkd" event={"ID":"980fde08-18f8-4e22-93a1-3846f9e367ad","Type":"ContainerDied","Data":"4dfd81283017ab558113998e30e2d1364cdca40a7573010d280a40087dec8f95"}
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.062066 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcvkd" event={"ID":"980fde08-18f8-4e22-93a1-3846f9e367ad","Type":"ContainerStarted","Data":"00770bf6d02ef6c880f6f04c9309af835537d0a14e71af65f34e605e92979447"}
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.071091 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" podStartSLOduration=4.071072395 podStartE2EDuration="4.071072395s" podCreationTimestamp="2026-03-09 13:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:02:00.068762026 +0000 UTC m=+194.083229566" watchObservedRunningTime="2026-03-09 13:02:00.071072395 +0000 UTC m=+194.085539925"
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.078099 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xzd59"
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.160377 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551022-xgzbd"]
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.161186 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551022-xgzbd"
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.164355 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551022-xgzbd"]
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.166441 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.166741 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.166933 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.238231 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69n7q\" (UniqueName: \"kubernetes.io/projected/b2586638-1604-4545-8203-6b89e38129e6-kube-api-access-69n7q\") pod \"auto-csr-approver-29551022-xgzbd\" (UID: \"b2586638-1604-4545-8203-6b89e38129e6\") " pod="openshift-infra/auto-csr-approver-29551022-xgzbd"
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.270439 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z"
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.341762 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69n7q\" (UniqueName: \"kubernetes.io/projected/b2586638-1604-4545-8203-6b89e38129e6-kube-api-access-69n7q\") pod \"auto-csr-approver-29551022-xgzbd\" (UID: \"b2586638-1604-4545-8203-6b89e38129e6\") " pod="openshift-infra/auto-csr-approver-29551022-xgzbd"
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.389922 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69n7q\" (UniqueName: \"kubernetes.io/projected/b2586638-1604-4545-8203-6b89e38129e6-kube-api-access-69n7q\") pod \"auto-csr-approver-29551022-xgzbd\" (UID: \"b2586638-1604-4545-8203-6b89e38129e6\") " pod="openshift-infra/auto-csr-approver-29551022-xgzbd"
Mar 09 13:02:00 crc kubenswrapper[4723]: I0309 13:02:00.529241 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551022-xgzbd"
Mar 09 13:02:01 crc kubenswrapper[4723]: I0309 13:02:01.043665 4723 patch_prober.go:28] interesting pod/router-default-5444994796-fzrk5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 13:02:01 crc kubenswrapper[4723]: [-]has-synced failed: reason withheld
Mar 09 13:02:01 crc kubenswrapper[4723]: [+]process-running ok
Mar 09 13:02:01 crc kubenswrapper[4723]: healthz check failed
Mar 09 13:02:01 crc kubenswrapper[4723]: I0309 13:02:01.043925 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fzrk5" podUID="f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:02:01 crc kubenswrapper[4723]: I0309 13:02:01.073162 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"53b85866-5b33-4bf8-ab1a-dbbb761ac06f","Type":"ContainerStarted","Data":"acec13287fd1981d74e41d8b3b6cddd872149015976949e3ab99f34e9c05511d"}
Mar 09 13:02:01 crc kubenswrapper[4723]: I0309 13:02:01.090655 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.090636646 podStartE2EDuration="3.090636646s" podCreationTimestamp="2026-03-09 13:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:02:01.086508301 +0000 UTC m=+195.100975841" watchObservedRunningTime="2026-03-09 13:02:01.090636646 +0000 UTC m=+195.105104176"
Mar 09 13:02:01 crc kubenswrapper[4723]: I0309 13:02:01.113730 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551022-xgzbd"]
Mar 09 13:02:01 crc kubenswrapper[4723]: W0309 13:02:01.129094 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2586638_1604_4545_8203_6b89e38129e6.slice/crio-dac153412c58f3ebd4f1bab6f7ba3b05b7ac9e158db88214adad14ace49dd292 WatchSource:0}: Error finding container dac153412c58f3ebd4f1bab6f7ba3b05b7ac9e158db88214adad14ace49dd292: Status 404 returned error can't find the container with id dac153412c58f3ebd4f1bab6f7ba3b05b7ac9e158db88214adad14ace49dd292
Mar 09 13:02:01 crc kubenswrapper[4723]: I0309 13:02:01.342084 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:02:01 crc kubenswrapper[4723]: I0309 13:02:01.359767 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90fa396b-9473-4da4-af0c-c08fd8f78e3a-kube-api-access\") pod \"90fa396b-9473-4da4-af0c-c08fd8f78e3a\" (UID: \"90fa396b-9473-4da4-af0c-c08fd8f78e3a\") " Mar 09 13:02:01 crc kubenswrapper[4723]: I0309 13:02:01.359924 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90fa396b-9473-4da4-af0c-c08fd8f78e3a-kubelet-dir\") pod \"90fa396b-9473-4da4-af0c-c08fd8f78e3a\" (UID: \"90fa396b-9473-4da4-af0c-c08fd8f78e3a\") " Mar 09 13:02:01 crc kubenswrapper[4723]: I0309 13:02:01.360212 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90fa396b-9473-4da4-af0c-c08fd8f78e3a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "90fa396b-9473-4da4-af0c-c08fd8f78e3a" (UID: "90fa396b-9473-4da4-af0c-c08fd8f78e3a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:02:01 crc kubenswrapper[4723]: I0309 13:02:01.365693 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90fa396b-9473-4da4-af0c-c08fd8f78e3a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "90fa396b-9473-4da4-af0c-c08fd8f78e3a" (UID: "90fa396b-9473-4da4-af0c-c08fd8f78e3a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:02:01 crc kubenswrapper[4723]: I0309 13:02:01.460705 4723 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90fa396b-9473-4da4-af0c-c08fd8f78e3a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:01 crc kubenswrapper[4723]: I0309 13:02:01.460739 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90fa396b-9473-4da4-af0c-c08fd8f78e3a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:02 crc kubenswrapper[4723]: I0309 13:02:02.044282 4723 patch_prober.go:28] interesting pod/router-default-5444994796-fzrk5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:02:02 crc kubenswrapper[4723]: [-]has-synced failed: reason withheld Mar 09 13:02:02 crc kubenswrapper[4723]: [+]process-running ok Mar 09 13:02:02 crc kubenswrapper[4723]: healthz check failed Mar 09 13:02:02 crc kubenswrapper[4723]: I0309 13:02:02.044594 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fzrk5" podUID="f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:02:02 crc kubenswrapper[4723]: I0309 13:02:02.096288 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:02:02 crc kubenswrapper[4723]: I0309 13:02:02.096303 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"90fa396b-9473-4da4-af0c-c08fd8f78e3a","Type":"ContainerDied","Data":"616b36c8bca13ebbfe0e466113e4296c748c8dbb89d7ba07ea85285e0d4c362e"} Mar 09 13:02:02 crc kubenswrapper[4723]: I0309 13:02:02.096376 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="616b36c8bca13ebbfe0e466113e4296c748c8dbb89d7ba07ea85285e0d4c362e" Mar 09 13:02:02 crc kubenswrapper[4723]: I0309 13:02:02.116023 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551022-xgzbd" event={"ID":"b2586638-1604-4545-8203-6b89e38129e6","Type":"ContainerStarted","Data":"dac153412c58f3ebd4f1bab6f7ba3b05b7ac9e158db88214adad14ace49dd292"} Mar 09 13:02:02 crc kubenswrapper[4723]: I0309 13:02:02.171944 4723 generic.go:334] "Generic (PLEG): container finished" podID="53b85866-5b33-4bf8-ab1a-dbbb761ac06f" containerID="acec13287fd1981d74e41d8b3b6cddd872149015976949e3ab99f34e9c05511d" exitCode=0 Mar 09 13:02:02 crc kubenswrapper[4723]: I0309 13:02:02.172036 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"53b85866-5b33-4bf8-ab1a-dbbb761ac06f","Type":"ContainerDied","Data":"acec13287fd1981d74e41d8b3b6cddd872149015976949e3ab99f34e9c05511d"} Mar 09 13:02:02 crc kubenswrapper[4723]: I0309 13:02:02.219398 4723 ???:1] "http: TLS handshake error from 192.168.126.11:48852: no serving certificate available for the kubelet" Mar 09 13:02:03 crc kubenswrapper[4723]: I0309 13:02:03.043725 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:02:03 crc kubenswrapper[4723]: I0309 13:02:03.054522 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-fzrk5" Mar 09 13:02:03 crc kubenswrapper[4723]: I0309 13:02:03.550296 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:02:03 crc kubenswrapper[4723]: I0309 13:02:03.596532 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53b85866-5b33-4bf8-ab1a-dbbb761ac06f-kube-api-access\") pod \"53b85866-5b33-4bf8-ab1a-dbbb761ac06f\" (UID: \"53b85866-5b33-4bf8-ab1a-dbbb761ac06f\") " Mar 09 13:02:03 crc kubenswrapper[4723]: I0309 13:02:03.596719 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53b85866-5b33-4bf8-ab1a-dbbb761ac06f-kubelet-dir\") pod \"53b85866-5b33-4bf8-ab1a-dbbb761ac06f\" (UID: \"53b85866-5b33-4bf8-ab1a-dbbb761ac06f\") " Mar 09 13:02:03 crc kubenswrapper[4723]: I0309 13:02:03.597071 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53b85866-5b33-4bf8-ab1a-dbbb761ac06f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "53b85866-5b33-4bf8-ab1a-dbbb761ac06f" (UID: "53b85866-5b33-4bf8-ab1a-dbbb761ac06f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:02:03 crc kubenswrapper[4723]: I0309 13:02:03.606772 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53b85866-5b33-4bf8-ab1a-dbbb761ac06f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "53b85866-5b33-4bf8-ab1a-dbbb761ac06f" (UID: "53b85866-5b33-4bf8-ab1a-dbbb761ac06f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:02:03 crc kubenswrapper[4723]: I0309 13:02:03.699187 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53b85866-5b33-4bf8-ab1a-dbbb761ac06f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:03 crc kubenswrapper[4723]: I0309 13:02:03.699642 4723 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53b85866-5b33-4bf8-ab1a-dbbb761ac06f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:04 crc kubenswrapper[4723]: I0309 13:02:04.203962 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"53b85866-5b33-4bf8-ab1a-dbbb761ac06f","Type":"ContainerDied","Data":"c71b34d8ab32669513fbeeb61fe4be60fb2f39ed5a67b9a6ec7595a73588f468"} Mar 09 13:02:04 crc kubenswrapper[4723]: I0309 13:02:04.204116 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c71b34d8ab32669513fbeeb61fe4be60fb2f39ed5a67b9a6ec7595a73588f468" Mar 09 13:02:04 crc kubenswrapper[4723]: I0309 13:02:04.204185 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:02:04 crc kubenswrapper[4723]: I0309 13:02:04.476848 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:02:04 crc kubenswrapper[4723]: I0309 13:02:04.479920 4723 ???:1] "http: TLS handshake error from 192.168.126.11:48864: no serving certificate available for the kubelet" Mar 09 13:02:05 crc kubenswrapper[4723]: I0309 13:02:05.208307 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4hnlr" Mar 09 13:02:08 crc kubenswrapper[4723]: I0309 13:02:08.820459 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:02:08 crc kubenswrapper[4723]: I0309 13:02:08.826932 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:02:09 crc kubenswrapper[4723]: I0309 13:02:09.300765 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-b5c74" Mar 09 13:02:16 crc kubenswrapper[4723]: I0309 13:02:16.407166 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54c96f5997-tm95d"] Mar 09 13:02:16 crc kubenswrapper[4723]: I0309 13:02:16.408079 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" podUID="7db77024-0f3c-4752-afa6-2f344ac04325" containerName="controller-manager" containerID="cri-o://72ebdb37f1babfda133fa762ef21355d82a5c0d86c2d842d844e547eaa9f4914" gracePeriod=30 Mar 09 13:02:16 crc kubenswrapper[4723]: I0309 13:02:16.427666 4723 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z"] Mar 09 13:02:16 crc kubenswrapper[4723]: I0309 13:02:16.427997 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" podUID="b6159204-045f-40e4-9447-7e04d6e4ef49" containerName="route-controller-manager" containerID="cri-o://ecf1da64b0b9bc4174a17a47210349df8556dac74e0181a769c736875e007f46" gracePeriod=30 Mar 09 13:02:17 crc kubenswrapper[4723]: I0309 13:02:17.213881 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:02:17 crc kubenswrapper[4723]: I0309 13:02:17.304728 4723 generic.go:334] "Generic (PLEG): container finished" podID="b6159204-045f-40e4-9447-7e04d6e4ef49" containerID="ecf1da64b0b9bc4174a17a47210349df8556dac74e0181a769c736875e007f46" exitCode=0 Mar 09 13:02:17 crc kubenswrapper[4723]: I0309 13:02:17.304795 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" event={"ID":"b6159204-045f-40e4-9447-7e04d6e4ef49","Type":"ContainerDied","Data":"ecf1da64b0b9bc4174a17a47210349df8556dac74e0181a769c736875e007f46"} Mar 09 13:02:17 crc kubenswrapper[4723]: I0309 13:02:17.306732 4723 generic.go:334] "Generic (PLEG): container finished" podID="7db77024-0f3c-4752-afa6-2f344ac04325" containerID="72ebdb37f1babfda133fa762ef21355d82a5c0d86c2d842d844e547eaa9f4914" exitCode=0 Mar 09 13:02:17 crc kubenswrapper[4723]: I0309 13:02:17.306759 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" event={"ID":"7db77024-0f3c-4752-afa6-2f344ac04325","Type":"ContainerDied","Data":"72ebdb37f1babfda133fa762ef21355d82a5c0d86c2d842d844e547eaa9f4914"} Mar 09 13:02:18 crc kubenswrapper[4723]: I0309 13:02:18.862209 4723 patch_prober.go:28] interesting pod/controller-manager-54c96f5997-tm95d container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 09 13:02:18 crc kubenswrapper[4723]: I0309 13:02:18.862509 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" podUID="7db77024-0f3c-4752-afa6-2f344ac04325" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 09 13:02:18 crc kubenswrapper[4723]: I0309 13:02:18.871282 4723 patch_prober.go:28] interesting pod/route-controller-manager-6d8d56cfd7-9xf8z container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" start-of-body= Mar 09 13:02:18 crc kubenswrapper[4723]: I0309 13:02:18.871317 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" podUID="b6159204-045f-40e4-9447-7e04d6e4ef49" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.315370 4723 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.320240 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.355836 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5"] Mar 09 13:02:22 crc kubenswrapper[4723]: E0309 13:02:22.356199 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90fa396b-9473-4da4-af0c-c08fd8f78e3a" containerName="pruner" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.356224 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fa396b-9473-4da4-af0c-c08fd8f78e3a" containerName="pruner" Mar 09 13:02:22 crc kubenswrapper[4723]: E0309 13:02:22.356250 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db77024-0f3c-4752-afa6-2f344ac04325" containerName="controller-manager" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.356261 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db77024-0f3c-4752-afa6-2f344ac04325" containerName="controller-manager" Mar 09 13:02:22 crc kubenswrapper[4723]: E0309 13:02:22.356273 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b85866-5b33-4bf8-ab1a-dbbb761ac06f" containerName="pruner" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.356282 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b85866-5b33-4bf8-ab1a-dbbb761ac06f" containerName="pruner" Mar 09 13:02:22 crc kubenswrapper[4723]: E0309 13:02:22.356293 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6159204-045f-40e4-9447-7e04d6e4ef49" containerName="route-controller-manager" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.356302 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6159204-045f-40e4-9447-7e04d6e4ef49" containerName="route-controller-manager" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.356436 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="90fa396b-9473-4da4-af0c-c08fd8f78e3a" containerName="pruner" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.356451 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="53b85866-5b33-4bf8-ab1a-dbbb761ac06f" containerName="pruner" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.356463 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db77024-0f3c-4752-afa6-2f344ac04325" containerName="controller-manager" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.356477 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6159204-045f-40e4-9447-7e04d6e4ef49" containerName="route-controller-manager" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.356984 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.363046 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5"] Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.366030 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" event={"ID":"7db77024-0f3c-4752-afa6-2f344ac04325","Type":"ContainerDied","Data":"b5012704aec4670c80a725267058ea981ad6812033d103db02dc87a22f4174af"} Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.366077 4723 scope.go:117] "RemoveContainer" containerID="72ebdb37f1babfda133fa762ef21355d82a5c0d86c2d842d844e547eaa9f4914" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.366761 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54c96f5997-tm95d" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.372704 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" event={"ID":"b6159204-045f-40e4-9447-7e04d6e4ef49","Type":"ContainerDied","Data":"f69b20d80557b436fd37e05c8b48aa89207ee2b1c6949e1c0245f4309ad06fa4"} Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.372791 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.429564 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-proxy-ca-bundles\") pod \"7db77024-0f3c-4752-afa6-2f344ac04325\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.429622 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6159204-045f-40e4-9447-7e04d6e4ef49-serving-cert\") pod \"b6159204-045f-40e4-9447-7e04d6e4ef49\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.429666 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lbzn\" (UniqueName: \"kubernetes.io/projected/b6159204-045f-40e4-9447-7e04d6e4ef49-kube-api-access-8lbzn\") pod \"b6159204-045f-40e4-9447-7e04d6e4ef49\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.429698 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7db77024-0f3c-4752-afa6-2f344ac04325-serving-cert\") pod \"7db77024-0f3c-4752-afa6-2f344ac04325\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.429717 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6159204-045f-40e4-9447-7e04d6e4ef49-client-ca\") pod \"b6159204-045f-40e4-9447-7e04d6e4ef49\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.429750 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-config\") pod \"7db77024-0f3c-4752-afa6-2f344ac04325\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.429766 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6159204-045f-40e4-9447-7e04d6e4ef49-config\") pod \"b6159204-045f-40e4-9447-7e04d6e4ef49\" (UID: \"b6159204-045f-40e4-9447-7e04d6e4ef49\") " Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.429801 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wbbr\" (UniqueName: \"kubernetes.io/projected/7db77024-0f3c-4752-afa6-2f344ac04325-kube-api-access-7wbbr\") pod \"7db77024-0f3c-4752-afa6-2f344ac04325\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.429837 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-client-ca\") pod \"7db77024-0f3c-4752-afa6-2f344ac04325\" (UID: \"7db77024-0f3c-4752-afa6-2f344ac04325\") " Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.431319 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7db77024-0f3c-4752-afa6-2f344ac04325" (UID: "7db77024-0f3c-4752-afa6-2f344ac04325"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.431383 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-config" (OuterVolumeSpecName: "config") pod "7db77024-0f3c-4752-afa6-2f344ac04325" (UID: "7db77024-0f3c-4752-afa6-2f344ac04325"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.431553 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6159204-045f-40e4-9447-7e04d6e4ef49-client-ca" (OuterVolumeSpecName: "client-ca") pod "b6159204-045f-40e4-9447-7e04d6e4ef49" (UID: "b6159204-045f-40e4-9447-7e04d6e4ef49"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.431568 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-client-ca" (OuterVolumeSpecName: "client-ca") pod "7db77024-0f3c-4752-afa6-2f344ac04325" (UID: "7db77024-0f3c-4752-afa6-2f344ac04325"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.431696 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6159204-045f-40e4-9447-7e04d6e4ef49-config" (OuterVolumeSpecName: "config") pod "b6159204-045f-40e4-9447-7e04d6e4ef49" (UID: "b6159204-045f-40e4-9447-7e04d6e4ef49"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.436039 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6159204-045f-40e4-9447-7e04d6e4ef49-kube-api-access-8lbzn" (OuterVolumeSpecName: "kube-api-access-8lbzn") pod "b6159204-045f-40e4-9447-7e04d6e4ef49" (UID: "b6159204-045f-40e4-9447-7e04d6e4ef49"). InnerVolumeSpecName "kube-api-access-8lbzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.440686 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db77024-0f3c-4752-afa6-2f344ac04325-kube-api-access-7wbbr" (OuterVolumeSpecName: "kube-api-access-7wbbr") pod "7db77024-0f3c-4752-afa6-2f344ac04325" (UID: "7db77024-0f3c-4752-afa6-2f344ac04325"). InnerVolumeSpecName "kube-api-access-7wbbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.441471 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db77024-0f3c-4752-afa6-2f344ac04325-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7db77024-0f3c-4752-afa6-2f344ac04325" (UID: "7db77024-0f3c-4752-afa6-2f344ac04325"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.442023 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6159204-045f-40e4-9447-7e04d6e4ef49-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b6159204-045f-40e4-9447-7e04d6e4ef49" (UID: "b6159204-045f-40e4-9447-7e04d6e4ef49"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.531652 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9141cfe-9794-4248-8631-e9424aa7dbdf-serving-cert\") pod \"route-controller-manager-765f544df7-rlqv5\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") " pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.531729 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrvh8\" (UniqueName: \"kubernetes.io/projected/e9141cfe-9794-4248-8631-e9424aa7dbdf-kube-api-access-qrvh8\") pod \"route-controller-manager-765f544df7-rlqv5\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") " pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.531810 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9141cfe-9794-4248-8631-e9424aa7dbdf-config\") pod \"route-controller-manager-765f544df7-rlqv5\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") " pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.531846 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9141cfe-9794-4248-8631-e9424aa7dbdf-client-ca\") pod \"route-controller-manager-765f544df7-rlqv5\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") " pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.531939 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7db77024-0f3c-4752-afa6-2f344ac04325-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.531952 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6159204-045f-40e4-9447-7e04d6e4ef49-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.531961 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.531970 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6159204-045f-40e4-9447-7e04d6e4ef49-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.531979 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wbbr\" (UniqueName: \"kubernetes.io/projected/7db77024-0f3c-4752-afa6-2f344ac04325-kube-api-access-7wbbr\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.531987 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.531995 4723 reconciler_common.go:293] "Volume detached for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7db77024-0f3c-4752-afa6-2f344ac04325-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.532006 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6159204-045f-40e4-9447-7e04d6e4ef49-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.532014 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lbzn\" (UniqueName: \"kubernetes.io/projected/b6159204-045f-40e4-9447-7e04d6e4ef49-kube-api-access-8lbzn\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.634042 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrvh8\" (UniqueName: \"kubernetes.io/projected/e9141cfe-9794-4248-8631-e9424aa7dbdf-kube-api-access-qrvh8\") pod \"route-controller-manager-765f544df7-rlqv5\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") " pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.634166 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9141cfe-9794-4248-8631-e9424aa7dbdf-config\") pod \"route-controller-manager-765f544df7-rlqv5\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") " pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.634301 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9141cfe-9794-4248-8631-e9424aa7dbdf-client-ca\") pod \"route-controller-manager-765f544df7-rlqv5\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") " pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.634452 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9141cfe-9794-4248-8631-e9424aa7dbdf-serving-cert\") pod \"route-controller-manager-765f544df7-rlqv5\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") " pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.635309 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9141cfe-9794-4248-8631-e9424aa7dbdf-client-ca\") pod \"route-controller-manager-765f544df7-rlqv5\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") " pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.635586 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9141cfe-9794-4248-8631-e9424aa7dbdf-config\") pod \"route-controller-manager-765f544df7-rlqv5\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") " pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.640278 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9141cfe-9794-4248-8631-e9424aa7dbdf-serving-cert\") pod 
\"route-controller-manager-765f544df7-rlqv5\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") " pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.656206 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrvh8\" (UniqueName: \"kubernetes.io/projected/e9141cfe-9794-4248-8631-e9424aa7dbdf-kube-api-access-qrvh8\") pod \"route-controller-manager-765f544df7-rlqv5\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") " pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.676417 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.701356 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z"] Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.704395 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8d56cfd7-9xf8z"] Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.713068 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54c96f5997-tm95d"] Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.716204 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54c96f5997-tm95d"] Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.888261 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7db77024-0f3c-4752-afa6-2f344ac04325" path="/var/lib/kubelet/pods/7db77024-0f3c-4752-afa6-2f344ac04325/volumes" Mar 09 13:02:22 crc kubenswrapper[4723]: I0309 13:02:22.889012 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6159204-045f-40e4-9447-7e04d6e4ef49" path="/var/lib/kubelet/pods/b6159204-045f-40e4-9447-7e04d6e4ef49/volumes" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.537185 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7d9f75f444-qxsqq"] Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.538359 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.540320 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.540480 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.540606 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.540879 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.541042 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.542029 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.549548 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.554438 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d9f75f444-qxsqq"] Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.662925 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-proxy-ca-bundles\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.662976 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14629369-42de-4bea-936a-d78ef91b8514-serving-cert\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.663003 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-config\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.663068 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-client-ca\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.663182 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9drdm\" (UniqueName: 
\"kubernetes.io/projected/14629369-42de-4bea-936a-d78ef91b8514-kube-api-access-9drdm\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.764011 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-proxy-ca-bundles\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.764063 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14629369-42de-4bea-936a-d78ef91b8514-serving-cert\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.764098 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-config\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.764147 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-client-ca\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.764248 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9drdm\" (UniqueName: \"kubernetes.io/projected/14629369-42de-4bea-936a-d78ef91b8514-kube-api-access-9drdm\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.764279 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.765451 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-client-ca\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.765521 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-proxy-ca-bundles\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " 
pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.766663 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.767654 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-config\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.769652 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14629369-42de-4bea-936a-d78ef91b8514-serving-cert\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.775423 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.780986 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9drdm\" (UniqueName: \"kubernetes.io/projected/14629369-42de-4bea-936a-d78ef91b8514-kube-api-access-9drdm\") pod \"controller-manager-7d9f75f444-qxsqq\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") " pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.860101 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.865302 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.865351 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.865382 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.867126 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.867710 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.877458 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.878816 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.888736 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.889631 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:02:24 crc kubenswrapper[4723]: I0309 13:02:24.979999 4723 ???:1] "http: TLS handshake error from 192.168.126.11:37816: no serving certificate available for the kubelet" Mar 09 13:02:25 crc kubenswrapper[4723]: I0309 13:02:25.098554 4723 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:02:25 crc kubenswrapper[4723]: I0309 13:02:25.108613 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:02:25 crc kubenswrapper[4723]: I0309 13:02:25.115834 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:02:28 crc kubenswrapper[4723]: E0309 13:02:28.100005 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 09 13:02:28 crc kubenswrapper[4723]: E0309 13:02:28.100154 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m5bxr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-2lhjt_openshift-marketplace(5adbe8b6-fabd-4e21-8507-84df16004837): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:02:28 crc kubenswrapper[4723]: E0309 13:02:28.101367 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-2lhjt" podUID="5adbe8b6-fabd-4e21-8507-84df16004837" Mar 09 13:02:28 crc kubenswrapper[4723]: E0309 13:02:28.107518 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 09 13:02:28 crc kubenswrapper[4723]: E0309 13:02:28.107611 4723 kuberuntime_manager.go:1274] "Unhandled 
Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d85gj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-kcvkd_openshift-marketplace(980fde08-18f8-4e22-93a1-3846f9e367ad): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:02:28 crc kubenswrapper[4723]: E0309 13:02:28.108781 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-kcvkd" podUID="980fde08-18f8-4e22-93a1-3846f9e367ad" Mar 09 13:02:29 crc kubenswrapper[4723]: I0309 13:02:29.467191 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d" Mar 09 13:02:29 crc kubenswrapper[4723]: E0309 13:02:29.699025 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-2lhjt" podUID="5adbe8b6-fabd-4e21-8507-84df16004837" Mar 09 13:02:29 crc kubenswrapper[4723]: E0309 13:02:29.699029 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-kcvkd" podUID="980fde08-18f8-4e22-93a1-3846f9e367ad" Mar 09 13:02:29 crc kubenswrapper[4723]: I0309 13:02:29.747501 4723 scope.go:117] "RemoveContainer" containerID="ecf1da64b0b9bc4174a17a47210349df8556dac74e0181a769c736875e007f46" Mar 09 13:02:29 crc kubenswrapper[4723]: E0309 13:02:29.771048 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying 
system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 13:02:29 crc kubenswrapper[4723]: E0309 13:02:29.771208 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6fd2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jq2cv_openshift-marketplace(84890bd9-0d95-48f4-89d3-6619e5e5525a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:02:29 crc kubenswrapper[4723]: E0309 13:02:29.772515 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jq2cv" podUID="84890bd9-0d95-48f4-89d3-6619e5e5525a" Mar 09 13:02:29 crc kubenswrapper[4723]: E0309 13:02:29.785988 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 13:02:29 crc kubenswrapper[4723]: E0309 13:02:29.786129 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vv9fj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4x6zm_openshift-marketplace(a7d103aa-232e-4705-a061-8ad7025339cf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 09 13:02:29 crc kubenswrapper[4723]: E0309 13:02:29.787423 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4x6zm" podUID="a7d103aa-232e-4705-a061-8ad7025339cf"
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.276921 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5"]
Mar 09 13:02:30 crc kubenswrapper[4723]: W0309 13:02:30.340781 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-5990efd6c81baf4a8fb27fbc6519897f5c12c74e08dc438994011376a8ce7e95 WatchSource:0}: Error finding container 5990efd6c81baf4a8fb27fbc6519897f5c12c74e08dc438994011376a8ce7e95: Status 404 returned error can't find the container with id 5990efd6c81baf4a8fb27fbc6519897f5c12c74e08dc438994011376a8ce7e95
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.420184 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5990efd6c81baf4a8fb27fbc6519897f5c12c74e08dc438994011376a8ce7e95"}
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.426100 4723 generic.go:334] "Generic (PLEG): container finished" podID="6bc2fb38-5759-4ce6-9c1d-84a6537050e9" containerID="30a17989be411e31f8381dee9cec3aed5cc78f1dd8699eb1224de882f8286923" exitCode=0
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.426713 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lsmcg" event={"ID":"6bc2fb38-5759-4ce6-9c1d-84a6537050e9","Type":"ContainerDied","Data":"30a17989be411e31f8381dee9cec3aed5cc78f1dd8699eb1224de882f8286923"}
Mar 09 13:02:30 crc kubenswrapper[4723]: W0309 13:02:30.434180 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-17cfc3f4516ecf3a008553d06c7cc64d3c97d57978838a3c5d57ccb9ae22f67b WatchSource:0}: Error finding container 17cfc3f4516ecf3a008553d06c7cc64d3c97d57978838a3c5d57ccb9ae22f67b: Status 404 returned error can't find the container with id 17cfc3f4516ecf3a008553d06c7cc64d3c97d57978838a3c5d57ccb9ae22f67b
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.436548 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d9f75f444-qxsqq"]
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.437374 4723 generic.go:334] "Generic (PLEG): container finished" podID="8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" containerID="cafa6a34afdf28f7f477cb3f7691cbaec13491cafa840531f220a48448d63fff" exitCode=0
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.437557 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvf4v" event={"ID":"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2","Type":"ContainerDied","Data":"cafa6a34afdf28f7f477cb3f7691cbaec13491cafa840531f220a48448d63fff"}
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.448073 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551022-xgzbd" event={"ID":"b2586638-1604-4545-8203-6b89e38129e6","Type":"ContainerStarted","Data":"f057a186bca17facee44643cdcad3fa67ea23bd1739dd713755164310d050dfd"}
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.452030 4723 generic.go:334] "Generic (PLEG): container finished" podID="fa9195db-65d7-4777-8869-948a26e41933" containerID="a1891ba763f18a6a47a495765e7c3d339abdb8fc358f6ba57e98dac377ab78ce" exitCode=0
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.452132 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qb2k" event={"ID":"fa9195db-65d7-4777-8869-948a26e41933","Type":"ContainerDied","Data":"a1891ba763f18a6a47a495765e7c3d339abdb8fc358f6ba57e98dac377ab78ce"}
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.456107 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqh66" event={"ID":"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709","Type":"ContainerStarted","Data":"d9976a961885dc5909fdd0cee988c86676cb86d132777f7ab1c4745afa436e1d"}
Mar 09 13:02:30 crc kubenswrapper[4723]: E0309 13:02:30.462964 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4x6zm" podUID="a7d103aa-232e-4705-a061-8ad7025339cf"
Mar 09 13:02:30 crc kubenswrapper[4723]: E0309 13:02:30.463022 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jq2cv" podUID="84890bd9-0d95-48f4-89d3-6619e5e5525a"
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.515242 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551022-xgzbd" podStartSLOduration=1.875591718 podStartE2EDuration="30.51522733s" podCreationTimestamp="2026-03-09 13:02:00 +0000 UTC" firstStartedPulling="2026-03-09 13:02:01.137605716 +0000 UTC m=+195.152073256" lastFinishedPulling="2026-03-09 13:02:29.777241328 +0000 UTC m=+223.791708868" observedRunningTime="2026-03-09 13:02:30.514119922 +0000 UTC m=+224.528587462" watchObservedRunningTime="2026-03-09 13:02:30.51522733 +0000 UTC m=+224.529694870"
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.696336 4723 csr.go:261] certificate signing request csr-zp74f is approved, waiting to be issued
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.704753 4723 csr.go:257] certificate signing request csr-zp74f is issued
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.854601 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.855485 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.860512 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.861456 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.867367 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.944381 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baa6c6a9-71b3-4d4e-965a-9d3acd961579-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"baa6c6a9-71b3-4d4e-965a-9d3acd961579\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:02:30 crc kubenswrapper[4723]: I0309 13:02:30.944434 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baa6c6a9-71b3-4d4e-965a-9d3acd961579-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"baa6c6a9-71b3-4d4e-965a-9d3acd961579\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.047973 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baa6c6a9-71b3-4d4e-965a-9d3acd961579-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"baa6c6a9-71b3-4d4e-965a-9d3acd961579\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.048324 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baa6c6a9-71b3-4d4e-965a-9d3acd961579-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"baa6c6a9-71b3-4d4e-965a-9d3acd961579\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.048087 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baa6c6a9-71b3-4d4e-965a-9d3acd961579-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"baa6c6a9-71b3-4d4e-965a-9d3acd961579\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.082694 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baa6c6a9-71b3-4d4e-965a-9d3acd961579-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"baa6c6a9-71b3-4d4e-965a-9d3acd961579\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.222901 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.439240 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.468629 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"baa6c6a9-71b3-4d4e-965a-9d3acd961579","Type":"ContainerStarted","Data":"287ab7747deb6bb217eb829504365fb4abc6e05bd8680fe3750b989b25842cde"}
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.469897 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c6fb0789f331edbcca780ee865fe0f9aa00b640f8bad3defdc9b523f3a3b2112"}
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.471288 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ed61ebc1b62725bf86a4ff7157eca69ed8b4d1538d7423cad467254ac6ae2681"}
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.471317 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f99b223bdb9c1612905c90f6d310740c7e531a400f97cd2eb010cab46111a0ad"}
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.472454 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" event={"ID":"e9141cfe-9794-4248-8631-e9424aa7dbdf","Type":"ContainerStarted","Data":"b90b1623102991cf265c819f0881039c4baa79ddfe192401932bac33d51eff04"}
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.472484 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" event={"ID":"e9141cfe-9794-4248-8631-e9424aa7dbdf","Type":"ContainerStarted","Data":"9ee3e906d358a06b57cb1067b4060ff8a45b8c9c3ea8ff0f22c4c5d8090381e7"}
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.473350 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5"
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.474625 4723 generic.go:334] "Generic (PLEG): container finished" podID="b2586638-1604-4545-8203-6b89e38129e6" containerID="f057a186bca17facee44643cdcad3fa67ea23bd1739dd713755164310d050dfd" exitCode=0
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.474695 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551022-xgzbd" event={"ID":"b2586638-1604-4545-8203-6b89e38129e6","Type":"ContainerDied","Data":"f057a186bca17facee44643cdcad3fa67ea23bd1739dd713755164310d050dfd"}
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.475900 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a95433e7fee10c7b24735905769049a5616383ce74643fe72ee4277c2bb42a94"}
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.475925 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"17cfc3f4516ecf3a008553d06c7cc64d3c97d57978838a3c5d57ccb9ae22f67b"}
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.476245 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.477933 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" event={"ID":"14629369-42de-4bea-936a-d78ef91b8514","Type":"ContainerStarted","Data":"22a8e8eed05a1999bde6d756dbcd25f7a7b8cdf9ecb4fa5fcc8724c3f7b06bd1"}
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.477962 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" event={"ID":"14629369-42de-4bea-936a-d78ef91b8514","Type":"ContainerStarted","Data":"068d91d8553cca359277b4449c814b1b950b252374fe5ff8023e3f3adcd722d7"}
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.478416 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq"
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.490740 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5"
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.495306 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq"
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.499285 4723 generic.go:334] "Generic (PLEG): container finished" podID="b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" containerID="d9976a961885dc5909fdd0cee988c86676cb86d132777f7ab1c4745afa436e1d" exitCode=0
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.499346 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqh66" event={"ID":"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709","Type":"ContainerDied","Data":"d9976a961885dc5909fdd0cee988c86676cb86d132777f7ab1c4745afa436e1d"}
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.547328 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" podStartSLOduration=15.54730937 podStartE2EDuration="15.54730937s" podCreationTimestamp="2026-03-09 13:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:02:31.546542751 +0000 UTC m=+225.561010301" watchObservedRunningTime="2026-03-09 13:02:31.54730937 +0000 UTC m=+225.561776910"
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.571646 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" podStartSLOduration=15.571627199 podStartE2EDuration="15.571627199s" podCreationTimestamp="2026-03-09 13:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:02:31.571214228 +0000 UTC m=+225.585681768" watchObservedRunningTime="2026-03-09 13:02:31.571627199 +0000 UTC m=+225.586094729"
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.707004 4723 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-25 14:39:02.695200819 +0000 UTC
Mar 09 13:02:31 crc kubenswrapper[4723]: I0309 13:02:31.707039 4723 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6265h36m30.988164263s for next certificate rotation
Mar 09 13:02:32 crc kubenswrapper[4723]: I0309 13:02:32.506091 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qb2k" event={"ID":"fa9195db-65d7-4777-8869-948a26e41933","Type":"ContainerStarted","Data":"2f0d6afc23a54ff31ea8953c913dbdd3447daf5d3600a0343c3a0bacbe42ae34"}
Mar 09 13:02:32 crc kubenswrapper[4723]: I0309 13:02:32.507891 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"baa6c6a9-71b3-4d4e-965a-9d3acd961579","Type":"ContainerStarted","Data":"8636d7f90888d9e31536a9ee459660a1ef2311b15714726378ca51b88fa2cad6"}
Mar 09 13:02:32 crc kubenswrapper[4723]: I0309 13:02:32.530083 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.530063267 podStartE2EDuration="2.530063267s" podCreationTimestamp="2026-03-09 13:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:02:32.520653511 +0000 UTC m=+226.535121051" watchObservedRunningTime="2026-03-09 13:02:32.530063267 +0000 UTC m=+226.544530807"
Mar 09 13:02:32 crc kubenswrapper[4723]: I0309 13:02:32.707768 4723 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-15 05:32:16.475122121 +0000 UTC
Mar 09 13:02:32 crc kubenswrapper[4723]: I0309 13:02:32.707806 4723 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6736h29m43.767319237s for next certificate rotation
Mar 09 13:02:32 crc kubenswrapper[4723]: I0309 13:02:32.750064 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551022-xgzbd"
Mar 09 13:02:32 crc kubenswrapper[4723]: I0309 13:02:32.870994 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69n7q\" (UniqueName: \"kubernetes.io/projected/b2586638-1604-4545-8203-6b89e38129e6-kube-api-access-69n7q\") pod \"b2586638-1604-4545-8203-6b89e38129e6\" (UID: \"b2586638-1604-4545-8203-6b89e38129e6\") "
Mar 09 13:02:32 crc kubenswrapper[4723]: I0309 13:02:32.878175 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2586638-1604-4545-8203-6b89e38129e6-kube-api-access-69n7q" (OuterVolumeSpecName: "kube-api-access-69n7q") pod "b2586638-1604-4545-8203-6b89e38129e6" (UID: "b2586638-1604-4545-8203-6b89e38129e6"). InnerVolumeSpecName "kube-api-access-69n7q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:02:32 crc kubenswrapper[4723]: I0309 13:02:32.972538 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69n7q\" (UniqueName: \"kubernetes.io/projected/b2586638-1604-4545-8203-6b89e38129e6-kube-api-access-69n7q\") on node \"crc\" DevicePath \"\""
Mar 09 13:02:33 crc kubenswrapper[4723]: I0309 13:02:33.524476 4723 generic.go:334] "Generic (PLEG): container finished" podID="baa6c6a9-71b3-4d4e-965a-9d3acd961579" containerID="8636d7f90888d9e31536a9ee459660a1ef2311b15714726378ca51b88fa2cad6" exitCode=0
Mar 09 13:02:33 crc kubenswrapper[4723]: I0309 13:02:33.524994 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"baa6c6a9-71b3-4d4e-965a-9d3acd961579","Type":"ContainerDied","Data":"8636d7f90888d9e31536a9ee459660a1ef2311b15714726378ca51b88fa2cad6"}
Mar 09 13:02:33 crc kubenswrapper[4723]: I0309 13:02:33.528060 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551022-xgzbd"
Mar 09 13:02:33 crc kubenswrapper[4723]: I0309 13:02:33.528058 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551022-xgzbd" event={"ID":"b2586638-1604-4545-8203-6b89e38129e6","Type":"ContainerDied","Data":"dac153412c58f3ebd4f1bab6f7ba3b05b7ac9e158db88214adad14ace49dd292"}
Mar 09 13:02:33 crc kubenswrapper[4723]: I0309 13:02:33.528479 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dac153412c58f3ebd4f1bab6f7ba3b05b7ac9e158db88214adad14ace49dd292"
Mar 09 13:02:33 crc kubenswrapper[4723]: I0309 13:02:33.560781 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9qb2k" podStartSLOduration=3.252297771 podStartE2EDuration="38.560763413s" podCreationTimestamp="2026-03-09 13:01:55 +0000 UTC" firstStartedPulling="2026-03-09 13:01:56.698340844 +0000 UTC m=+190.712808374" lastFinishedPulling="2026-03-09 13:02:32.006806476 +0000 UTC m=+226.021274016" observedRunningTime="2026-03-09 13:02:33.5586625 +0000 UTC m=+227.573130060" watchObservedRunningTime="2026-03-09 13:02:33.560763413 +0000 UTC m=+227.575230953"
Mar 09 13:02:34 crc kubenswrapper[4723]: I0309 13:02:34.536818 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqh66" event={"ID":"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709","Type":"ContainerStarted","Data":"76be5bc3f51500f73609fb3b480c784c3e14eaad6c550a5c5e15ecb66497c7c6"}
Mar 09 13:02:34 crc kubenswrapper[4723]: I0309 13:02:34.557096 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dqh66" podStartSLOduration=3.937185951 podStartE2EDuration="40.557073678s" podCreationTimestamp="2026-03-09 13:01:54 +0000 UTC" firstStartedPulling="2026-03-09 13:01:56.706508613 +0000 UTC m=+190.720976153" lastFinishedPulling="2026-03-09 13:02:33.32639634 +0000 UTC m=+227.340863880" observedRunningTime="2026-03-09 13:02:34.554800731 +0000 UTC m=+228.569268271" watchObservedRunningTime="2026-03-09 13:02:34.557073678 +0000 UTC m=+228.571541218"
Mar 09 13:02:34 crc kubenswrapper[4723]: I0309 13:02:34.837084 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:02:34 crc kubenswrapper[4723]: I0309 13:02:34.902626 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baa6c6a9-71b3-4d4e-965a-9d3acd961579-kubelet-dir\") pod \"baa6c6a9-71b3-4d4e-965a-9d3acd961579\" (UID: \"baa6c6a9-71b3-4d4e-965a-9d3acd961579\") "
Mar 09 13:02:34 crc kubenswrapper[4723]: I0309 13:02:34.902689 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baa6c6a9-71b3-4d4e-965a-9d3acd961579-kube-api-access\") pod \"baa6c6a9-71b3-4d4e-965a-9d3acd961579\" (UID: \"baa6c6a9-71b3-4d4e-965a-9d3acd961579\") "
Mar 09 13:02:34 crc kubenswrapper[4723]: I0309 13:02:34.902774 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/baa6c6a9-71b3-4d4e-965a-9d3acd961579-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "baa6c6a9-71b3-4d4e-965a-9d3acd961579" (UID: "baa6c6a9-71b3-4d4e-965a-9d3acd961579"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 13:02:34 crc kubenswrapper[4723]: I0309 13:02:34.903193 4723 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baa6c6a9-71b3-4d4e-965a-9d3acd961579-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 09 13:02:34 crc kubenswrapper[4723]: I0309 13:02:34.916055 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa6c6a9-71b3-4d4e-965a-9d3acd961579-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "baa6c6a9-71b3-4d4e-965a-9d3acd961579" (UID: "baa6c6a9-71b3-4d4e-965a-9d3acd961579"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:02:35 crc kubenswrapper[4723]: I0309 13:02:35.004268 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baa6c6a9-71b3-4d4e-965a-9d3acd961579-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 09 13:02:35 crc kubenswrapper[4723]: I0309 13:02:35.024988 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dqh66"
Mar 09 13:02:35 crc kubenswrapper[4723]: I0309 13:02:35.025035 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dqh66"
Mar 09 13:02:35 crc kubenswrapper[4723]: I0309 13:02:35.527290 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9qb2k"
Mar 09 13:02:35 crc kubenswrapper[4723]: I0309 13:02:35.527350 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9qb2k"
Mar 09 13:02:35 crc kubenswrapper[4723]: I0309 13:02:35.545350 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvf4v" event={"ID":"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2","Type":"ContainerStarted","Data":"004448f6e17452bc7c76fca92db1a588abea82406b34f99452568f10d3ddf560"}
Mar 09 13:02:35 crc kubenswrapper[4723]: I0309 13:02:35.549737 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lsmcg" event={"ID":"6bc2fb38-5759-4ce6-9c1d-84a6537050e9","Type":"ContainerStarted","Data":"bbefc00f23848472d0475fdc1f07e470ad2d8f5a4de2420b2958a0fee0e0218b"}
Mar 09 13:02:35 crc kubenswrapper[4723]: I0309 13:02:35.555203 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"baa6c6a9-71b3-4d4e-965a-9d3acd961579","Type":"ContainerDied","Data":"287ab7747deb6bb217eb829504365fb4abc6e05bd8680fe3750b989b25842cde"}
Mar 09 13:02:35 crc kubenswrapper[4723]: I0309 13:02:35.555268 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="287ab7747deb6bb217eb829504365fb4abc6e05bd8680fe3750b989b25842cde"
Mar 09 13:02:35 crc kubenswrapper[4723]: I0309 13:02:35.555225 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 13:02:35 crc kubenswrapper[4723]: I0309 13:02:35.565606 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pvf4v" podStartSLOduration=3.2136032119999998 podStartE2EDuration="39.565590809s" podCreationTimestamp="2026-03-09 13:01:56 +0000 UTC" firstStartedPulling="2026-03-09 13:01:57.751812631 +0000 UTC m=+191.766280171" lastFinishedPulling="2026-03-09 13:02:34.103800238 +0000 UTC m=+228.118267768" observedRunningTime="2026-03-09 13:02:35.56482044 +0000 UTC m=+229.579287980" watchObservedRunningTime="2026-03-09 13:02:35.565590809 +0000 UTC m=+229.580058349"
Mar 09 13:02:35 crc kubenswrapper[4723]: I0309 13:02:35.592518 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lsmcg" podStartSLOduration=3.822450623 podStartE2EDuration="39.592503902s" podCreationTimestamp="2026-03-09 13:01:56 +0000 UTC" firstStartedPulling="2026-03-09 13:01:58.861759479 +0000 UTC m=+192.876227019" lastFinishedPulling="2026-03-09 13:02:34.631812758 +0000 UTC m=+228.646280298" observedRunningTime="2026-03-09 13:02:35.58800545 +0000 UTC m=+229.602473000" watchObservedRunningTime="2026-03-09 13:02:35.592503902 +0000 UTC m=+229.606971432"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.152456 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dqh66" podUID="b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" containerName="registry-server" probeResult="failure" output=<
Mar 09 13:02:36 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s
Mar 09 13:02:36 crc kubenswrapper[4723]: >
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.418677 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d9f75f444-qxsqq"]
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.419079 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" podUID="14629369-42de-4bea-936a-d78ef91b8514" containerName="controller-manager" containerID="cri-o://22a8e8eed05a1999bde6d756dbcd25f7a7b8cdf9ecb4fa5fcc8724c3f7b06bd1" gracePeriod=30
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.448412 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 09 13:02:36 crc kubenswrapper[4723]: E0309 13:02:36.448599 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa6c6a9-71b3-4d4e-965a-9d3acd961579" containerName="pruner"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.448610 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa6c6a9-71b3-4d4e-965a-9d3acd961579" containerName="pruner"
Mar 09 13:02:36 crc kubenswrapper[4723]: E0309 13:02:36.448629 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2586638-1604-4545-8203-6b89e38129e6" containerName="oc"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.448634 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2586638-1604-4545-8203-6b89e38129e6" containerName="oc"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.448718 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="baa6c6a9-71b3-4d4e-965a-9d3acd961579" containerName="pruner"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.448726 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2586638-1604-4545-8203-6b89e38129e6" containerName="oc"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.449079 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.455526 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.455947 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.464021 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.527256 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-var-lock\") pod \"installer-9-crc\" (UID: \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.527355 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-kube-api-access\") pod \"installer-9-crc\" (UID: \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.527543 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.537743 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5"]
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.538228 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" podUID="e9141cfe-9794-4248-8631-e9424aa7dbdf" containerName="route-controller-manager" containerID="cri-o://b90b1623102991cf265c819f0881039c4baa79ddfe192401932bac33d51eff04" gracePeriod=30
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.575993 4723 generic.go:334] "Generic (PLEG): container finished" podID="14629369-42de-4bea-936a-d78ef91b8514" containerID="22a8e8eed05a1999bde6d756dbcd25f7a7b8cdf9ecb4fa5fcc8724c3f7b06bd1" exitCode=0
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.576984 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" event={"ID":"14629369-42de-4bea-936a-d78ef91b8514","Type":"ContainerDied","Data":"22a8e8eed05a1999bde6d756dbcd25f7a7b8cdf9ecb4fa5fcc8724c3f7b06bd1"}
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.581035 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9qb2k" podUID="fa9195db-65d7-4777-8869-948a26e41933" containerName="registry-server" probeResult="failure" output=<
Mar 09 13:02:36 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s
Mar 09 13:02:36 crc kubenswrapper[4723]: >
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.628998 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-kube-api-access\") pod \"installer-9-crc\" (UID: \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.629077 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.629108 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-var-lock\") pod \"installer-9-crc\" (UID: \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.629174 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-var-lock\") pod \"installer-9-crc\" (UID: \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.629486 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.652127 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-kube-api-access\") pod \"installer-9-crc\" (UID: \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.779549 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.784343 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pvf4v"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.784382 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pvf4v"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.916003 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq"
Mar 09 13:02:36 crc kubenswrapper[4723]: I0309 13:02:36.962512 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.035694 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14629369-42de-4bea-936a-d78ef91b8514-serving-cert\") pod \"14629369-42de-4bea-936a-d78ef91b8514\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") "
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.035749 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9141cfe-9794-4248-8631-e9424aa7dbdf-config\") pod \"e9141cfe-9794-4248-8631-e9424aa7dbdf\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") "
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.035783 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9141cfe-9794-4248-8631-e9424aa7dbdf-client-ca\") pod \"e9141cfe-9794-4248-8631-e9424aa7dbdf\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") "
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.035809 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrvh8\" (UniqueName: \"kubernetes.io/projected/e9141cfe-9794-4248-8631-e9424aa7dbdf-kube-api-access-qrvh8\") pod \"e9141cfe-9794-4248-8631-e9424aa7dbdf\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") "
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.035874 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9drdm\" (UniqueName: \"kubernetes.io/projected/14629369-42de-4bea-936a-d78ef91b8514-kube-api-access-9drdm\") pod \"14629369-42de-4bea-936a-d78ef91b8514\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") "
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.035898 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-config\") pod \"14629369-42de-4bea-936a-d78ef91b8514\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") "
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.035929 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-proxy-ca-bundles\") pod \"14629369-42de-4bea-936a-d78ef91b8514\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") "
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.035954 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9141cfe-9794-4248-8631-e9424aa7dbdf-serving-cert\") pod \"e9141cfe-9794-4248-8631-e9424aa7dbdf\" (UID: \"e9141cfe-9794-4248-8631-e9424aa7dbdf\") "
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.035984 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-client-ca\") pod \"14629369-42de-4bea-936a-d78ef91b8514\" (UID: \"14629369-42de-4bea-936a-d78ef91b8514\") "
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.036496 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9141cfe-9794-4248-8631-e9424aa7dbdf-client-ca" (OuterVolumeSpecName: "client-ca") pod "e9141cfe-9794-4248-8631-e9424aa7dbdf" (UID: "e9141cfe-9794-4248-8631-e9424aa7dbdf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.036555 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9141cfe-9794-4248-8631-e9424aa7dbdf-config" (OuterVolumeSpecName: "config") pod "e9141cfe-9794-4248-8631-e9424aa7dbdf" (UID: "e9141cfe-9794-4248-8631-e9424aa7dbdf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.036589 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "14629369-42de-4bea-936a-d78ef91b8514" (UID: "14629369-42de-4bea-936a-d78ef91b8514"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.036783 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-client-ca" (OuterVolumeSpecName: "client-ca") pod "14629369-42de-4bea-936a-d78ef91b8514" (UID: "14629369-42de-4bea-936a-d78ef91b8514"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.036961 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-config" (OuterVolumeSpecName: "config") pod "14629369-42de-4bea-936a-d78ef91b8514" (UID: "14629369-42de-4bea-936a-d78ef91b8514"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.039646 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9141cfe-9794-4248-8631-e9424aa7dbdf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e9141cfe-9794-4248-8631-e9424aa7dbdf" (UID: "e9141cfe-9794-4248-8631-e9424aa7dbdf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.039653 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14629369-42de-4bea-936a-d78ef91b8514-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "14629369-42de-4bea-936a-d78ef91b8514" (UID: "14629369-42de-4bea-936a-d78ef91b8514"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.044376 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9141cfe-9794-4248-8631-e9424aa7dbdf-kube-api-access-qrvh8" (OuterVolumeSpecName: "kube-api-access-qrvh8") pod "e9141cfe-9794-4248-8631-e9424aa7dbdf" (UID: "e9141cfe-9794-4248-8631-e9424aa7dbdf"). InnerVolumeSpecName "kube-api-access-qrvh8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.045558 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14629369-42de-4bea-936a-d78ef91b8514-kube-api-access-9drdm" (OuterVolumeSpecName: "kube-api-access-9drdm") pod "14629369-42de-4bea-936a-d78ef91b8514" (UID: "14629369-42de-4bea-936a-d78ef91b8514"). InnerVolumeSpecName "kube-api-access-9drdm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.137406 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.137446 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14629369-42de-4bea-936a-d78ef91b8514-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.137455 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9141cfe-9794-4248-8631-e9424aa7dbdf-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.137463 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9141cfe-9794-4248-8631-e9424aa7dbdf-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.137473 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrvh8\" (UniqueName: \"kubernetes.io/projected/e9141cfe-9794-4248-8631-e9424aa7dbdf-kube-api-access-qrvh8\") on node \"crc\" DevicePath \"\""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.137485 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9drdm\" (UniqueName: \"kubernetes.io/projected/14629369-42de-4bea-936a-d78ef91b8514-kube-api-access-9drdm\") on node \"crc\" DevicePath \"\""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.137494 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.137503 4723 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14629369-42de-4bea-936a-d78ef91b8514-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.137513 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9141cfe-9794-4248-8631-e9424aa7dbdf-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.225716 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 09 13:02:37 crc kubenswrapper[4723]: W0309 13:02:37.230103 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddbe90cf6_9e4d_49b5_ac07_c9c88288d058.slice/crio-0be9bc4bac89eea171f55a0bfd0ef1393756a96a6cc93f84a5f4b240a599e022 WatchSource:0}: Error finding container 0be9bc4bac89eea171f55a0bfd0ef1393756a96a6cc93f84a5f4b240a599e022: Status 404 returned error can't find the container with id 0be9bc4bac89eea171f55a0bfd0ef1393756a96a6cc93f84a5f4b240a599e022
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.324153 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lsmcg"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.324500 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lsmcg"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.378322 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lsmcg"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.552325 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"]
Mar 09 13:02:37 crc kubenswrapper[4723]: E0309 13:02:37.552954 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9141cfe-9794-4248-8631-e9424aa7dbdf" containerName="route-controller-manager"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.553098 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9141cfe-9794-4248-8631-e9424aa7dbdf" containerName="route-controller-manager"
Mar 09 13:02:37 crc kubenswrapper[4723]: E0309 13:02:37.553225 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14629369-42de-4bea-936a-d78ef91b8514" containerName="controller-manager"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.553343 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="14629369-42de-4bea-936a-d78ef91b8514" containerName="controller-manager"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.553615 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="14629369-42de-4bea-936a-d78ef91b8514" containerName="controller-manager"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.553746 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9141cfe-9794-4248-8631-e9424aa7dbdf" containerName="route-controller-manager"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.554829 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.567417 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"]
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.576257 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dh6qm"]
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.603581 4723 generic.go:334] "Generic (PLEG): container finished" podID="e9141cfe-9794-4248-8631-e9424aa7dbdf" containerID="b90b1623102991cf265c819f0881039c4baa79ddfe192401932bac33d51eff04" exitCode=0
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.603758 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.605665 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" event={"ID":"e9141cfe-9794-4248-8631-e9424aa7dbdf","Type":"ContainerDied","Data":"b90b1623102991cf265c819f0881039c4baa79ddfe192401932bac33d51eff04"}
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.605715 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5" event={"ID":"e9141cfe-9794-4248-8631-e9424aa7dbdf","Type":"ContainerDied","Data":"9ee3e906d358a06b57cb1067b4060ff8a45b8c9c3ea8ff0f22c4c5d8090381e7"}
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.605738 4723 scope.go:117] "RemoveContainer" containerID="b90b1623102991cf265c819f0881039c4baa79ddfe192401932bac33d51eff04"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.635400 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq" event={"ID":"14629369-42de-4bea-936a-d78ef91b8514","Type":"ContainerDied","Data":"068d91d8553cca359277b4449c814b1b950b252374fe5ff8023e3f3adcd722d7"}
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.635520 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d9f75f444-qxsqq"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.639398 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dbe90cf6-9e4d-49b5-ac07-c9c88288d058","Type":"ContainerStarted","Data":"0be9bc4bac89eea171f55a0bfd0ef1393756a96a6cc93f84a5f4b240a599e022"}
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.647563 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ca9a9c-0c47-466d-8219-759581449711-serving-cert\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.647623 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-config\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.647690 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-client-ca\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.647719 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j4fc\" (UniqueName: \"kubernetes.io/projected/77ca9a9c-0c47-466d-8219-759581449711-kube-api-access-8j4fc\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.647747 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-proxy-ca-bundles\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.670683 4723 scope.go:117] "RemoveContainer" containerID="b90b1623102991cf265c819f0881039c4baa79ddfe192401932bac33d51eff04"
Mar 09 13:02:37 crc kubenswrapper[4723]: E0309 13:02:37.673872 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b90b1623102991cf265c819f0881039c4baa79ddfe192401932bac33d51eff04\": container with ID starting with b90b1623102991cf265c819f0881039c4baa79ddfe192401932bac33d51eff04 not found: ID does not exist" containerID="b90b1623102991cf265c819f0881039c4baa79ddfe192401932bac33d51eff04"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.673920 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b90b1623102991cf265c819f0881039c4baa79ddfe192401932bac33d51eff04"} err="failed to get container status \"b90b1623102991cf265c819f0881039c4baa79ddfe192401932bac33d51eff04\": rpc error: code = NotFound desc = could not find container \"b90b1623102991cf265c819f0881039c4baa79ddfe192401932bac33d51eff04\": container with ID starting with b90b1623102991cf265c819f0881039c4baa79ddfe192401932bac33d51eff04 not found: ID does not exist"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.673960 4723 scope.go:117] "RemoveContainer" containerID="22a8e8eed05a1999bde6d756dbcd25f7a7b8cdf9ecb4fa5fcc8724c3f7b06bd1"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.674229 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5"]
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.689337 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-765f544df7-rlqv5"]
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.700078 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d9f75f444-qxsqq"]
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.703571 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7d9f75f444-qxsqq"]
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.749051 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-client-ca\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.749449 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j4fc\" (UniqueName: \"kubernetes.io/projected/77ca9a9c-0c47-466d-8219-759581449711-kube-api-access-8j4fc\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.749482 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-proxy-ca-bundles\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.749509 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ca9a9c-0c47-466d-8219-759581449711-serving-cert\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.749570 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-config\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.750707 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-client-ca\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.750997 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-config\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.751468 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-proxy-ca-bundles\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.755833 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ca9a9c-0c47-466d-8219-759581449711-serving-cert\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.780651 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j4fc\" (UniqueName: \"kubernetes.io/projected/77ca9a9c-0c47-466d-8219-759581449711-kube-api-access-8j4fc\") pod \"controller-manager-9b48dcbf5-8ksvk\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.851670 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-pvf4v" podUID="8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" containerName="registry-server" probeResult="failure" output=<
Mar 09 13:02:37 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s
Mar 09 13:02:37 crc kubenswrapper[4723]: >
Mar 09 13:02:37 crc kubenswrapper[4723]: I0309 13:02:37.915671 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.141527 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"]
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.554577 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"]
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.555773 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.562439 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.562647 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.562673 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.562770 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.562840 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.563426 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.578548 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"]
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.647092 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk" event={"ID":"77ca9a9c-0c47-466d-8219-759581449711","Type":"ContainerStarted","Data":"3a7640f4ab56061800ba976071832bb777f8ec93cad5fc88d7e0daf9b8843529"}
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.647138 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk" event={"ID":"77ca9a9c-0c47-466d-8219-759581449711","Type":"ContainerStarted","Data":"ed87aa963715d6ed1887faf7ea9ffac60022189e7c7e2999a0d71a246e6a1ddc"}
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.647498 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.648775 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dbe90cf6-9e4d-49b5-ac07-c9c88288d058","Type":"ContainerStarted","Data":"b7e6c6d43080a4ba0949a732c2f3aebd111c7ded6f6cce645864db7d3b1988f9"}
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.652243 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.661464 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba867af3-3d9e-43bf-8d35-0ebd9268b373-config\") pod \"route-controller-manager-845445df8d-z5lnp\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.661506 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tlsh\" (UniqueName: \"kubernetes.io/projected/ba867af3-3d9e-43bf-8d35-0ebd9268b373-kube-api-access-4tlsh\") pod \"route-controller-manager-845445df8d-z5lnp\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.661539 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba867af3-3d9e-43bf-8d35-0ebd9268b373-client-ca\") pod \"route-controller-manager-845445df8d-z5lnp\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.661552 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba867af3-3d9e-43bf-8d35-0ebd9268b373-serving-cert\") pod \"route-controller-manager-845445df8d-z5lnp\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.680041 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk" podStartSLOduration=2.680023275 podStartE2EDuration="2.680023275s" podCreationTimestamp="2026-03-09 13:02:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:02:38.665391999 +0000 UTC m=+232.679859539" watchObservedRunningTime="2026-03-09 13:02:38.680023275 +0000 UTC m=+232.694490815"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.704051 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.704036806 podStartE2EDuration="2.704036806s" podCreationTimestamp="2026-03-09 13:02:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:02:38.70219008 +0000 UTC m=+232.716657620" watchObservedRunningTime="2026-03-09 13:02:38.704036806 +0000 UTC m=+232.718504346"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.763168 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba867af3-3d9e-43bf-8d35-0ebd9268b373-config\") pod \"route-controller-manager-845445df8d-z5lnp\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.763222 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tlsh\" (UniqueName: \"kubernetes.io/projected/ba867af3-3d9e-43bf-8d35-0ebd9268b373-kube-api-access-4tlsh\") pod \"route-controller-manager-845445df8d-z5lnp\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.763272 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba867af3-3d9e-43bf-8d35-0ebd9268b373-client-ca\") pod \"route-controller-manager-845445df8d-z5lnp\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.763287 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba867af3-3d9e-43bf-8d35-0ebd9268b373-serving-cert\") pod \"route-controller-manager-845445df8d-z5lnp\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.764759 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba867af3-3d9e-43bf-8d35-0ebd9268b373-client-ca\") pod \"route-controller-manager-845445df8d-z5lnp\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.765014 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba867af3-3d9e-43bf-8d35-0ebd9268b373-config\") pod \"route-controller-manager-845445df8d-z5lnp\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.776575 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba867af3-3d9e-43bf-8d35-0ebd9268b373-serving-cert\") pod \"route-controller-manager-845445df8d-z5lnp\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.791104 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tlsh\" (UniqueName: \"kubernetes.io/projected/ba867af3-3d9e-43bf-8d35-0ebd9268b373-kube-api-access-4tlsh\") pod \"route-controller-manager-845445df8d-z5lnp\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"
Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.887700 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14629369-42de-4bea-936a-d78ef91b8514" path="/var/lib/kubelet/pods/14629369-42de-4bea-936a-d78ef91b8514/volumes"
Mar 09 13:02:38 crc
kubenswrapper[4723]: I0309 13:02:38.888571 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9141cfe-9794-4248-8631-e9424aa7dbdf" path="/var/lib/kubelet/pods/e9141cfe-9794-4248-8631-e9424aa7dbdf/volumes" Mar 09 13:02:38 crc kubenswrapper[4723]: I0309 13:02:38.889578 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp" Mar 09 13:02:39 crc kubenswrapper[4723]: I0309 13:02:39.099734 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"] Mar 09 13:02:39 crc kubenswrapper[4723]: W0309 13:02:39.109099 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba867af3_3d9e_43bf_8d35_0ebd9268b373.slice/crio-8ba5bb6efe68cb5ce5c8a96f24e14b75383bd753ba4d614dfdb88c2bf61f1c75 WatchSource:0}: Error finding container 8ba5bb6efe68cb5ce5c8a96f24e14b75383bd753ba4d614dfdb88c2bf61f1c75: Status 404 returned error can't find the container with id 8ba5bb6efe68cb5ce5c8a96f24e14b75383bd753ba4d614dfdb88c2bf61f1c75 Mar 09 13:02:39 crc kubenswrapper[4723]: I0309 13:02:39.673407 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp" event={"ID":"ba867af3-3d9e-43bf-8d35-0ebd9268b373","Type":"ContainerStarted","Data":"bbe1de00b01a2613369fc49b09ebf1be39682fe08faa6c35c980f31ef9eaa0ab"} Mar 09 13:02:39 crc kubenswrapper[4723]: I0309 13:02:39.673803 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp" event={"ID":"ba867af3-3d9e-43bf-8d35-0ebd9268b373","Type":"ContainerStarted","Data":"8ba5bb6efe68cb5ce5c8a96f24e14b75383bd753ba4d614dfdb88c2bf61f1c75"} Mar 09 13:02:39 crc kubenswrapper[4723]: I0309 13:02:39.698811 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp" podStartSLOduration=3.698782522 podStartE2EDuration="3.698782522s" podCreationTimestamp="2026-03-09 13:02:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:02:39.693112421 +0000 UTC m=+233.707580001" watchObservedRunningTime="2026-03-09 13:02:39.698782522 +0000 UTC m=+233.713250102" Mar 09 13:02:40 crc kubenswrapper[4723]: I0309 13:02:40.677802 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp" Mar 09 13:02:40 crc kubenswrapper[4723]: I0309 13:02:40.691244 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp" Mar 09 13:02:44 crc kubenswrapper[4723]: I0309 13:02:44.701880 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jq2cv" event={"ID":"84890bd9-0d95-48f4-89d3-6619e5e5525a","Type":"ContainerDied","Data":"7a5c6c4dc30e89b7a5a04967b525ab1485a6b871c5035a8551969154476829f3"} Mar 09 13:02:44 crc kubenswrapper[4723]: I0309 13:02:44.701938 4723 generic.go:334] "Generic (PLEG): container finished" podID="84890bd9-0d95-48f4-89d3-6619e5e5525a" containerID="7a5c6c4dc30e89b7a5a04967b525ab1485a6b871c5035a8551969154476829f3" exitCode=0 Mar 09 13:02:44 crc 
kubenswrapper[4723]: I0309 13:02:44.705324 4723 generic.go:334] "Generic (PLEG): container finished" podID="a7d103aa-232e-4705-a061-8ad7025339cf" containerID="2be94a9168f5909d8cb3c4b00a1c68eeb14f3d718787ba16d310e8bb0580616b" exitCode=0 Mar 09 13:02:44 crc kubenswrapper[4723]: I0309 13:02:44.705419 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4x6zm" event={"ID":"a7d103aa-232e-4705-a061-8ad7025339cf","Type":"ContainerDied","Data":"2be94a9168f5909d8cb3c4b00a1c68eeb14f3d718787ba16d310e8bb0580616b"} Mar 09 13:02:44 crc kubenswrapper[4723]: I0309 13:02:44.711940 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lhjt" event={"ID":"5adbe8b6-fabd-4e21-8507-84df16004837","Type":"ContainerStarted","Data":"6264dfe2e8203b6bfc4f87d5b394f014d1e1e13f2094461c63e031cdad53f4c4"} Mar 09 13:02:45 crc kubenswrapper[4723]: I0309 13:02:45.072930 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dqh66" Mar 09 13:02:45 crc kubenswrapper[4723]: I0309 13:02:45.117569 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dqh66" Mar 09 13:02:45 crc kubenswrapper[4723]: I0309 13:02:45.576649 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:02:45 crc kubenswrapper[4723]: I0309 13:02:45.619435 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:02:45 crc kubenswrapper[4723]: I0309 13:02:45.719783 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jq2cv" event={"ID":"84890bd9-0d95-48f4-89d3-6619e5e5525a","Type":"ContainerStarted","Data":"2bcd0259d58210a05b28751843f01bf8fb4c24e3b599b603bc5d6b1c4e6c90f9"} Mar 09 13:02:45 crc kubenswrapper[4723]: I0309 13:02:45.722739 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4x6zm" event={"ID":"a7d103aa-232e-4705-a061-8ad7025339cf","Type":"ContainerStarted","Data":"3ad121701ecbed45714230b6796cc948294ac9e586bad1fc30461db0f68f2e2f"} Mar 09 13:02:45 crc kubenswrapper[4723]: I0309 13:02:45.725155 4723 generic.go:334] "Generic (PLEG): container finished" podID="5adbe8b6-fabd-4e21-8507-84df16004837" containerID="6264dfe2e8203b6bfc4f87d5b394f014d1e1e13f2094461c63e031cdad53f4c4" exitCode=0 Mar 09 13:02:45 crc kubenswrapper[4723]: I0309 13:02:45.725229 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lhjt" event={"ID":"5adbe8b6-fabd-4e21-8507-84df16004837","Type":"ContainerDied","Data":"6264dfe2e8203b6bfc4f87d5b394f014d1e1e13f2094461c63e031cdad53f4c4"} Mar 09 13:02:45 crc kubenswrapper[4723]: I0309 13:02:45.726625 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcvkd" event={"ID":"980fde08-18f8-4e22-93a1-3846f9e367ad","Type":"ContainerStarted","Data":"856e534e7cb84b0cb2211a1b4311886f65328d374c944197b38960eb828368d1"} Mar 09 13:02:45 crc kubenswrapper[4723]: I0309 13:02:45.739684 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jq2cv" podStartSLOduration=3.272626401 podStartE2EDuration="51.739663912s" podCreationTimestamp="2026-03-09 13:01:54 +0000 UTC" firstStartedPulling="2026-03-09 13:01:56.716136528 +0000 
UTC m=+190.730604068" lastFinishedPulling="2026-03-09 13:02:45.183174039 +0000 UTC m=+239.197641579" observedRunningTime="2026-03-09 13:02:45.734227556 +0000 UTC m=+239.748695106" watchObservedRunningTime="2026-03-09 13:02:45.739663912 +0000 UTC m=+239.754131452" Mar 09 13:02:45 crc kubenswrapper[4723]: I0309 13:02:45.756594 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4x6zm" podStartSLOduration=3.343003831 podStartE2EDuration="51.756569175s" podCreationTimestamp="2026-03-09 13:01:54 +0000 UTC" firstStartedPulling="2026-03-09 13:01:56.69544859 +0000 UTC m=+190.709916130" lastFinishedPulling="2026-03-09 13:02:45.109013934 +0000 UTC m=+239.123481474" observedRunningTime="2026-03-09 13:02:45.753373405 +0000 UTC m=+239.767840945" watchObservedRunningTime="2026-03-09 13:02:45.756569175 +0000 UTC m=+239.771036715" Mar 09 13:02:46 crc kubenswrapper[4723]: I0309 13:02:46.735092 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lhjt" event={"ID":"5adbe8b6-fabd-4e21-8507-84df16004837","Type":"ContainerStarted","Data":"900ce8517dbf500ee93275d9ddfc72f283c72c0678a51c4eb9ea89f798a07a04"} Mar 09 13:02:46 crc kubenswrapper[4723]: I0309 13:02:46.738466 4723 generic.go:334] "Generic (PLEG): container finished" podID="980fde08-18f8-4e22-93a1-3846f9e367ad" containerID="856e534e7cb84b0cb2211a1b4311886f65328d374c944197b38960eb828368d1" exitCode=0 Mar 09 13:02:46 crc kubenswrapper[4723]: I0309 13:02:46.738521 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcvkd" event={"ID":"980fde08-18f8-4e22-93a1-3846f9e367ad","Type":"ContainerDied","Data":"856e534e7cb84b0cb2211a1b4311886f65328d374c944197b38960eb828368d1"} Mar 09 13:02:46 crc kubenswrapper[4723]: I0309 13:02:46.761420 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2lhjt" podStartSLOduration=2.487768568 podStartE2EDuration="49.761403484s" podCreationTimestamp="2026-03-09 13:01:57 +0000 UTC" firstStartedPulling="2026-03-09 13:01:58.914112976 +0000 UTC m=+192.928580516" lastFinishedPulling="2026-03-09 13:02:46.187747892 +0000 UTC m=+240.202215432" observedRunningTime="2026-03-09 13:02:46.759482106 +0000 UTC m=+240.773949646" watchObservedRunningTime="2026-03-09 13:02:46.761403484 +0000 UTC m=+240.775871024" Mar 09 13:02:46 crc kubenswrapper[4723]: I0309 13:02:46.833650 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pvf4v" Mar 09 13:02:46 crc kubenswrapper[4723]: I0309 13:02:46.880318 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pvf4v" Mar 09 13:02:47 crc kubenswrapper[4723]: I0309 13:02:47.391998 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lsmcg" Mar 09 13:02:47 crc kubenswrapper[4723]: I0309 13:02:47.746635 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcvkd" event={"ID":"980fde08-18f8-4e22-93a1-3846f9e367ad","Type":"ContainerStarted","Data":"6ce5ce9cb8f262e4d0376936606237d0bffa473b6a1bd1e5116ab78273f250df"} Mar 09 13:02:48 crc kubenswrapper[4723]: I0309 13:02:48.162379 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2lhjt" Mar 09 13:02:48 crc kubenswrapper[4723]: I0309 
13:02:48.162438 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2lhjt" Mar 09 13:02:48 crc kubenswrapper[4723]: I0309 13:02:48.398836 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kcvkd" Mar 09 13:02:48 crc kubenswrapper[4723]: I0309 13:02:48.398958 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kcvkd" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.123937 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kcvkd" podStartSLOduration=3.896460458 podStartE2EDuration="51.123919528s" podCreationTimestamp="2026-03-09 13:01:58 +0000 UTC" firstStartedPulling="2026-03-09 13:02:00.064018635 +0000 UTC m=+194.078486175" lastFinishedPulling="2026-03-09 13:02:47.291477665 +0000 UTC m=+241.305945245" observedRunningTime="2026-03-09 13:02:47.771056313 +0000 UTC m=+241.785523853" watchObservedRunningTime="2026-03-09 13:02:49.123919528 +0000 UTC m=+243.138387078" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.128101 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qb2k"] Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.128343 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9qb2k" podUID="fa9195db-65d7-4777-8869-948a26e41933" containerName="registry-server" containerID="cri-o://2f0d6afc23a54ff31ea8953c913dbdd3447daf5d3600a0343c3a0bacbe42ae34" gracePeriod=2 Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.211448 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2lhjt" podUID="5adbe8b6-fabd-4e21-8507-84df16004837" containerName="registry-server" probeResult="failure" output=< Mar 09 13:02:49 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 13:02:49 crc kubenswrapper[4723]: > Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.438384 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kcvkd" podUID="980fde08-18f8-4e22-93a1-3846f9e367ad" containerName="registry-server" probeResult="failure" output=< Mar 09 13:02:49 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 13:02:49 crc kubenswrapper[4723]: > Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.533248 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.623228 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9nm8\" (UniqueName: \"kubernetes.io/projected/fa9195db-65d7-4777-8869-948a26e41933-kube-api-access-t9nm8\") pod \"fa9195db-65d7-4777-8869-948a26e41933\" (UID: \"fa9195db-65d7-4777-8869-948a26e41933\") " Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.623338 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9195db-65d7-4777-8869-948a26e41933-catalog-content\") pod \"fa9195db-65d7-4777-8869-948a26e41933\" (UID: \"fa9195db-65d7-4777-8869-948a26e41933\") " Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.623380 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9195db-65d7-4777-8869-948a26e41933-utilities\") pod \"fa9195db-65d7-4777-8869-948a26e41933\" (UID: \"fa9195db-65d7-4777-8869-948a26e41933\") " Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.624374 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa9195db-65d7-4777-8869-948a26e41933-utilities" (OuterVolumeSpecName: "utilities") pod "fa9195db-65d7-4777-8869-948a26e41933" (UID: "fa9195db-65d7-4777-8869-948a26e41933"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.630273 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9195db-65d7-4777-8869-948a26e41933-kube-api-access-t9nm8" (OuterVolumeSpecName: "kube-api-access-t9nm8") pod "fa9195db-65d7-4777-8869-948a26e41933" (UID: "fa9195db-65d7-4777-8869-948a26e41933"). InnerVolumeSpecName "kube-api-access-t9nm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.691202 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa9195db-65d7-4777-8869-948a26e41933-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa9195db-65d7-4777-8869-948a26e41933" (UID: "fa9195db-65d7-4777-8869-948a26e41933"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.726778 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9195db-65d7-4777-8869-948a26e41933-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.727033 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9195db-65d7-4777-8869-948a26e41933-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.727116 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9nm8\" (UniqueName: \"kubernetes.io/projected/fa9195db-65d7-4777-8869-948a26e41933-kube-api-access-t9nm8\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.759532 4723 generic.go:334] "Generic (PLEG): container finished" podID="fa9195db-65d7-4777-8869-948a26e41933" containerID="2f0d6afc23a54ff31ea8953c913dbdd3447daf5d3600a0343c3a0bacbe42ae34" exitCode=0 Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.759579 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qb2k" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.759612 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qb2k" event={"ID":"fa9195db-65d7-4777-8869-948a26e41933","Type":"ContainerDied","Data":"2f0d6afc23a54ff31ea8953c913dbdd3447daf5d3600a0343c3a0bacbe42ae34"} Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.760077 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qb2k" event={"ID":"fa9195db-65d7-4777-8869-948a26e41933","Type":"ContainerDied","Data":"84ec7db969b2fb40634cda3071d8f872cbac94caaaf2a068eb91ebb10bb9fbd6"} Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.760107 4723 scope.go:117] "RemoveContainer" containerID="2f0d6afc23a54ff31ea8953c913dbdd3447daf5d3600a0343c3a0bacbe42ae34" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.775634 4723 scope.go:117] "RemoveContainer" containerID="a1891ba763f18a6a47a495765e7c3d339abdb8fc358f6ba57e98dac377ab78ce" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.795343 4723 scope.go:117] "RemoveContainer" containerID="fbe4dc38e4f9f1245c6c0a12d941471c716b36a377fea5bf529d5d3518b2c420" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.844953 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qb2k"] Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.847467 4723 scope.go:117] "RemoveContainer" containerID="2f0d6afc23a54ff31ea8953c913dbdd3447daf5d3600a0343c3a0bacbe42ae34" Mar 09 13:02:49 crc kubenswrapper[4723]: E0309 13:02:49.848095 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0d6afc23a54ff31ea8953c913dbdd3447daf5d3600a0343c3a0bacbe42ae34\": container with ID starting with 2f0d6afc23a54ff31ea8953c913dbdd3447daf5d3600a0343c3a0bacbe42ae34 not found: ID does not exist" containerID="2f0d6afc23a54ff31ea8953c913dbdd3447daf5d3600a0343c3a0bacbe42ae34" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.848151 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0d6afc23a54ff31ea8953c913dbdd3447daf5d3600a0343c3a0bacbe42ae34"} 
err="failed to get container status \"2f0d6afc23a54ff31ea8953c913dbdd3447daf5d3600a0343c3a0bacbe42ae34\": rpc error: code = NotFound desc = could not find container \"2f0d6afc23a54ff31ea8953c913dbdd3447daf5d3600a0343c3a0bacbe42ae34\": container with ID starting with 2f0d6afc23a54ff31ea8953c913dbdd3447daf5d3600a0343c3a0bacbe42ae34 not found: ID does not exist" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.848193 4723 scope.go:117] "RemoveContainer" containerID="a1891ba763f18a6a47a495765e7c3d339abdb8fc358f6ba57e98dac377ab78ce" Mar 09 13:02:49 crc kubenswrapper[4723]: E0309 13:02:49.848805 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1891ba763f18a6a47a495765e7c3d339abdb8fc358f6ba57e98dac377ab78ce\": container with ID starting with a1891ba763f18a6a47a495765e7c3d339abdb8fc358f6ba57e98dac377ab78ce not found: ID does not exist" containerID="a1891ba763f18a6a47a495765e7c3d339abdb8fc358f6ba57e98dac377ab78ce" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.848925 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1891ba763f18a6a47a495765e7c3d339abdb8fc358f6ba57e98dac377ab78ce"} err="failed to get container status \"a1891ba763f18a6a47a495765e7c3d339abdb8fc358f6ba57e98dac377ab78ce\": rpc error: code = NotFound desc = could not find container \"a1891ba763f18a6a47a495765e7c3d339abdb8fc358f6ba57e98dac377ab78ce\": container with ID starting with a1891ba763f18a6a47a495765e7c3d339abdb8fc358f6ba57e98dac377ab78ce not found: ID does not exist" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.848982 4723 scope.go:117] "RemoveContainer" containerID="fbe4dc38e4f9f1245c6c0a12d941471c716b36a377fea5bf529d5d3518b2c420" Mar 09 13:02:49 crc kubenswrapper[4723]: E0309 13:02:49.849430 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbe4dc38e4f9f1245c6c0a12d941471c716b36a377fea5bf529d5d3518b2c420\": container with ID starting with fbe4dc38e4f9f1245c6c0a12d941471c716b36a377fea5bf529d5d3518b2c420 not found: ID does not exist" containerID="fbe4dc38e4f9f1245c6c0a12d941471c716b36a377fea5bf529d5d3518b2c420" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.849473 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe4dc38e4f9f1245c6c0a12d941471c716b36a377fea5bf529d5d3518b2c420"} err="failed to get container status \"fbe4dc38e4f9f1245c6c0a12d941471c716b36a377fea5bf529d5d3518b2c420\": rpc error: code = NotFound desc = could not find container \"fbe4dc38e4f9f1245c6c0a12d941471c716b36a377fea5bf529d5d3518b2c420\": container with ID starting with fbe4dc38e4f9f1245c6c0a12d941471c716b36a377fea5bf529d5d3518b2c420 not found: ID does not exist" Mar 09 13:02:49 crc kubenswrapper[4723]: I0309 13:02:49.849690 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9qb2k"] Mar 09 13:02:50 crc kubenswrapper[4723]: I0309 13:02:50.889446 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9195db-65d7-4777-8869-948a26e41933" path="/var/lib/kubelet/pods/fa9195db-65d7-4777-8869-948a26e41933/volumes" Mar 09 13:02:51 crc kubenswrapper[4723]: I0309 13:02:51.322236 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lsmcg"] Mar 09 13:02:51 crc kubenswrapper[4723]: I0309 13:02:51.322506 4723 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-marketplace-lsmcg" podUID="6bc2fb38-5759-4ce6-9c1d-84a6537050e9" containerName="registry-server" containerID="cri-o://bbefc00f23848472d0475fdc1f07e470ad2d8f5a4de2420b2958a0fee0e0218b" gracePeriod=2 Mar 09 13:02:51 crc kubenswrapper[4723]: I0309 13:02:51.775988 4723 generic.go:334] "Generic (PLEG): container finished" podID="6bc2fb38-5759-4ce6-9c1d-84a6537050e9" containerID="bbefc00f23848472d0475fdc1f07e470ad2d8f5a4de2420b2958a0fee0e0218b" exitCode=0 Mar 09 13:02:51 crc kubenswrapper[4723]: I0309 13:02:51.776061 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lsmcg" event={"ID":"6bc2fb38-5759-4ce6-9c1d-84a6537050e9","Type":"ContainerDied","Data":"bbefc00f23848472d0475fdc1f07e470ad2d8f5a4de2420b2958a0fee0e0218b"} Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.203541 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lsmcg" Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.266548 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-catalog-content\") pod \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\" (UID: \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\") " Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.266642 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-utilities\") pod \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\" (UID: \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\") " Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.266685 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqp2g\" (UniqueName: \"kubernetes.io/projected/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-kube-api-access-rqp2g\") pod \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\" (UID: \"6bc2fb38-5759-4ce6-9c1d-84a6537050e9\") " Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.267853 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-utilities" (OuterVolumeSpecName: "utilities") pod "6bc2fb38-5759-4ce6-9c1d-84a6537050e9" (UID: "6bc2fb38-5759-4ce6-9c1d-84a6537050e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.271076 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-kube-api-access-rqp2g" (OuterVolumeSpecName: "kube-api-access-rqp2g") pod "6bc2fb38-5759-4ce6-9c1d-84a6537050e9" (UID: "6bc2fb38-5759-4ce6-9c1d-84a6537050e9"). InnerVolumeSpecName "kube-api-access-rqp2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.293035 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bc2fb38-5759-4ce6-9c1d-84a6537050e9" (UID: "6bc2fb38-5759-4ce6-9c1d-84a6537050e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.367801 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.367846 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqp2g\" (UniqueName: \"kubernetes.io/projected/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-kube-api-access-rqp2g\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.367871 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc2fb38-5759-4ce6-9c1d-84a6537050e9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.784371 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lsmcg" event={"ID":"6bc2fb38-5759-4ce6-9c1d-84a6537050e9","Type":"ContainerDied","Data":"cdb375ec401951a0010fbf23f12587fcd32c42dc0ad76f1c2ad0f40c8798b95a"} Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.784427 4723 scope.go:117] "RemoveContainer" containerID="bbefc00f23848472d0475fdc1f07e470ad2d8f5a4de2420b2958a0fee0e0218b" Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.784446 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lsmcg" Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.800139 4723 scope.go:117] "RemoveContainer" containerID="30a17989be411e31f8381dee9cec3aed5cc78f1dd8699eb1224de882f8286923" Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.819424 4723 scope.go:117] "RemoveContainer" containerID="644ceeb0c8726b6e543836ebcd44d1c5d7cdcf25069fbfd1fe05bc0f3340d617" Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.823493 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lsmcg"] Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.829076 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lsmcg"] Mar 09 13:02:52 crc kubenswrapper[4723]: I0309 13:02:52.899348 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc2fb38-5759-4ce6-9c1d-84a6537050e9" path="/var/lib/kubelet/pods/6bc2fb38-5759-4ce6-9c1d-84a6537050e9/volumes" Mar 09 13:02:54 crc kubenswrapper[4723]: I0309 13:02:54.872656 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4x6zm" Mar 09 13:02:54 crc kubenswrapper[4723]: I0309 13:02:54.873077 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4x6zm" Mar 09 13:02:54 crc kubenswrapper[4723]: I0309 13:02:54.947552 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4x6zm" Mar 09 13:02:55 crc kubenswrapper[4723]: I0309 13:02:55.277194 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:02:55 crc kubenswrapper[4723]: I0309 13:02:55.277258 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:02:55 crc kubenswrapper[4723]: I0309 13:02:55.328472 4723 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:02:55 crc kubenswrapper[4723]: I0309 13:02:55.856814 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:02:55 crc kubenswrapper[4723]: I0309 13:02:55.881765 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4x6zm" Mar 09 13:02:56 crc kubenswrapper[4723]: I0309 13:02:56.418923 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"] Mar 09 13:02:56 crc kubenswrapper[4723]: I0309 13:02:56.419475 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk" podUID="77ca9a9c-0c47-466d-8219-759581449711" containerName="controller-manager" containerID="cri-o://3a7640f4ab56061800ba976071832bb777f8ec93cad5fc88d7e0daf9b8843529" gracePeriod=30 Mar 09 13:02:56 crc kubenswrapper[4723]: I0309 13:02:56.435573 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"] Mar 09 13:02:56 crc kubenswrapper[4723]: I0309 13:02:56.435778 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp" podUID="ba867af3-3d9e-43bf-8d35-0ebd9268b373" containerName="route-controller-manager" containerID="cri-o://bbe1de00b01a2613369fc49b09ebf1be39682fe08faa6c35c980f31ef9eaa0ab" gracePeriod=30 Mar 09 13:02:56 crc kubenswrapper[4723]: I0309 13:02:56.829539 4723 generic.go:334] "Generic (PLEG): container finished" podID="77ca9a9c-0c47-466d-8219-759581449711" containerID="3a7640f4ab56061800ba976071832bb777f8ec93cad5fc88d7e0daf9b8843529" exitCode=0 Mar 09 13:02:56 crc kubenswrapper[4723]: I0309 13:02:56.830793 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk" event={"ID":"77ca9a9c-0c47-466d-8219-759581449711","Type":"ContainerDied","Data":"3a7640f4ab56061800ba976071832bb777f8ec93cad5fc88d7e0daf9b8843529"} Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.599549 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.685093 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995"] Mar 09 13:02:57 crc kubenswrapper[4723]: E0309 13:02:57.685399 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc2fb38-5759-4ce6-9c1d-84a6537050e9" containerName="extract-content" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.685418 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc2fb38-5759-4ce6-9c1d-84a6537050e9" containerName="extract-content" Mar 09 13:02:57 crc kubenswrapper[4723]: E0309 13:02:57.685436 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9195db-65d7-4777-8869-948a26e41933" containerName="registry-server" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.685449 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9195db-65d7-4777-8869-948a26e41933" containerName="registry-server" Mar 09 13:02:57 crc kubenswrapper[4723]: E0309 13:02:57.685469 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc2fb38-5759-4ce6-9c1d-84a6537050e9" containerName="extract-utilities" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.685481 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc2fb38-5759-4ce6-9c1d-84a6537050e9" containerName="extract-utilities" Mar 09 13:02:57 crc kubenswrapper[4723]: E0309 13:02:57.685503 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9195db-65d7-4777-8869-948a26e41933" containerName="extract-utilities" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.685513 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9195db-65d7-4777-8869-948a26e41933" containerName="extract-utilities" Mar 09 13:02:57 crc kubenswrapper[4723]: E0309 13:02:57.685541 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba867af3-3d9e-43bf-8d35-0ebd9268b373" containerName="route-controller-manager" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.685552 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba867af3-3d9e-43bf-8d35-0ebd9268b373" containerName="route-controller-manager" Mar 09 13:02:57 crc kubenswrapper[4723]: E0309 13:02:57.685568 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9195db-65d7-4777-8869-948a26e41933" containerName="extract-content" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.685578 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9195db-65d7-4777-8869-948a26e41933" containerName="extract-content" Mar 09 13:02:57 crc kubenswrapper[4723]: E0309 13:02:57.685597 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc2fb38-5759-4ce6-9c1d-84a6537050e9" containerName="registry-server" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.685609 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc2fb38-5759-4ce6-9c1d-84a6537050e9" containerName="registry-server" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.685774 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc2fb38-5759-4ce6-9c1d-84a6537050e9" containerName="registry-server" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.685802 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9195db-65d7-4777-8869-948a26e41933" containerName="registry-server" Mar 09 13:02:57 crc 
kubenswrapper[4723]: I0309 13:02:57.685818 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba867af3-3d9e-43bf-8d35-0ebd9268b373" containerName="route-controller-manager" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.686602 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.693513 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995"] Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.698378 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.721906 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jq2cv"] Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.781597 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba867af3-3d9e-43bf-8d35-0ebd9268b373-client-ca\") pod \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.781880 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba867af3-3d9e-43bf-8d35-0ebd9268b373-serving-cert\") pod \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.781970 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tlsh\" (UniqueName: \"kubernetes.io/projected/ba867af3-3d9e-43bf-8d35-0ebd9268b373-kube-api-access-4tlsh\") pod \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.782086 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba867af3-3d9e-43bf-8d35-0ebd9268b373-config\") pod \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\" (UID: \"ba867af3-3d9e-43bf-8d35-0ebd9268b373\") " Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.782270 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-client-ca\") pod \"route-controller-manager-7fcbf754c-xl995\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") " pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.782372 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc9x9\" (UniqueName: \"kubernetes.io/projected/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-kube-api-access-vc9x9\") pod \"route-controller-manager-7fcbf754c-xl995\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") " pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.782461 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-config\") pod \"route-controller-manager-7fcbf754c-xl995\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") " pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.782566 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-serving-cert\") pod \"route-controller-manager-7fcbf754c-xl995\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") " pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.783293 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba867af3-3d9e-43bf-8d35-0ebd9268b373-config" (OuterVolumeSpecName: "config") pod "ba867af3-3d9e-43bf-8d35-0ebd9268b373" (UID: "ba867af3-3d9e-43bf-8d35-0ebd9268b373"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.783504 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba867af3-3d9e-43bf-8d35-0ebd9268b373-client-ca" (OuterVolumeSpecName: "client-ca") pod "ba867af3-3d9e-43bf-8d35-0ebd9268b373" (UID: "ba867af3-3d9e-43bf-8d35-0ebd9268b373"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.786632 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba867af3-3d9e-43bf-8d35-0ebd9268b373-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ba867af3-3d9e-43bf-8d35-0ebd9268b373" (UID: "ba867af3-3d9e-43bf-8d35-0ebd9268b373"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.787977 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba867af3-3d9e-43bf-8d35-0ebd9268b373-kube-api-access-4tlsh" (OuterVolumeSpecName: "kube-api-access-4tlsh") pod "ba867af3-3d9e-43bf-8d35-0ebd9268b373" (UID: "ba867af3-3d9e-43bf-8d35-0ebd9268b373"). InnerVolumeSpecName "kube-api-access-4tlsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.836522 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk" event={"ID":"77ca9a9c-0c47-466d-8219-759581449711","Type":"ContainerDied","Data":"ed87aa963715d6ed1887faf7ea9ffac60022189e7c7e2999a0d71a246e6a1ddc"} Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.836573 4723 scope.go:117] "RemoveContainer" containerID="3a7640f4ab56061800ba976071832bb777f8ec93cad5fc88d7e0daf9b8843529" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.836575 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.838437 4723 generic.go:334] "Generic (PLEG): container finished" podID="ba867af3-3d9e-43bf-8d35-0ebd9268b373" containerID="bbe1de00b01a2613369fc49b09ebf1be39682fe08faa6c35c980f31ef9eaa0ab" exitCode=0 Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.838509 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.838578 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp" event={"ID":"ba867af3-3d9e-43bf-8d35-0ebd9268b373","Type":"ContainerDied","Data":"bbe1de00b01a2613369fc49b09ebf1be39682fe08faa6c35c980f31ef9eaa0ab"} Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.838608 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp" event={"ID":"ba867af3-3d9e-43bf-8d35-0ebd9268b373","Type":"ContainerDied","Data":"8ba5bb6efe68cb5ce5c8a96f24e14b75383bd753ba4d614dfdb88c2bf61f1c75"} Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.838700 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jq2cv" podUID="84890bd9-0d95-48f4-89d3-6619e5e5525a" containerName="registry-server" containerID="cri-o://2bcd0259d58210a05b28751843f01bf8fb4c24e3b599b603bc5d6b1c4e6c90f9" gracePeriod=2 Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.853285 4723 scope.go:117] "RemoveContainer" containerID="bbe1de00b01a2613369fc49b09ebf1be39682fe08faa6c35c980f31ef9eaa0ab" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.873994 4723 scope.go:117] "RemoveContainer" containerID="bbe1de00b01a2613369fc49b09ebf1be39682fe08faa6c35c980f31ef9eaa0ab" Mar 09 13:02:57 crc kubenswrapper[4723]: E0309 13:02:57.874402 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbe1de00b01a2613369fc49b09ebf1be39682fe08faa6c35c980f31ef9eaa0ab\": container with ID starting with bbe1de00b01a2613369fc49b09ebf1be39682fe08faa6c35c980f31ef9eaa0ab not found: ID does not exist" containerID="bbe1de00b01a2613369fc49b09ebf1be39682fe08faa6c35c980f31ef9eaa0ab" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.874437 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe1de00b01a2613369fc49b09ebf1be39682fe08faa6c35c980f31ef9eaa0ab"} err="failed to get container status \"bbe1de00b01a2613369fc49b09ebf1be39682fe08faa6c35c980f31ef9eaa0ab\": rpc error: code = NotFound desc = could not find container \"bbe1de00b01a2613369fc49b09ebf1be39682fe08faa6c35c980f31ef9eaa0ab\": container with ID starting with bbe1de00b01a2613369fc49b09ebf1be39682fe08faa6c35c980f31ef9eaa0ab not found: ID does not exist" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.877843 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"] Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.881275 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-845445df8d-z5lnp"] Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.883593 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-config\") pod \"77ca9a9c-0c47-466d-8219-759581449711\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.883732 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-proxy-ca-bundles\") pod \"77ca9a9c-0c47-466d-8219-759581449711\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.883854 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-client-ca\") pod \"77ca9a9c-0c47-466d-8219-759581449711\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.883978 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ca9a9c-0c47-466d-8219-759581449711-serving-cert\") pod \"77ca9a9c-0c47-466d-8219-759581449711\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.884123 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j4fc\" (UniqueName: \"kubernetes.io/projected/77ca9a9c-0c47-466d-8219-759581449711-kube-api-access-8j4fc\") pod \"77ca9a9c-0c47-466d-8219-759581449711\" (UID: \"77ca9a9c-0c47-466d-8219-759581449711\") " Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.884465 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "77ca9a9c-0c47-466d-8219-759581449711" (UID: "77ca9a9c-0c47-466d-8219-759581449711"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.884505 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-client-ca" (OuterVolumeSpecName: "client-ca") pod "77ca9a9c-0c47-466d-8219-759581449711" (UID: "77ca9a9c-0c47-466d-8219-759581449711"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.884522 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-config" (OuterVolumeSpecName: "config") pod "77ca9a9c-0c47-466d-8219-759581449711" (UID: "77ca9a9c-0c47-466d-8219-759581449711"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.884991 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc9x9\" (UniqueName: \"kubernetes.io/projected/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-kube-api-access-vc9x9\") pod \"route-controller-manager-7fcbf754c-xl995\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") " pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.885750 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-config\") pod \"route-controller-manager-7fcbf754c-xl995\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") " pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.886900 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ca9a9c-0c47-466d-8219-759581449711-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "77ca9a9c-0c47-466d-8219-759581449711" (UID: "77ca9a9c-0c47-466d-8219-759581449711"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.888185 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77ca9a9c-0c47-466d-8219-759581449711-kube-api-access-8j4fc" (OuterVolumeSpecName: "kube-api-access-8j4fc") pod "77ca9a9c-0c47-466d-8219-759581449711" (UID: "77ca9a9c-0c47-466d-8219-759581449711"). InnerVolumeSpecName "kube-api-access-8j4fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.887696 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-config\") pod \"route-controller-manager-7fcbf754c-xl995\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") " pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.888611 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-serving-cert\") pod \"route-controller-manager-7fcbf754c-xl995\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") " pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.888831 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-client-ca\") pod \"route-controller-manager-7fcbf754c-xl995\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") " pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.888970 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba867af3-3d9e-43bf-8d35-0ebd9268b373-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.889588 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j4fc\" (UniqueName: 
\"kubernetes.io/projected/77ca9a9c-0c47-466d-8219-759581449711-kube-api-access-8j4fc\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.889689 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba867af3-3d9e-43bf-8d35-0ebd9268b373-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.889778 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tlsh\" (UniqueName: \"kubernetes.io/projected/ba867af3-3d9e-43bf-8d35-0ebd9268b373-kube-api-access-4tlsh\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.889818 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-client-ca\") pod \"route-controller-manager-7fcbf754c-xl995\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") " pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.890121 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.890223 4723 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.890314 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ca9a9c-0c47-466d-8219-759581449711-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.890397 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ca9a9c-0c47-466d-8219-759581449711-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.890480 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba867af3-3d9e-43bf-8d35-0ebd9268b373-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.893174 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-serving-cert\") pod \"route-controller-manager-7fcbf754c-xl995\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") " pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:57 crc kubenswrapper[4723]: I0309 13:02:57.902557 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc9x9\" (UniqueName: \"kubernetes.io/projected/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-kube-api-access-vc9x9\") pod \"route-controller-manager-7fcbf754c-xl995\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") " pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.012085 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.184978 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"] Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.194026 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9b48dcbf5-8ksvk"] Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.232551 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2lhjt" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.257446 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.273014 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995"] Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.280040 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2lhjt" Mar 09 13:02:58 crc kubenswrapper[4723]: W0309 13:02:58.283989 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5772ae4e_fbd1_4d9c_bbe3_a92189b261ff.slice/crio-4474729d287066553ffd0717364c6dd2727bc9ee588c77ca743d301e590c9b79 WatchSource:0}: Error finding container 4474729d287066553ffd0717364c6dd2727bc9ee588c77ca743d301e590c9b79: Status 404 returned error can't find the container with id 4474729d287066553ffd0717364c6dd2727bc9ee588c77ca743d301e590c9b79 Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.299160 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84890bd9-0d95-48f4-89d3-6619e5e5525a-utilities\") pod \"84890bd9-0d95-48f4-89d3-6619e5e5525a\" (UID: \"84890bd9-0d95-48f4-89d3-6619e5e5525a\") " Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.300173 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84890bd9-0d95-48f4-89d3-6619e5e5525a-utilities" (OuterVolumeSpecName: "utilities") pod "84890bd9-0d95-48f4-89d3-6619e5e5525a" (UID: "84890bd9-0d95-48f4-89d3-6619e5e5525a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.399837 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84890bd9-0d95-48f4-89d3-6619e5e5525a-catalog-content\") pod \"84890bd9-0d95-48f4-89d3-6619e5e5525a\" (UID: \"84890bd9-0d95-48f4-89d3-6619e5e5525a\") " Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.399895 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fd2j\" (UniqueName: \"kubernetes.io/projected/84890bd9-0d95-48f4-89d3-6619e5e5525a-kube-api-access-6fd2j\") pod \"84890bd9-0d95-48f4-89d3-6619e5e5525a\" (UID: \"84890bd9-0d95-48f4-89d3-6619e5e5525a\") " Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.400104 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84890bd9-0d95-48f4-89d3-6619e5e5525a-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.403291 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84890bd9-0d95-48f4-89d3-6619e5e5525a-kube-api-access-6fd2j" (OuterVolumeSpecName: "kube-api-access-6fd2j") pod "84890bd9-0d95-48f4-89d3-6619e5e5525a" (UID: "84890bd9-0d95-48f4-89d3-6619e5e5525a"). InnerVolumeSpecName "kube-api-access-6fd2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.446002 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kcvkd" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.468392 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84890bd9-0d95-48f4-89d3-6619e5e5525a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84890bd9-0d95-48f4-89d3-6619e5e5525a" (UID: "84890bd9-0d95-48f4-89d3-6619e5e5525a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.481964 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kcvkd" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.501288 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84890bd9-0d95-48f4-89d3-6619e5e5525a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.501318 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fd2j\" (UniqueName: \"kubernetes.io/projected/84890bd9-0d95-48f4-89d3-6619e5e5525a-kube-api-access-6fd2j\") on node \"crc\" DevicePath \"\"" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.845941 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" event={"ID":"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff","Type":"ContainerStarted","Data":"3c68982cacd76b0c0cb7b62ac568c4026e91cf838d55abe0f7a83432783f1cb8"} Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.845990 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" event={"ID":"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff","Type":"ContainerStarted","Data":"4474729d287066553ffd0717364c6dd2727bc9ee588c77ca743d301e590c9b79"} Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.846190 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.850141 4723 generic.go:334] "Generic (PLEG): container finished" podID="84890bd9-0d95-48f4-89d3-6619e5e5525a" containerID="2bcd0259d58210a05b28751843f01bf8fb4c24e3b599b603bc5d6b1c4e6c90f9" exitCode=0 Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.850807 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jq2cv" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.852125 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jq2cv" event={"ID":"84890bd9-0d95-48f4-89d3-6619e5e5525a","Type":"ContainerDied","Data":"2bcd0259d58210a05b28751843f01bf8fb4c24e3b599b603bc5d6b1c4e6c90f9"} Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.852239 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jq2cv" event={"ID":"84890bd9-0d95-48f4-89d3-6619e5e5525a","Type":"ContainerDied","Data":"18882635a5ebc0eb89e65369741b36ec94c51f70df60d1767ecb5cfd13eda526"} Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.852312 4723 scope.go:117] "RemoveContainer" containerID="2bcd0259d58210a05b28751843f01bf8fb4c24e3b599b603bc5d6b1c4e6c90f9" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.853772 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.864694 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" podStartSLOduration=2.8646788 podStartE2EDuration="2.8646788s" podCreationTimestamp="2026-03-09 13:02:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:02:58.861161753 +0000 UTC m=+252.875629303" watchObservedRunningTime="2026-03-09 13:02:58.8646788 +0000 UTC m=+252.879146340" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.881051 4723 scope.go:117] "RemoveContainer" containerID="7a5c6c4dc30e89b7a5a04967b525ab1485a6b871c5035a8551969154476829f3" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.886707 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77ca9a9c-0c47-466d-8219-759581449711" path="/var/lib/kubelet/pods/77ca9a9c-0c47-466d-8219-759581449711/volumes" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.890703 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba867af3-3d9e-43bf-8d35-0ebd9268b373" path="/var/lib/kubelet/pods/ba867af3-3d9e-43bf-8d35-0ebd9268b373/volumes" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.924917 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jq2cv"] Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.927045 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jq2cv"] Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.931188 4723 scope.go:117] "RemoveContainer" containerID="7a776bd9c4199b485783dad74583e7b1b1934d99c1d2155c4cbbc2866b078b4e" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.959296 4723 scope.go:117] "RemoveContainer" containerID="2bcd0259d58210a05b28751843f01bf8fb4c24e3b599b603bc5d6b1c4e6c90f9" Mar 09 13:02:58 crc kubenswrapper[4723]: E0309 13:02:58.959663 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bcd0259d58210a05b28751843f01bf8fb4c24e3b599b603bc5d6b1c4e6c90f9\": container with ID starting with 2bcd0259d58210a05b28751843f01bf8fb4c24e3b599b603bc5d6b1c4e6c90f9 not found: ID does not exist" 
containerID="2bcd0259d58210a05b28751843f01bf8fb4c24e3b599b603bc5d6b1c4e6c90f9" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.959694 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bcd0259d58210a05b28751843f01bf8fb4c24e3b599b603bc5d6b1c4e6c90f9"} err="failed to get container status \"2bcd0259d58210a05b28751843f01bf8fb4c24e3b599b603bc5d6b1c4e6c90f9\": rpc error: code = NotFound desc = could not find container \"2bcd0259d58210a05b28751843f01bf8fb4c24e3b599b603bc5d6b1c4e6c90f9\": container with ID starting with 2bcd0259d58210a05b28751843f01bf8fb4c24e3b599b603bc5d6b1c4e6c90f9 not found: ID does not exist" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.959715 4723 scope.go:117] "RemoveContainer" containerID="7a5c6c4dc30e89b7a5a04967b525ab1485a6b871c5035a8551969154476829f3" Mar 09 13:02:58 crc kubenswrapper[4723]: E0309 13:02:58.960014 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a5c6c4dc30e89b7a5a04967b525ab1485a6b871c5035a8551969154476829f3\": container with ID starting with 7a5c6c4dc30e89b7a5a04967b525ab1485a6b871c5035a8551969154476829f3 not found: ID does not exist" containerID="7a5c6c4dc30e89b7a5a04967b525ab1485a6b871c5035a8551969154476829f3" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.960046 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a5c6c4dc30e89b7a5a04967b525ab1485a6b871c5035a8551969154476829f3"} err="failed to get container status \"7a5c6c4dc30e89b7a5a04967b525ab1485a6b871c5035a8551969154476829f3\": rpc error: code = NotFound desc = could not find container \"7a5c6c4dc30e89b7a5a04967b525ab1485a6b871c5035a8551969154476829f3\": container with ID starting with 7a5c6c4dc30e89b7a5a04967b525ab1485a6b871c5035a8551969154476829f3 not found: ID does not exist" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.960068 4723 scope.go:117] "RemoveContainer" containerID="7a776bd9c4199b485783dad74583e7b1b1934d99c1d2155c4cbbc2866b078b4e" Mar 09 13:02:58 crc kubenswrapper[4723]: E0309 13:02:58.960283 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a776bd9c4199b485783dad74583e7b1b1934d99c1d2155c4cbbc2866b078b4e\": container with ID starting with 7a776bd9c4199b485783dad74583e7b1b1934d99c1d2155c4cbbc2866b078b4e not found: ID does not exist" containerID="7a776bd9c4199b485783dad74583e7b1b1934d99c1d2155c4cbbc2866b078b4e" Mar 09 13:02:58 crc kubenswrapper[4723]: I0309 13:02:58.960305 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a776bd9c4199b485783dad74583e7b1b1934d99c1d2155c4cbbc2866b078b4e"} err="failed to get container status \"7a776bd9c4199b485783dad74583e7b1b1934d99c1d2155c4cbbc2866b078b4e\": rpc error: code = NotFound desc = could not find container \"7a776bd9c4199b485783dad74583e7b1b1934d99c1d2155c4cbbc2866b078b4e\": container with ID starting with 7a776bd9c4199b485783dad74583e7b1b1934d99c1d2155c4cbbc2866b078b4e not found: ID does not exist" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.571584 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b"] Mar 09 13:03:00 crc kubenswrapper[4723]: E0309 13:03:00.572379 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ca9a9c-0c47-466d-8219-759581449711" containerName="controller-manager" Mar 09 
13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.572412 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ca9a9c-0c47-466d-8219-759581449711" containerName="controller-manager" Mar 09 13:03:00 crc kubenswrapper[4723]: E0309 13:03:00.572440 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84890bd9-0d95-48f4-89d3-6619e5e5525a" containerName="registry-server" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.572459 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="84890bd9-0d95-48f4-89d3-6619e5e5525a" containerName="registry-server" Mar 09 13:03:00 crc kubenswrapper[4723]: E0309 13:03:00.572488 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84890bd9-0d95-48f4-89d3-6619e5e5525a" containerName="extract-content" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.572505 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="84890bd9-0d95-48f4-89d3-6619e5e5525a" containerName="extract-content" Mar 09 13:03:00 crc kubenswrapper[4723]: E0309 13:03:00.572522 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84890bd9-0d95-48f4-89d3-6619e5e5525a" containerName="extract-utilities" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.572536 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="84890bd9-0d95-48f4-89d3-6619e5e5525a" containerName="extract-utilities" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.572803 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="84890bd9-0d95-48f4-89d3-6619e5e5525a" containerName="registry-server" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.572842 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ca9a9c-0c47-466d-8219-759581449711" containerName="controller-manager" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.573547 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.577083 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.578742 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.578805 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.579359 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.579423 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.579655 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.590314 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.593487 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b"] Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.729514 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrkxt\" (UniqueName: \"kubernetes.io/projected/e6f5147d-bdf7-48e5-881d-0aad778e319f-kube-api-access-xrkxt\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.729971 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-client-ca\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.730220 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f5147d-bdf7-48e5-881d-0aad778e319f-serving-cert\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.730496 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-proxy-ca-bundles\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.730677 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-config\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.831821 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-proxy-ca-bundles\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.831921 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-config\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.831953 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrkxt\" (UniqueName: \"kubernetes.io/projected/e6f5147d-bdf7-48e5-881d-0aad778e319f-kube-api-access-xrkxt\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.832000 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-client-ca\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.832023 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f5147d-bdf7-48e5-881d-0aad778e319f-serving-cert\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.833789 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-proxy-ca-bundles\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.833791 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-client-ca\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.835375 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-config\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc 
kubenswrapper[4723]: I0309 13:03:00.840138 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f5147d-bdf7-48e5-881d-0aad778e319f-serving-cert\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.865440 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrkxt\" (UniqueName: \"kubernetes.io/projected/e6f5147d-bdf7-48e5-881d-0aad778e319f-kube-api-access-xrkxt\") pod \"controller-manager-c8dbfdf94-gvc8b\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.892030 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84890bd9-0d95-48f4-89d3-6619e5e5525a" path="/var/lib/kubelet/pods/84890bd9-0d95-48f4-89d3-6619e5e5525a/volumes" Mar 09 13:03:00 crc kubenswrapper[4723]: I0309 13:03:00.938700 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:01 crc kubenswrapper[4723]: I0309 13:03:01.395816 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b"] Mar 09 13:03:01 crc kubenswrapper[4723]: W0309 13:03:01.408073 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6f5147d_bdf7_48e5_881d_0aad778e319f.slice/crio-b892dc3749810158766c22de3b1b99704a89a65c1cfdd182f4e305c5ab9a9513 WatchSource:0}: Error finding container b892dc3749810158766c22de3b1b99704a89a65c1cfdd182f4e305c5ab9a9513: Status 404 returned error can't find the container with id b892dc3749810158766c22de3b1b99704a89a65c1cfdd182f4e305c5ab9a9513 Mar 09 13:03:01 crc kubenswrapper[4723]: I0309 13:03:01.531779 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kcvkd"] Mar 09 13:03:01 crc kubenswrapper[4723]: I0309 13:03:01.532908 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kcvkd" podUID="980fde08-18f8-4e22-93a1-3846f9e367ad" containerName="registry-server" containerID="cri-o://6ce5ce9cb8f262e4d0376936606237d0bffa473b6a1bd1e5116ab78273f250df" gracePeriod=2 Mar 09 13:03:01 crc kubenswrapper[4723]: I0309 13:03:01.876238 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" event={"ID":"e6f5147d-bdf7-48e5-881d-0aad778e319f","Type":"ContainerStarted","Data":"b7863751366fe9e17fde27da6ce25afa3b4cf12d630d11bc9f82259923106811"} Mar 09 13:03:01 crc kubenswrapper[4723]: I0309 13:03:01.876558 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:01 crc kubenswrapper[4723]: I0309 13:03:01.876568 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" event={"ID":"e6f5147d-bdf7-48e5-881d-0aad778e319f","Type":"ContainerStarted","Data":"b892dc3749810158766c22de3b1b99704a89a65c1cfdd182f4e305c5ab9a9513"} Mar 09 13:03:01 crc kubenswrapper[4723]: I0309 13:03:01.879935 4723 generic.go:334] "Generic (PLEG): container finished" 
podID="980fde08-18f8-4e22-93a1-3846f9e367ad" containerID="6ce5ce9cb8f262e4d0376936606237d0bffa473b6a1bd1e5116ab78273f250df" exitCode=0 Mar 09 13:03:01 crc kubenswrapper[4723]: I0309 13:03:01.880007 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcvkd" event={"ID":"980fde08-18f8-4e22-93a1-3846f9e367ad","Type":"ContainerDied","Data":"6ce5ce9cb8f262e4d0376936606237d0bffa473b6a1bd1e5116ab78273f250df"} Mar 09 13:03:01 crc kubenswrapper[4723]: I0309 13:03:01.889715 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:01 crc kubenswrapper[4723]: I0309 13:03:01.897121 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" podStartSLOduration=5.897106155 podStartE2EDuration="5.897106155s" podCreationTimestamp="2026-03-09 13:02:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:03:01.892727355 +0000 UTC m=+255.907194895" watchObservedRunningTime="2026-03-09 13:03:01.897106155 +0000 UTC m=+255.911573695" Mar 09 13:03:01 crc kubenswrapper[4723]: I0309 13:03:01.967342 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kcvkd" Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.149259 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/980fde08-18f8-4e22-93a1-3846f9e367ad-utilities\") pod \"980fde08-18f8-4e22-93a1-3846f9e367ad\" (UID: \"980fde08-18f8-4e22-93a1-3846f9e367ad\") " Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.149445 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d85gj\" (UniqueName: \"kubernetes.io/projected/980fde08-18f8-4e22-93a1-3846f9e367ad-kube-api-access-d85gj\") pod \"980fde08-18f8-4e22-93a1-3846f9e367ad\" (UID: \"980fde08-18f8-4e22-93a1-3846f9e367ad\") " Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.149481 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/980fde08-18f8-4e22-93a1-3846f9e367ad-catalog-content\") pod \"980fde08-18f8-4e22-93a1-3846f9e367ad\" (UID: \"980fde08-18f8-4e22-93a1-3846f9e367ad\") " Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.153009 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/980fde08-18f8-4e22-93a1-3846f9e367ad-utilities" (OuterVolumeSpecName: "utilities") pod "980fde08-18f8-4e22-93a1-3846f9e367ad" (UID: "980fde08-18f8-4e22-93a1-3846f9e367ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.153823 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/980fde08-18f8-4e22-93a1-3846f9e367ad-kube-api-access-d85gj" (OuterVolumeSpecName: "kube-api-access-d85gj") pod "980fde08-18f8-4e22-93a1-3846f9e367ad" (UID: "980fde08-18f8-4e22-93a1-3846f9e367ad"). InnerVolumeSpecName "kube-api-access-d85gj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.250734 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d85gj\" (UniqueName: \"kubernetes.io/projected/980fde08-18f8-4e22-93a1-3846f9e367ad-kube-api-access-d85gj\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.250779 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/980fde08-18f8-4e22-93a1-3846f9e367ad-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.294430 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/980fde08-18f8-4e22-93a1-3846f9e367ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "980fde08-18f8-4e22-93a1-3846f9e367ad" (UID: "980fde08-18f8-4e22-93a1-3846f9e367ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.352014 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/980fde08-18f8-4e22-93a1-3846f9e367ad-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.616112 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" podUID="9ae03d73-b21d-4004-a000-e49a547ef19d" containerName="oauth-openshift" containerID="cri-o://97edd0b231105266aa5e79d53a279aecad1c294047940549fef18653ebec0290" gracePeriod=15 Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.887294 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kcvkd" Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.887384 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcvkd" event={"ID":"980fde08-18f8-4e22-93a1-3846f9e367ad","Type":"ContainerDied","Data":"00770bf6d02ef6c880f6f04c9309af835537d0a14e71af65f34e605e92979447"} Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.887422 4723 scope.go:117] "RemoveContainer" containerID="6ce5ce9cb8f262e4d0376936606237d0bffa473b6a1bd1e5116ab78273f250df" Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.892375 4723 generic.go:334] "Generic (PLEG): container finished" podID="9ae03d73-b21d-4004-a000-e49a547ef19d" containerID="97edd0b231105266aa5e79d53a279aecad1c294047940549fef18653ebec0290" exitCode=0 Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.892500 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" event={"ID":"9ae03d73-b21d-4004-a000-e49a547ef19d","Type":"ContainerDied","Data":"97edd0b231105266aa5e79d53a279aecad1c294047940549fef18653ebec0290"} Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.924187 4723 scope.go:117] "RemoveContainer" containerID="856e534e7cb84b0cb2211a1b4311886f65328d374c944197b38960eb828368d1" Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.941346 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kcvkd"] Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.945718 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kcvkd"] Mar 09 13:03:02 crc kubenswrapper[4723]: I0309 13:03:02.946626 4723 scope.go:117] "RemoveContainer" containerID="4dfd81283017ab558113998e30e2d1364cdca40a7573010d280a40087dec8f95" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.122544 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.181567 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-audit-policies\") pod \"9ae03d73-b21d-4004-a000-e49a547ef19d\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.181629 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-error\") pod \"9ae03d73-b21d-4004-a000-e49a547ef19d\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.181676 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-ocp-branding-template\") pod \"9ae03d73-b21d-4004-a000-e49a547ef19d\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.181732 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-trusted-ca-bundle\") pod \"9ae03d73-b21d-4004-a000-e49a547ef19d\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.181763 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-router-certs\") pod \"9ae03d73-b21d-4004-a000-e49a547ef19d\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.181788 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9ae03d73-b21d-4004-a000-e49a547ef19d-audit-dir\") pod \"9ae03d73-b21d-4004-a000-e49a547ef19d\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.181810 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-login\") pod \"9ae03d73-b21d-4004-a000-e49a547ef19d\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.181841 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-cliconfig\") pod \"9ae03d73-b21d-4004-a000-e49a547ef19d\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.181885 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-provider-selection\") pod \"9ae03d73-b21d-4004-a000-e49a547ef19d\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " Mar 09 13:03:03 crc 
kubenswrapper[4723]: I0309 13:03:03.181916 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-service-ca\") pod \"9ae03d73-b21d-4004-a000-e49a547ef19d\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.181941 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrc6q\" (UniqueName: \"kubernetes.io/projected/9ae03d73-b21d-4004-a000-e49a547ef19d-kube-api-access-wrc6q\") pod \"9ae03d73-b21d-4004-a000-e49a547ef19d\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.181966 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-idp-0-file-data\") pod \"9ae03d73-b21d-4004-a000-e49a547ef19d\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.182017 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-serving-cert\") pod \"9ae03d73-b21d-4004-a000-e49a547ef19d\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.182040 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-session\") pod \"9ae03d73-b21d-4004-a000-e49a547ef19d\" (UID: \"9ae03d73-b21d-4004-a000-e49a547ef19d\") " Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.183249 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "9ae03d73-b21d-4004-a000-e49a547ef19d" (UID: "9ae03d73-b21d-4004-a000-e49a547ef19d"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.183492 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "9ae03d73-b21d-4004-a000-e49a547ef19d" (UID: "9ae03d73-b21d-4004-a000-e49a547ef19d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.183542 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ae03d73-b21d-4004-a000-e49a547ef19d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9ae03d73-b21d-4004-a000-e49a547ef19d" (UID: "9ae03d73-b21d-4004-a000-e49a547ef19d"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.187319 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "9ae03d73-b21d-4004-a000-e49a547ef19d" (UID: "9ae03d73-b21d-4004-a000-e49a547ef19d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.188426 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "9ae03d73-b21d-4004-a000-e49a547ef19d" (UID: "9ae03d73-b21d-4004-a000-e49a547ef19d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.188618 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "9ae03d73-b21d-4004-a000-e49a547ef19d" (UID: "9ae03d73-b21d-4004-a000-e49a547ef19d"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.189089 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "9ae03d73-b21d-4004-a000-e49a547ef19d" (UID: "9ae03d73-b21d-4004-a000-e49a547ef19d"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.190078 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "9ae03d73-b21d-4004-a000-e49a547ef19d" (UID: "9ae03d73-b21d-4004-a000-e49a547ef19d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.190083 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "9ae03d73-b21d-4004-a000-e49a547ef19d" (UID: "9ae03d73-b21d-4004-a000-e49a547ef19d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.195479 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "9ae03d73-b21d-4004-a000-e49a547ef19d" (UID: "9ae03d73-b21d-4004-a000-e49a547ef19d"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.195834 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "9ae03d73-b21d-4004-a000-e49a547ef19d" (UID: "9ae03d73-b21d-4004-a000-e49a547ef19d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.200137 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "9ae03d73-b21d-4004-a000-e49a547ef19d" (UID: "9ae03d73-b21d-4004-a000-e49a547ef19d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.203763 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ae03d73-b21d-4004-a000-e49a547ef19d-kube-api-access-wrc6q" (OuterVolumeSpecName: "kube-api-access-wrc6q") pod "9ae03d73-b21d-4004-a000-e49a547ef19d" (UID: "9ae03d73-b21d-4004-a000-e49a547ef19d"). InnerVolumeSpecName "kube-api-access-wrc6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.205312 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "9ae03d73-b21d-4004-a000-e49a547ef19d" (UID: "9ae03d73-b21d-4004-a000-e49a547ef19d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.283792 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrc6q\" (UniqueName: \"kubernetes.io/projected/9ae03d73-b21d-4004-a000-e49a547ef19d-kube-api-access-wrc6q\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.283831 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.283848 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.283873 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.283886 4723 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.283899 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.283910 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.283922 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.283933 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.283943 4723 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9ae03d73-b21d-4004-a000-e49a547ef19d-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.283954 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.283965 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-cliconfig\") on node \"crc\" 
DevicePath \"\"" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.283978 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.283990 4723 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9ae03d73-b21d-4004-a000-e49a547ef19d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.899469 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" event={"ID":"9ae03d73-b21d-4004-a000-e49a547ef19d","Type":"ContainerDied","Data":"58ac658811e294849a47a07873fb96a56ed1b8c33a137ed49fff1e2c9981cdb2"} Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.899498 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dh6qm" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.899533 4723 scope.go:117] "RemoveContainer" containerID="97edd0b231105266aa5e79d53a279aecad1c294047940549fef18653ebec0290" Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.927376 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dh6qm"] Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.929942 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dh6qm"] Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.947576 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:03:03 crc kubenswrapper[4723]: I0309 13:03:03.947639 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:03:04 crc kubenswrapper[4723]: I0309 13:03:04.889256 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="980fde08-18f8-4e22-93a1-3846f9e367ad" path="/var/lib/kubelet/pods/980fde08-18f8-4e22-93a1-3846f9e367ad/volumes" Mar 09 13:03:04 crc kubenswrapper[4723]: I0309 13:03:04.890271 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ae03d73-b21d-4004-a000-e49a547ef19d" path="/var/lib/kubelet/pods/9ae03d73-b21d-4004-a000-e49a547ef19d/volumes" Mar 09 13:03:05 crc kubenswrapper[4723]: I0309 13:03:05.113767 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.575591 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6775b6d8cc-n5skm"] Mar 09 13:03:06 crc kubenswrapper[4723]: E0309 13:03:06.575814 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980fde08-18f8-4e22-93a1-3846f9e367ad" containerName="registry-server" 
Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.575827 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="980fde08-18f8-4e22-93a1-3846f9e367ad" containerName="registry-server" Mar 09 13:03:06 crc kubenswrapper[4723]: E0309 13:03:06.575845 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980fde08-18f8-4e22-93a1-3846f9e367ad" containerName="extract-utilities" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.575853 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="980fde08-18f8-4e22-93a1-3846f9e367ad" containerName="extract-utilities" Mar 09 13:03:06 crc kubenswrapper[4723]: E0309 13:03:06.575879 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae03d73-b21d-4004-a000-e49a547ef19d" containerName="oauth-openshift" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.575901 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae03d73-b21d-4004-a000-e49a547ef19d" containerName="oauth-openshift" Mar 09 13:03:06 crc kubenswrapper[4723]: E0309 13:03:06.575916 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980fde08-18f8-4e22-93a1-3846f9e367ad" containerName="extract-content" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.575923 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="980fde08-18f8-4e22-93a1-3846f9e367ad" containerName="extract-content" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.576031 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae03d73-b21d-4004-a000-e49a547ef19d" containerName="oauth-openshift" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.576049 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="980fde08-18f8-4e22-93a1-3846f9e367ad" containerName="registry-server" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.576507 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.579842 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.580008 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.580182 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.580460 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.583005 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.583012 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.583432 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.583656 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.583729 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.584097 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.584391 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.584524 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.590294 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6775b6d8cc-n5skm"] Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.596129 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.599000 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.600395 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.739417 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " 
pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.739469 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-service-ca\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.739491 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.739562 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.739671 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.739770 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkkbg\" (UniqueName: \"kubernetes.io/projected/f0275b6b-90ed-4c22-ae68-834792f8e5dd-kube-api-access-tkkbg\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.739808 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-user-template-login\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.739888 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.739918 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-router-certs\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.739953 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-user-template-error\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.739986 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.740045 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f0275b6b-90ed-4c22-ae68-834792f8e5dd-audit-policies\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.740070 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0275b6b-90ed-4c22-ae68-834792f8e5dd-audit-dir\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.740101 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-session\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.840672 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f0275b6b-90ed-4c22-ae68-834792f8e5dd-audit-policies\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.840716 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0275b6b-90ed-4c22-ae68-834792f8e5dd-audit-dir\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.840774 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-session\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.840804 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.840870 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-service-ca\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.840897 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.840896 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0275b6b-90ed-4c22-ae68-834792f8e5dd-audit-dir\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.840928 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.840983 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.841055 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkkbg\" (UniqueName: \"kubernetes.io/projected/f0275b6b-90ed-4c22-ae68-834792f8e5dd-kube-api-access-tkkbg\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.841075 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-user-template-login\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.841125 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.841151 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-router-certs\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.841181 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-user-template-error\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.841206 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.841647 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-service-ca\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.841807 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.842544 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f0275b6b-90ed-4c22-ae68-834792f8e5dd-audit-policies\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.843136 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.845463 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-router-certs\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.845638 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.846103 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.846577 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-session\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.846888 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.847228 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-user-template-login\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.859191 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.859192 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/f0275b6b-90ed-4c22-ae68-834792f8e5dd-v4-0-config-user-template-error\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.866481 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkkbg\" (UniqueName: \"kubernetes.io/projected/f0275b6b-90ed-4c22-ae68-834792f8e5dd-kube-api-access-tkkbg\") pod \"oauth-openshift-6775b6d8cc-n5skm\" (UID: \"f0275b6b-90ed-4c22-ae68-834792f8e5dd\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:06 crc kubenswrapper[4723]: I0309 13:03:06.947125 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:07 crc kubenswrapper[4723]: I0309 13:03:07.392357 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6775b6d8cc-n5skm"] Mar 09 13:03:07 crc kubenswrapper[4723]: I0309 13:03:07.930338 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" event={"ID":"f0275b6b-90ed-4c22-ae68-834792f8e5dd","Type":"ContainerStarted","Data":"d7e1db3f46667317cd77068087d5d5f9fbee77f42db19cf2e2b4edfeff137d79"} Mar 09 13:03:07 crc kubenswrapper[4723]: I0309 13:03:07.930644 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" event={"ID":"f0275b6b-90ed-4c22-ae68-834792f8e5dd","Type":"ContainerStarted","Data":"ee6ba9ddccfeb21ee3b143d0c3bbea356dbfbc4b6338024f87141a58839f29e4"} Mar 09 13:03:07 crc kubenswrapper[4723]: I0309 13:03:07.930665 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:07 crc kubenswrapper[4723]: I0309 13:03:07.984916 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" Mar 09 13:03:08 crc kubenswrapper[4723]: I0309 13:03:08.003492 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" podStartSLOduration=31.003474099 podStartE2EDuration="31.003474099s" podCreationTimestamp="2026-03-09 13:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:03:07.9614105 +0000 UTC m=+261.975878040" watchObservedRunningTime="2026-03-09 13:03:08.003474099 +0000 UTC m=+262.017941629" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.224525 4723 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.226964 4723 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.227017 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.227798 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9" gracePeriod=15 Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.227798 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1" gracePeriod=15 Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.227797 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f" gracePeriod=15 Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.227922 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7" gracePeriod=15 Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.227925 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140" gracePeriod=15 Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.228424 4723 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:03:15 crc kubenswrapper[4723]: E0309 13:03:15.228660 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.228677 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: E0309 13:03:15.228686 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.228692 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: E0309 13:03:15.228700 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.228708 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 13:03:15 crc kubenswrapper[4723]: E0309 13:03:15.228721 4723 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.228726 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 13:03:15 crc kubenswrapper[4723]: E0309 13:03:15.228736 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.228742 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 09 13:03:15 crc kubenswrapper[4723]: E0309 13:03:15.228812 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.228819 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: E0309 13:03:15.228827 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.228832 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: E0309 13:03:15.228843 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.228849 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 13:03:15 crc kubenswrapper[4723]: E0309 13:03:15.228882 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.228888 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.228995 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.229006 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.229014 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.229021 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.229029 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.229037 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.229044 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: E0309 13:03:15.229187 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.229196 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.229528 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.229548 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.257838 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.257996 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.258042 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.258136 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.258457 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.258497 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.258516 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.258821 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: E0309 13:03:15.268521 4723 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.129:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360306 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360378 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360409 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360431 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360454 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360481 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360531 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360534 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360448 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360566 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360487 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360726 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360780 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360894 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.360558 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.361125 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.569761 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: W0309 13:03:15.595102 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-5b06386e55f91960d238308ccb5ff4307773332dcdd4ffeb1dcbb10f01946b45 WatchSource:0}: Error finding container 5b06386e55f91960d238308ccb5ff4307773332dcdd4ffeb1dcbb10f01946b45: Status 404 returned error can't find the container with id 5b06386e55f91960d238308ccb5ff4307773332dcdd4ffeb1dcbb10f01946b45 Mar 09 13:03:15 crc kubenswrapper[4723]: E0309 13:03:15.597793 4723 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.129:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b2df1f9886486 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:03:15.597255814 +0000 UTC m=+269.611723354,LastTimestamp:2026-03-09 13:03:15.597255814 +0000 UTC m=+269.611723354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.986069 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.987717 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.988777 4723 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9" exitCode=0 Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.988855 4723 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7" exitCode=0 Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.988945 4723 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1" exitCode=0 Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.989011 4723 
generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140" exitCode=2 Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.988893 4723 scope.go:117] "RemoveContainer" containerID="093568ac39112d6498ac61418cf93f91faa209b1e378d12e55805f628bb2c468" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.992238 4723 generic.go:334] "Generic (PLEG): container finished" podID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" containerID="b7e6c6d43080a4ba0949a732c2f3aebd111c7ded6f6cce645864db7d3b1988f9" exitCode=0 Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.992308 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dbe90cf6-9e4d-49b5-ac07-c9c88288d058","Type":"ContainerDied","Data":"b7e6c6d43080a4ba0949a732c2f3aebd111c7ded6f6cce645864db7d3b1988f9"} Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.993546 4723 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.993823 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"511d841fbabc2fb0e18cf0547f7df362d41f2c44260b4d9921913408a3dbc401"} Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.993862 4723 status_manager.go:851] "Failed to get status for pod" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.993905 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5b06386e55f91960d238308ccb5ff4307773332dcdd4ffeb1dcbb10f01946b45"} Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.994346 4723 status_manager.go:851] "Failed to get status for pod" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:15 crc kubenswrapper[4723]: E0309 13:03:15.994409 4723 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.129:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:15 crc kubenswrapper[4723]: I0309 13:03:15.994735 4723 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:16 crc kubenswrapper[4723]: I0309 13:03:16.883224 4723 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:16 crc kubenswrapper[4723]: I0309 13:03:16.884401 4723 status_manager.go:851] "Failed to get status for pod" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.004306 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 13:03:17 crc kubenswrapper[4723]: E0309 13:03:17.007510 4723 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.129:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.605614 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.606786 4723 status_manager.go:851] "Failed to get status for pod" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.612456 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.613096 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.613624 4723 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.613794 4723 status_manager.go:851] "Failed to get status for pod" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.707947 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-kube-api-access\") pod \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\" (UID: \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\") " Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.708009 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-kubelet-dir\") pod \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\" (UID: \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\") " Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.708040 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-var-lock\") pod \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\" (UID: \"dbe90cf6-9e4d-49b5-ac07-c9c88288d058\") " Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.708172 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dbe90cf6-9e4d-49b5-ac07-c9c88288d058" (UID: "dbe90cf6-9e4d-49b5-ac07-c9c88288d058"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.708255 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-var-lock" (OuterVolumeSpecName: "var-lock") pod "dbe90cf6-9e4d-49b5-ac07-c9c88288d058" (UID: "dbe90cf6-9e4d-49b5-ac07-c9c88288d058"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.708601 4723 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.708628 4723 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.712988 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dbe90cf6-9e4d-49b5-ac07-c9c88288d058" (UID: "dbe90cf6-9e4d-49b5-ac07-c9c88288d058"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.809103 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.809233 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.809268 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.809411 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.809524 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.809545 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.809568 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe90cf6-9e4d-49b5-ac07-c9c88288d058-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.912015 4723 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.912076 4723 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.912120 4723 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:17 crc kubenswrapper[4723]: E0309 13:03:17.943434 4723 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:17 crc kubenswrapper[4723]: E0309 13:03:17.944241 4723 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:17 crc kubenswrapper[4723]: E0309 13:03:17.944782 4723 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:17 crc kubenswrapper[4723]: E0309 13:03:17.945293 4723 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:17 crc kubenswrapper[4723]: E0309 13:03:17.945754 4723 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:17 crc kubenswrapper[4723]: I0309 13:03:17.945803 4723 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 09 13:03:17 crc kubenswrapper[4723]: E0309 13:03:17.946402 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="200ms" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.017582 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.018624 4723 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f" exitCode=0 Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.018732 4723 scope.go:117] "RemoveContainer" containerID="ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.018794 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.031341 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.031328 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dbe90cf6-9e4d-49b5-ac07-c9c88288d058","Type":"ContainerDied","Data":"0be9bc4bac89eea171f55a0bfd0ef1393756a96a6cc93f84a5f4b240a599e022"} Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.031561 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0be9bc4bac89eea171f55a0bfd0ef1393756a96a6cc93f84a5f4b240a599e022" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.043402 4723 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.044112 4723 status_manager.go:851] "Failed to get status for pod" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.050776 4723 scope.go:117] "RemoveContainer" containerID="205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.057179 4723 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.058298 4723 status_manager.go:851] "Failed to get status for pod" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.068036 4723 scope.go:117] "RemoveContainer" containerID="00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.084512 4723 scope.go:117] "RemoveContainer" containerID="01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.098814 4723 scope.go:117] "RemoveContainer" containerID="1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.124192 4723 scope.go:117] "RemoveContainer" 
containerID="74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.147004 4723 scope.go:117] "RemoveContainer" containerID="ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9" Mar 09 13:03:18 crc kubenswrapper[4723]: E0309 13:03:18.147532 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9\": container with ID starting with ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9 not found: ID does not exist" containerID="ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.147628 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9"} err="failed to get container status \"ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9\": rpc error: code = NotFound desc = could not find container \"ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9\": container with ID starting with ea9771677551554a5d85cade16547356c2335fa472e5b66b2b18d44a5b5d45c9 not found: ID does not exist" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.147665 4723 scope.go:117] "RemoveContainer" containerID="205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7" Mar 09 13:03:18 crc kubenswrapper[4723]: E0309 13:03:18.148261 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\": container with ID starting with 205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7 not found: ID does not exist" containerID="205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.148302 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7"} err="failed to get container status \"205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\": rpc error: code = NotFound desc = could not find container \"205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7\": container with ID starting with 205d4f189f899b3034951f689ebef180c2458b451e1f57ab0ef27b29e67dbaa7 not found: ID does not exist" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.148330 4723 scope.go:117] "RemoveContainer" containerID="00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1" Mar 09 13:03:18 crc kubenswrapper[4723]: E0309 13:03:18.149536 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\": container with ID starting with 00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1 not found: ID does not exist" containerID="00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1" Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.149602 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1"} err="failed to get container status \"00f4379e4ac6d21a4758b911da13d7b8a6aa609ad83c71949d562d0e1caa4fd1\": rpc error: code = 
Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.149643 4723 scope.go:117] "RemoveContainer" containerID="01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140"
Mar 09 13:03:18 crc kubenswrapper[4723]: E0309 13:03:18.150122 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="400ms"
Mar 09 13:03:18 crc kubenswrapper[4723]: E0309 13:03:18.150641 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\": container with ID starting with 01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140 not found: ID does not exist" containerID="01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140"
Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.150932 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140"} err="failed to get container status \"01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\": rpc error: code = NotFound desc = could not find container \"01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140\": container with ID starting with 01a017defca19585da32be31eb9febc8f9f61d5fb75c0dcf0d5c22b61cdab140 not found: ID does not exist"
Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.151067 4723 scope.go:117] "RemoveContainer" containerID="1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f"
Mar 09 13:03:18 crc kubenswrapper[4723]: E0309 13:03:18.151666 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\": container with ID starting with 1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f not found: ID does not exist" containerID="1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f"
Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.151708 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f"} err="failed to get container status \"1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\": rpc error: code = NotFound desc = could not find container \"1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f\": container with ID starting with 1cd14b0d693ce3053cb1380394a92d1c13d70d1b1282332a14faa01a3dd6718f not found: ID does not exist"
Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.151737 4723 scope.go:117] "RemoveContainer" containerID="74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9"
Mar 09 13:03:18 crc kubenswrapper[4723]: E0309 13:03:18.152329 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\": container with ID starting with 74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9 not found: ID does not exist" containerID="74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9"
Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.152366 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9"} err="failed to get container status \"74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\": rpc error: code = NotFound desc = could not find container \"74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9\": container with ID starting with 74cb5ef5fe403266765720f4303f6906afa9c7c090f8051ae7829f9bc241dbc9 not found: ID does not exist"
Mar 09 13:03:18 crc kubenswrapper[4723]: E0309 13:03:18.551807 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="800ms"
Mar 09 13:03:18 crc kubenswrapper[4723]: I0309 13:03:18.889897 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Mar 09 13:03:19 crc kubenswrapper[4723]: E0309 13:03:19.353440 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="1.6s"
Mar 09 13:03:20 crc kubenswrapper[4723]: E0309 13:03:20.954178 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="3.2s"
Mar 09 13:03:24 crc kubenswrapper[4723]: E0309 13:03:24.154740 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="6.4s"
Mar 09 13:03:25 crc kubenswrapper[4723]: E0309 13:03:25.553008 4723 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.129:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b2df1f9886486 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:03:15.597255814 +0000 UTC m=+269.611723354,LastTimestamp:2026-03-09 13:03:15.597255814 +0000 UTC m=+269.611723354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
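The "Unable to write event (may retry after sleeping)" entry at 13:03:25 shows the client-go event machinery buffering a Reason:Pulled event for the startup-monitor pod until the POST to the apiserver can succeed. A sketch of that broadcaster-to-sink wiring, assuming a clientset cs built elsewhere:

```go
package main

import (
	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/kubernetes/scheme"
	typedcorev1 "k8s.io/client-go/kubernetes/typed/core/v1"
	"k8s.io/client-go/tools/record"
)

// newRecorder wires a local event recorder to an API-server sink. Events
// are queued in-process; the sink posts them and, on failure, sleeps and
// retries, which is what "may retry after sleeping" refers to.
func newRecorder(cs kubernetes.Interface) record.EventRecorder {
	b := record.NewBroadcaster()
	b.StartRecordingToSink(&typedcorev1.EventSinkImpl{Interface: cs.CoreV1().Events("")})
	return b.NewRecorder(scheme.Scheme, corev1.EventSource{Component: "kubelet", Host: "crc"})
}

func main() {
	// Building a real clientset needs kubeconfig/in-cluster config; omitted.
	// Usage would look like:
	//   rec.Event(pod, corev1.EventTypeNormal, "Pulled", "image already present on machine")
	_ = newRecorder
}
```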
Mar 09 13:03:26 crc kubenswrapper[4723]: I0309 13:03:26.890635 4723 status_manager.go:851] "Failed to get status for pod" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused"
Mar 09 13:03:28 crc kubenswrapper[4723]: E0309 13:03:28.675599 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:03:28Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:03:28Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:03:28Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:03:28Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused"
Mar 09 13:03:28 crc kubenswrapper[4723]: E0309 13:03:28.676560 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused"
Mar 09 13:03:28 crc kubenswrapper[4723]: E0309 13:03:28.677123 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused"
Mar 09 13:03:28 crc kubenswrapper[4723]: E0309 13:03:28.677657 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused"
Mar 09 13:03:28 crc kubenswrapper[4723]: E0309 13:03:28.678191 4723 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused"
Mar 09 13:03:28 crc kubenswrapper[4723]: E0309 13:03:28.678249 4723 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 09 13:03:28 crc kubenswrapper[4723]: I0309 13:03:28.879990 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:03:28 crc kubenswrapper[4723]: I0309 13:03:28.881102 4723 status_manager.go:851] "Failed to get status for pod" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused"
Mar 09 13:03:28 crc kubenswrapper[4723]: I0309 13:03:28.893976 4723 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea"
Mar 09 13:03:28 crc kubenswrapper[4723]: I0309 13:03:28.894008 4723 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea"
Mar 09 13:03:28 crc kubenswrapper[4723]: E0309 13:03:28.894318 4723 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:03:28 crc kubenswrapper[4723]: I0309 13:03:28.894708 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:03:29 crc kubenswrapper[4723]: I0309 13:03:29.102886 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c8b1d09cdcd21c0b5b54bdf59d88ee4d7fb307e9f51ac815cd5ace7862fa5aab"}
Mar 09 13:03:29 crc kubenswrapper[4723]: I0309 13:03:29.105134 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 09 13:03:29 crc kubenswrapper[4723]: I0309 13:03:29.106330 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 09 13:03:29 crc kubenswrapper[4723]: I0309 13:03:29.106364 4723 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c9d14cd0710057eb4ca03d616f762c6dbd472b8bf45c04180d00387292000f19" exitCode=1
Mar 09 13:03:29 crc kubenswrapper[4723]: I0309 13:03:29.106383 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c9d14cd0710057eb4ca03d616f762c6dbd472b8bf45c04180d00387292000f19"}
Mar 09 13:03:29 crc kubenswrapper[4723]: I0309 13:03:29.106787 4723 scope.go:117] "RemoveContainer" containerID="c9d14cd0710057eb4ca03d616f762c6dbd472b8bf45c04180d00387292000f19"
Mar 09 13:03:29 crc kubenswrapper[4723]: I0309 13:03:29.107013 4723 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.129:6443: connect: connection refused"
Mar 09 13:03:29 crc kubenswrapper[4723]: I0309 13:03:29.107458 4723 status_manager.go:851] "Failed to get status for pod" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused"
Mar 09 13:03:29 crc kubenswrapper[4723]: I0309 13:03:29.174207 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 13:03:29 crc kubenswrapper[4723]: I0309 13:03:29.514251 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 13:03:29 crc kubenswrapper[4723]: I0309 13:03:29.832826 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 13:03:30 crc kubenswrapper[4723]: I0309 13:03:30.123045 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 09 13:03:30 crc kubenswrapper[4723]: I0309 13:03:30.125424 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 09 13:03:30 crc kubenswrapper[4723]: I0309 13:03:30.125576 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a197c3e351d4660b3b52aa111553133695c0af3d102e08f16ac8085396159b7f"}
Mar 09 13:03:30 crc kubenswrapper[4723]: I0309 13:03:30.127836 4723 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="9b7358c00d2e92eca1aa15d383186db8ac29032c113d3445c62c9dfefb5c62db" exitCode=0
Mar 09 13:03:30 crc kubenswrapper[4723]: I0309 13:03:30.127812 4723 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.129:6443: connect: connection refused"
Mar 09 13:03:30 crc kubenswrapper[4723]: I0309 13:03:30.127951 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"9b7358c00d2e92eca1aa15d383186db8ac29032c113d3445c62c9dfefb5c62db"}
Mar 09 13:03:30 crc kubenswrapper[4723]: I0309 13:03:30.128416 4723 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea"
Mar 09 13:03:30 crc kubenswrapper[4723]: I0309 13:03:30.128452 4723 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea"
Mar 09 13:03:30 crc kubenswrapper[4723]: I0309 13:03:30.128812 4723 status_manager.go:851] "Failed to get status for pod" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused"
Mar 09 13:03:30 crc kubenswrapper[4723]: E0309 13:03:30.129312 4723 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:03:30 crc kubenswrapper[4723]: I0309 13:03:30.129592 4723 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.129:6443: connect: connection refused"
Mar 09 13:03:30 crc kubenswrapper[4723]: I0309 13:03:30.130028 4723 status_manager.go:851] "Failed to get status for pod" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused"
Mar 09 13:03:30 crc kubenswrapper[4723]: E0309 13:03:30.556011 4723 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="7s"
Mar 09 13:03:31 crc kubenswrapper[4723]: I0309 13:03:31.136134 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"83acff7936183e3735cee3efec9275156251648165ca1b66963f672c9cf9b4d4"}
Mar 09 13:03:31 crc kubenswrapper[4723]: I0309 13:03:31.136217 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"17e5448b4e8a7c990c2c22ff8b7713ff2f5d89b0020103ce37c2e7cc2d39b339"}
Mar 09 13:03:31 crc kubenswrapper[4723]: I0309 13:03:31.136239 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5d4fff5e6ee803c8e166d8f2d50620aa3d407e41139b3d92a5ba8304c80397c9"}
Mar 09 13:03:32 crc kubenswrapper[4723]: I0309 13:03:32.144988 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a965fc465a011584c1946903db7be4e99f05a2670a5b5e7758f257a23e4dd40d"}
Mar 09 13:03:32 crc kubenswrapper[4723]: I0309 13:03:32.145432 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0d3f5bf3c8a62be4c15fcb3b1d66504a0d43d04599ac2089a6bdfc72fcd8e1f4"}
Mar 09 13:03:32 crc kubenswrapper[4723]: I0309 13:03:32.145473 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:03:32 crc kubenswrapper[4723]: I0309 13:03:32.145473 4723 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea"
Mar 09 13:03:32 crc kubenswrapper[4723]: I0309 13:03:32.145510 4723 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea"
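kube-apiserver-crc is a static pod: it is defined by a manifest file on disk, and the API object for it is only a "mirror" the kubelet maintains. The repeated "Trying to delete pod" / "Deleting a mirror pod" attempts above are the kubelet discarding the stale mirror (podUID 1e77b3d3-…) so a mirror matching the new manifest can be created, an operation that keeps failing while 6443 refuses connections. A sketch of what that delete amounts to, under assumptions (the helper name and UID-precondition wiring are illustrative, not kubelet source):

```go
package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/types"
	"k8s.io/client-go/kubernetes"
)

// deleteStaleMirror removes the API-server copy of a file-defined static
// pod. The UID precondition ensures only the stale mirror is deleted: if a
// newer mirror already exists under the same name, the delete is rejected.
func deleteStaleMirror(cs kubernetes.Interface, ns, name string, staleUID types.UID) error {
	return cs.CoreV1().Pods(ns).Delete(context.TODO(), name, metav1.DeleteOptions{
		Preconditions: metav1.NewUIDPreconditions(string(staleUID)),
	})
}

func main() {
	// Wiring a real clientset needs kubeconfig/in-cluster config; omitted.
	_ = deleteStaleMirror
}
```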
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea" Mar 09 13:03:33 crc kubenswrapper[4723]: I0309 13:03:33.895511 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:33 crc kubenswrapper[4723]: I0309 13:03:33.895936 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:33 crc kubenswrapper[4723]: I0309 13:03:33.902812 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:33 crc kubenswrapper[4723]: I0309 13:03:33.947077 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:03:33 crc kubenswrapper[4723]: I0309 13:03:33.947180 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:03:37 crc kubenswrapper[4723]: I0309 13:03:37.157963 4723 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:37 crc kubenswrapper[4723]: I0309 13:03:37.218159 4723 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="920edd76-c1c9-47de-a168-57453595607a" Mar 09 13:03:38 crc kubenswrapper[4723]: I0309 13:03:38.192058 4723 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea" Mar 09 13:03:38 crc kubenswrapper[4723]: I0309 13:03:38.192092 4723 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea" Mar 09 13:03:38 crc kubenswrapper[4723]: I0309 13:03:38.196134 4723 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="920edd76-c1c9-47de-a168-57453595607a" Mar 09 13:03:38 crc kubenswrapper[4723]: I0309 13:03:38.198157 4723 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://5d4fff5e6ee803c8e166d8f2d50620aa3d407e41139b3d92a5ba8304c80397c9" Mar 09 13:03:38 crc kubenswrapper[4723]: I0309 13:03:38.198181 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:39 crc kubenswrapper[4723]: I0309 13:03:39.174290 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:03:39 crc kubenswrapper[4723]: I0309 13:03:39.198830 4723 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea" Mar 09 13:03:39 crc kubenswrapper[4723]: I0309 13:03:39.198908 4723 mirror_client.go:130] "Deleting a 
mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1e77b3d3-85bc-48dd-b287-a7ec3bb7d2ea" Mar 09 13:03:39 crc kubenswrapper[4723]: I0309 13:03:39.202958 4723 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="920edd76-c1c9-47de-a168-57453595607a" Mar 09 13:03:39 crc kubenswrapper[4723]: I0309 13:03:39.513921 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:03:39 crc kubenswrapper[4723]: I0309 13:03:39.525388 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:03:40 crc kubenswrapper[4723]: I0309 13:03:40.209546 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:03:47 crc kubenswrapper[4723]: I0309 13:03:47.447479 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:03:47 crc kubenswrapper[4723]: I0309 13:03:47.664004 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:03:47 crc kubenswrapper[4723]: I0309 13:03:47.841498 4723 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 13:03:47 crc kubenswrapper[4723]: I0309 13:03:47.921259 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 09 13:03:47 crc kubenswrapper[4723]: I0309 13:03:47.988780 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 09 13:03:48 crc kubenswrapper[4723]: I0309 13:03:48.024826 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 09 13:03:48 crc kubenswrapper[4723]: I0309 13:03:48.024955 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 13:03:48 crc kubenswrapper[4723]: I0309 13:03:48.063841 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 13:03:48 crc kubenswrapper[4723]: I0309 13:03:48.120592 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 13:03:48 crc kubenswrapper[4723]: I0309 13:03:48.137243 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 09 13:03:48 crc kubenswrapper[4723]: I0309 13:03:48.393083 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 09 13:03:48 crc kubenswrapper[4723]: I0309 13:03:48.760447 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 13:03:48 crc kubenswrapper[4723]: I0309 13:03:48.965151 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 13:03:49 crc kubenswrapper[4723]: I0309 13:03:49.241071 4723 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 09 13:03:49 crc kubenswrapper[4723]: I0309 13:03:49.263396 4723 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 09 13:03:49 crc kubenswrapper[4723]: I0309 13:03:49.381671 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 09 13:03:49 crc kubenswrapper[4723]: I0309 13:03:49.439448 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 13:03:49 crc kubenswrapper[4723]: I0309 13:03:49.449883 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 09 13:03:49 crc kubenswrapper[4723]: I0309 13:03:49.546077 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 13:03:49 crc kubenswrapper[4723]: I0309 13:03:49.588504 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 09 13:03:49 crc kubenswrapper[4723]: I0309 13:03:49.596815 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 09 13:03:49 crc kubenswrapper[4723]: I0309 13:03:49.607493 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 09 13:03:49 crc kubenswrapper[4723]: I0309 13:03:49.610701 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 09 13:03:49 crc kubenswrapper[4723]: I0309 13:03:49.693098 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 13:03:49 crc kubenswrapper[4723]: I0309 13:03:49.747742 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 13:03:49 crc kubenswrapper[4723]: I0309 13:03:49.898474 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 09 13:03:49 crc kubenswrapper[4723]: I0309 13:03:49.981277 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 13:03:50 crc kubenswrapper[4723]: I0309 13:03:50.027724 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 09 13:03:50 crc kubenswrapper[4723]: I0309 13:03:50.081968 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 09 13:03:50 crc kubenswrapper[4723]: I0309 13:03:50.149341 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 09 13:03:50 crc kubenswrapper[4723]: I0309 13:03:50.190347 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 09 13:03:50 crc kubenswrapper[4723]: I0309 13:03:50.315402 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 09 13:03:50 crc kubenswrapper[4723]: I0309 13:03:50.578641 4723 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 09 13:03:50 crc kubenswrapper[4723]: I0309 13:03:50.608687 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 09 13:03:50 crc kubenswrapper[4723]: I0309 13:03:50.664502 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 09 13:03:50 crc kubenswrapper[4723]: I0309 13:03:50.684132 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 13:03:50 crc kubenswrapper[4723]: I0309 13:03:50.762152 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 09 13:03:50 crc kubenswrapper[4723]: I0309 13:03:50.778846 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.046051 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.056578 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.066502 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.091621 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.270823 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.280386 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.305582 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.316990 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.338811 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.360931 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.606200 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.620777 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.651790 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.707260 4723 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 09 13:03:51 crc kubenswrapper[4723]: I0309 13:03:51.915565 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.045289 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.084832 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.197635 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.257394 4723 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.291721 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.338311 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.450375 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.475498 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.478255 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.602221 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.623464 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.719007 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.758536 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.922816 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 09 13:03:52 crc kubenswrapper[4723]: I0309 13:03:52.955849 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.060994 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.078191 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.083736 4723 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.114304 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.135185 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.232221 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.245594 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.320853 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.349359 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.601883 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.624221 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.629923 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.719092 4723 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.723146 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.723193 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.727930 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.742597 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.744575 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.744562424 podStartE2EDuration="16.744562424s" podCreationTimestamp="2026-03-09 13:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:03:53.739838565 +0000 UTC m=+307.754306105" watchObservedRunningTime="2026-03-09 13:03:53.744562424 +0000 UTC m=+307.759029964" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.776017 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.837273 4723 reflector.go:368] Caches populated for *v1.ConfigMap from 
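The long run of "Caches populated for *v1.Secret/*v1.ConfigMap from object-..." lines is the kubelet's reflectors re-listing the secrets and configmaps its pods mount, one watch per object, now that the apiserver answers again; each line marks one local cache catching up. A minimal client-go sketch of the same list-watch-then-sync cycle (the kubeconfig path is an assumption for the demo):

```go
package main

import (
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	stop := make(chan struct{})
	defer close(stop)

	// One shared LIST+WATCH per type, mirrored into a local store.
	factory := informers.NewSharedInformerFactory(cs, 10*time.Minute)
	secrets := factory.Core().V1().Secrets().Informer()
	factory.Start(stop)

	// Blocks until the initial LIST has filled the cache, the point at
	// which the kubelet logs "Caches populated for *v1.Secret ...".
	if !cache.WaitForCacheSync(stop, secrets.HasSynced) {
		panic("cache never synced")
	}
	fmt.Println("caches populated")
}
```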
object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.861036 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.883099 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.885528 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.932000 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.948236 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 09 13:03:53 crc kubenswrapper[4723]: I0309 13:03:53.953682 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.021065 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.040311 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.043810 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.046224 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.055055 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.176722 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.222715 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.294954 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.296841 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.319160 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.340271 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.352388 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.397232 4723 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.514303 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.563719 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.609983 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.686496 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.748925 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.749173 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.767807 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.796056 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.801084 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.879313 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.935348 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.954308 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 09 13:03:54 crc kubenswrapper[4723]: I0309 13:03:54.956665 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.179598 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.237249 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.334330 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.380628 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.404417 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.434964 4723 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.539193 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.579521 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.670127 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.711354 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.771641 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.774322 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.786280 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.806900 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.839168 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.853291 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.863125 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 09 13:03:55 crc kubenswrapper[4723]: I0309 13:03:55.873835 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.007563 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.047699 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.059911 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.202967 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.246777 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.261572 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 
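
The reflector.go:368 lines above mark the moment each of the kubelet's per-object watches finishes its initial LIST: every ConfigMap and Secret a pod mounts gets its own cache, and the pod cannot start until those caches are warm. A minimal sketch of the same client-go mechanism, assuming a SharedInformerFactory and a kubeconfig path that are illustrative rather than the kubelet's actual wiring:

    // Sketch: one shared informer per resource type; WaitForCacheSync
    // returns once the initial LIST has populated each cache -- the
    // point at which the kubelet logs "Caches populated".
    package main

    import (
        "fmt"
        "time"

        "k8s.io/client-go/informers"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Kubeconfig path is illustrative, not the kubelet's own wiring.
        cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)

        factory := informers.NewSharedInformerFactory(cs, 10*time.Minute)
        factory.Core().V1().ConfigMaps().Informer() // registers a *v1.ConfigMap reflector
        factory.Core().V1().Secrets().Informer()    // registers a *v1.Secret reflector

        stop := make(chan struct{})
        defer close(stop)
        factory.Start(stop)

        for typ, synced := range factory.WaitForCacheSync(stop) {
            fmt.Printf("caches populated for %v: %v\n", typ, synced)
        }
    }
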
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.285241 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.368375 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.415418 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995"]
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.415708 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" podUID="5772ae4e-fbd1-4d9c-bbe3-a92189b261ff" containerName="route-controller-manager" containerID="cri-o://3c68982cacd76b0c0cb7b62ac568c4026e91cf838d55abe0f7a83432783f1cb8" gracePeriod=30
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.419817 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b"]
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.420040 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" podUID="e6f5147d-bdf7-48e5-881d-0aad778e319f" containerName="controller-manager" containerID="cri-o://b7863751366fe9e17fde27da6ce25afa3b4cf12d630d11bc9f82259923106811" gracePeriod=30
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.499442 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.665237 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.772174 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.792975 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.915636 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.920927 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b"
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.926082 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995"
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.946083 4723 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.967369 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.968459 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-client-ca\") pod \"e6f5147d-bdf7-48e5-881d-0aad778e319f\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") "
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.968496 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-client-ca\") pod \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") "
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.968551 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrkxt\" (UniqueName: \"kubernetes.io/projected/e6f5147d-bdf7-48e5-881d-0aad778e319f-kube-api-access-xrkxt\") pod \"e6f5147d-bdf7-48e5-881d-0aad778e319f\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") "
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.968581 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-config\") pod \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") "
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.968969 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-serving-cert\") pod \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") "
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.968995 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-config\") pod \"e6f5147d-bdf7-48e5-881d-0aad778e319f\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") "
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.969036 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc9x9\" (UniqueName: \"kubernetes.io/projected/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-kube-api-access-vc9x9\") pod \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\" (UID: \"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff\") "
Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.969064 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-proxy-ca-bundles\") pod \"e6f5147d-bdf7-48e5-881d-0aad778e319f\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") "
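
"SyncLoop DELETE" followed by "Killing container with a grace period ... gracePeriod=30" is the API-initiated deletion path: the apiserver marks the pod for deletion, the kubelet asks the runtime to stop the container within the pod's grace period, and SIGKILL follows only if the container outlives it. A hedged sketch of issuing such a delete with client-go (namespace and pod name are taken from the log; the program itself is illustrative):

    // Sketch: an API-initiated pod delete with an explicit grace period,
    // mirroring the "SyncLoop DELETE" + gracePeriod=30 pair above.
    package main

    import (
        "context"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)

        // 30s matches the log; the runtime delivers SIGTERM first and
        // escalates to SIGKILL only when the deadline expires.
        grace := int64(30)
        if err := cs.CoreV1().Pods("openshift-controller-manager").Delete(
            context.TODO(),
            "controller-manager-c8dbfdf94-gvc8b",
            metav1.DeleteOptions{GracePeriodSeconds: &grace},
        ); err != nil {
            panic(err)
        }
    }
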
\"e6f5147d-bdf7-48e5-881d-0aad778e319f\" (UID: \"e6f5147d-bdf7-48e5-881d-0aad778e319f\") " Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.969983 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-client-ca" (OuterVolumeSpecName: "client-ca") pod "5772ae4e-fbd1-4d9c-bbe3-a92189b261ff" (UID: "5772ae4e-fbd1-4d9c-bbe3-a92189b261ff"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.970040 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-config" (OuterVolumeSpecName: "config") pod "5772ae4e-fbd1-4d9c-bbe3-a92189b261ff" (UID: "5772ae4e-fbd1-4d9c-bbe3-a92189b261ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.970765 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-client-ca" (OuterVolumeSpecName: "client-ca") pod "e6f5147d-bdf7-48e5-881d-0aad778e319f" (UID: "e6f5147d-bdf7-48e5-881d-0aad778e319f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.971033 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-config" (OuterVolumeSpecName: "config") pod "e6f5147d-bdf7-48e5-881d-0aad778e319f" (UID: "e6f5147d-bdf7-48e5-881d-0aad778e319f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.971555 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e6f5147d-bdf7-48e5-881d-0aad778e319f" (UID: "e6f5147d-bdf7-48e5-881d-0aad778e319f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.971903 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.975033 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f5147d-bdf7-48e5-881d-0aad778e319f-kube-api-access-xrkxt" (OuterVolumeSpecName: "kube-api-access-xrkxt") pod "e6f5147d-bdf7-48e5-881d-0aad778e319f" (UID: "e6f5147d-bdf7-48e5-881d-0aad778e319f"). InnerVolumeSpecName "kube-api-access-xrkxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.975132 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f5147d-bdf7-48e5-881d-0aad778e319f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e6f5147d-bdf7-48e5-881d-0aad778e319f" (UID: "e6f5147d-bdf7-48e5-881d-0aad778e319f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.975298 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-kube-api-access-vc9x9" (OuterVolumeSpecName: "kube-api-access-vc9x9") pod "5772ae4e-fbd1-4d9c-bbe3-a92189b261ff" (UID: "5772ae4e-fbd1-4d9c-bbe3-a92189b261ff"). InnerVolumeSpecName "kube-api-access-vc9x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.975508 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5772ae4e-fbd1-4d9c-bbe3-a92189b261ff" (UID: "5772ae4e-fbd1-4d9c-bbe3-a92189b261ff"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:03:56 crc kubenswrapper[4723]: I0309 13:03:56.978844 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.006105 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.008220 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.022173 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.035153 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.069934 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.069972 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.069988 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.070034 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc9x9\" (UniqueName: \"kubernetes.io/projected/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-kube-api-access-vc9x9\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.070049 4723 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.070063 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f5147d-bdf7-48e5-881d-0aad778e319f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.070074 4723 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6f5147d-bdf7-48e5-881d-0aad778e319f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.070086 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.070098 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrkxt\" (UniqueName: \"kubernetes.io/projected/e6f5147d-bdf7-48e5-881d-0aad778e319f-kube-api-access-xrkxt\") on node \"crc\" DevicePath \"\"" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.197483 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.271384 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.297001 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.318146 4723 generic.go:334] "Generic (PLEG): container finished" podID="e6f5147d-bdf7-48e5-881d-0aad778e319f" containerID="b7863751366fe9e17fde27da6ce25afa3b4cf12d630d11bc9f82259923106811" exitCode=0 Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.318258 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" event={"ID":"e6f5147d-bdf7-48e5-881d-0aad778e319f","Type":"ContainerDied","Data":"b7863751366fe9e17fde27da6ce25afa3b4cf12d630d11bc9f82259923106811"} Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.318271 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.318299 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b" event={"ID":"e6f5147d-bdf7-48e5-881d-0aad778e319f","Type":"ContainerDied","Data":"b892dc3749810158766c22de3b1b99704a89a65c1cfdd182f4e305c5ab9a9513"} Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.318331 4723 scope.go:117] "RemoveContainer" containerID="b7863751366fe9e17fde27da6ce25afa3b4cf12d630d11bc9f82259923106811" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.321596 4723 generic.go:334] "Generic (PLEG): container finished" podID="5772ae4e-fbd1-4d9c-bbe3-a92189b261ff" containerID="3c68982cacd76b0c0cb7b62ac568c4026e91cf838d55abe0f7a83432783f1cb8" exitCode=0 Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.321641 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" event={"ID":"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff","Type":"ContainerDied","Data":"3c68982cacd76b0c0cb7b62ac568c4026e91cf838d55abe0f7a83432783f1cb8"} Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.321669 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" event={"ID":"5772ae4e-fbd1-4d9c-bbe3-a92189b261ff","Type":"ContainerDied","Data":"4474729d287066553ffd0717364c6dd2727bc9ee588c77ca743d301e590c9b79"} Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.321758 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.339466 4723 scope.go:117] "RemoveContainer" containerID="b7863751366fe9e17fde27da6ce25afa3b4cf12d630d11bc9f82259923106811" Mar 09 13:03:57 crc kubenswrapper[4723]: E0309 13:03:57.340307 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7863751366fe9e17fde27da6ce25afa3b4cf12d630d11bc9f82259923106811\": container with ID starting with b7863751366fe9e17fde27da6ce25afa3b4cf12d630d11bc9f82259923106811 not found: ID does not exist" containerID="b7863751366fe9e17fde27da6ce25afa3b4cf12d630d11bc9f82259923106811" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.340363 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7863751366fe9e17fde27da6ce25afa3b4cf12d630d11bc9f82259923106811"} err="failed to get container status \"b7863751366fe9e17fde27da6ce25afa3b4cf12d630d11bc9f82259923106811\": rpc error: code = NotFound desc = could not find container \"b7863751366fe9e17fde27da6ce25afa3b4cf12d630d11bc9f82259923106811\": container with ID starting with b7863751366fe9e17fde27da6ce25afa3b4cf12d630d11bc9f82259923106811 not found: ID does not exist" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.340394 4723 scope.go:117] "RemoveContainer" containerID="3c68982cacd76b0c0cb7b62ac568c4026e91cf838d55abe0f7a83432783f1cb8" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.363142 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b"] Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.370584 4723 scope.go:117] "RemoveContainer" 
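
The E-level "ContainerStatus from runtime service failed ... NotFound" immediately after "RemoveContainer" is benign: the container was already removed, so the follow-up status query fails with NotFound and the deletor logs the error and moves on. A sketch of that idempotent handling, with a hypothetical stand-in for the CRI call (the function and id below are illustrative, not the kubelet's code):

    // Sketch: tolerating NotFound from a status RPC after removal.
    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // containerStatus stands in for the CRI ContainerStatus RPC; here it
    // always fails the way the log shows once the container is gone.
    func containerStatus(id string) error {
        return status.Error(codes.NotFound, "could not find container "+id)
    }

    func main() {
        if err := containerStatus("example-container-id"); status.Code(err) == codes.NotFound {
            // Already deleted -- the end state RemoveContainer wanted --
            // so log and continue instead of treating it as a failure.
            fmt.Println("container already removed; nothing to do")
        }
    }
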
containerID="3c68982cacd76b0c0cb7b62ac568c4026e91cf838d55abe0f7a83432783f1cb8" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.373290 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-c8dbfdf94-gvc8b"] Mar 09 13:03:57 crc kubenswrapper[4723]: E0309 13:03:57.376033 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c68982cacd76b0c0cb7b62ac568c4026e91cf838d55abe0f7a83432783f1cb8\": container with ID starting with 3c68982cacd76b0c0cb7b62ac568c4026e91cf838d55abe0f7a83432783f1cb8 not found: ID does not exist" containerID="3c68982cacd76b0c0cb7b62ac568c4026e91cf838d55abe0f7a83432783f1cb8" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.376092 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c68982cacd76b0c0cb7b62ac568c4026e91cf838d55abe0f7a83432783f1cb8"} err="failed to get container status \"3c68982cacd76b0c0cb7b62ac568c4026e91cf838d55abe0f7a83432783f1cb8\": rpc error: code = NotFound desc = could not find container \"3c68982cacd76b0c0cb7b62ac568c4026e91cf838d55abe0f7a83432783f1cb8\": container with ID starting with 3c68982cacd76b0c0cb7b62ac568c4026e91cf838d55abe0f7a83432783f1cb8 not found: ID does not exist" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.380735 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995"] Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.386937 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcbf754c-xl995"] Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.395014 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.464669 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.605179 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.606570 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-774cb675cc-zxwjq"] Mar 09 13:03:57 crc kubenswrapper[4723]: E0309 13:03:57.606977 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f5147d-bdf7-48e5-881d-0aad778e319f" containerName="controller-manager" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.607012 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f5147d-bdf7-48e5-881d-0aad778e319f" containerName="controller-manager" Mar 09 13:03:57 crc kubenswrapper[4723]: E0309 13:03:57.607034 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5772ae4e-fbd1-4d9c-bbe3-a92189b261ff" containerName="route-controller-manager" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.607046 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="5772ae4e-fbd1-4d9c-bbe3-a92189b261ff" containerName="route-controller-manager" Mar 09 13:03:57 crc kubenswrapper[4723]: E0309 13:03:57.607076 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" containerName="installer" Mar 09 13:03:57 crc 
kubenswrapper[4723]: I0309 13:03:57.607087 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" containerName="installer" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.607243 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6f5147d-bdf7-48e5-881d-0aad778e319f" containerName="controller-manager" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.607257 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="5772ae4e-fbd1-4d9c-bbe3-a92189b261ff" containerName="route-controller-manager" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.607277 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbe90cf6-9e4d-49b5-ac07-c9c88288d058" containerName="installer" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.607894 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.610915 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.611327 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"] Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.612266 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.613814 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.614018 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.616332 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.616432 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.616572 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.618721 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.618898 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.619260 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.620551 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.620906 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.625182 4723 
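
Here the replacement pods arrive: controller-manager-774cb675cc-zxwjq and route-controller-manager-7757f9dd75-k996n carry new pod-template hashes (774cb675cc and 7757f9dd75 versus the deleted c8dbfdf94 and 7fcbf754c), showing up as "SyncLoop ADD" while the CPU and memory manager purge stale pinning state for the deleted containers. A hedged sketch of observing the same pod ADD/DELETE stream with a client-go informer (illustrative, not the kubelet's SyncLoop itself):

    package main

    import (
        "fmt"
        "time"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/client-go/informers"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/cache"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)

        factory := informers.NewSharedInformerFactory(cs, 10*time.Minute)
        factory.Core().V1().Pods().Informer().AddEventHandler(cache.ResourceEventHandlerFuncs{
            AddFunc: func(obj interface{}) {
                p := obj.(*corev1.Pod)
                fmt.Printf("ADD %s/%s\n", p.Namespace, p.Name)
            },
            DeleteFunc: func(obj interface{}) {
                if p, ok := obj.(*corev1.Pod); ok {
                    fmt.Printf("DELETE %s/%s\n", p.Namespace, p.Name)
                }
            },
        })

        stop := make(chan struct{})
        factory.Start(stop)
        factory.WaitForCacheSync(stop)
        select {} // run until killed
    }
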
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.625182 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774cb675cc-zxwjq"]
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.626142 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.630522 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.654083 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"]
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.695100 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.711934 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.740042 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.779093 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af3980f7-9337-48f4-84f1-c311b36f47b0-serving-cert\") pod \"route-controller-manager-7757f9dd75-k996n\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.779150 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e0a1ce6-0cec-47d3-9082-8630337ae033-serving-cert\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.779189 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3980f7-9337-48f4-84f1-c311b36f47b0-config\") pod \"route-controller-manager-7757f9dd75-k996n\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.779232 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-config\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.779285 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbndp\" (UniqueName: \"kubernetes.io/projected/1e0a1ce6-0cec-47d3-9082-8630337ae033-kube-api-access-vbndp\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.779316 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af3980f7-9337-48f4-84f1-c311b36f47b0-client-ca\") pod \"route-controller-manager-7757f9dd75-k996n\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.779361 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-proxy-ca-bundles\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.779412 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m52xm\" (UniqueName: \"kubernetes.io/projected/af3980f7-9337-48f4-84f1-c311b36f47b0-kube-api-access-m52xm\") pod \"route-controller-manager-7757f9dd75-k996n\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.779448 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-client-ca\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.809851 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.842726 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.871318 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.881065 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbndp\" (UniqueName: \"kubernetes.io/projected/1e0a1ce6-0cec-47d3-9082-8630337ae033-kube-api-access-vbndp\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.881134 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af3980f7-9337-48f4-84f1-c311b36f47b0-client-ca\") pod \"route-controller-manager-7757f9dd75-k996n\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.881207 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-proxy-ca-bundles\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.881290 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m52xm\" (UniqueName: \"kubernetes.io/projected/af3980f7-9337-48f4-84f1-c311b36f47b0-kube-api-access-m52xm\") pod \"route-controller-manager-7757f9dd75-k996n\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.881377 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-client-ca\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.881455 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af3980f7-9337-48f4-84f1-c311b36f47b0-serving-cert\") pod \"route-controller-manager-7757f9dd75-k996n\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.881504 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e0a1ce6-0cec-47d3-9082-8630337ae033-serving-cert\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.881559 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3980f7-9337-48f4-84f1-c311b36f47b0-config\") pod \"route-controller-manager-7757f9dd75-k996n\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.881600 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-config\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.883000 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af3980f7-9337-48f4-84f1-c311b36f47b0-client-ca\") pod \"route-controller-manager-7757f9dd75-k996n\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.883152 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-client-ca\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.883326 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3980f7-9337-48f4-84f1-c311b36f47b0-config\") pod \"route-controller-manager-7757f9dd75-k996n\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.884020 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-config\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.884395 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-proxy-ca-bundles\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.885977 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e0a1ce6-0cec-47d3-9082-8630337ae033-serving-cert\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.886683 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af3980f7-9337-48f4-84f1-c311b36f47b0-serving-cert\") pod \"route-controller-manager-7757f9dd75-k996n\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.891797 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.904440 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbndp\" (UniqueName: \"kubernetes.io/projected/1e0a1ce6-0cec-47d3-9082-8630337ae033-kube-api-access-vbndp\") pod \"controller-manager-774cb675cc-zxwjq\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.905704 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m52xm\" (UniqueName: \"kubernetes.io/projected/af3980f7-9337-48f4-84f1-c311b36f47b0-kube-api-access-m52xm\") pod \"route-controller-manager-7757f9dd75-k996n\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.935392 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.947408 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.973039 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.973057 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 09 13:03:57 crc kubenswrapper[4723]: I0309 13:03:57.981448 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.021594 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.029158 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.031012 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.045946 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.048399 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.092565 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.184787 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.185070 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.186097 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.243800 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.397756 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.414253 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774cb675cc-zxwjq"]
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.425341 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.430493 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.438905 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"]
Mar 09 13:03:58 crc kubenswrapper[4723]: W0309 13:03:58.443462 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf3980f7_9337_48f4_84f1_c311b36f47b0.slice/crio-387771f3cfe35aaed1b9909f48de8a163ac3400dad17c1bd3c0761a0cb3a124b WatchSource:0}: Error finding container 387771f3cfe35aaed1b9909f48de8a163ac3400dad17c1bd3c0761a0cb3a124b: Status 404 returned error can't find the container with id 387771f3cfe35aaed1b9909f48de8a163ac3400dad17c1bd3c0761a0cb3a124b
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.495490 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.506936 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.606347 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.717036 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.752635 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.756574 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.890405 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5772ae4e-fbd1-4d9c-bbe3-a92189b261ff" path="/var/lib/kubelet/pods/5772ae4e-fbd1-4d9c-bbe3-a92189b261ff/volumes"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.891654 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6f5147d-bdf7-48e5-881d-0aad778e319f" path="/var/lib/kubelet/pods/e6f5147d-bdf7-48e5-881d-0aad778e319f/volumes"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.924875 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 09 13:03:58 crc kubenswrapper[4723]: I0309 13:03:58.954621 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.012550 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.054366 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.078362 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.112204 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.293039 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
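
Two cleanup details in this span: the W-level manager.go:1169 warning is most likely cadvisor racing the runtime, noticing the new sandbox's cgroup before CRI-O had registered the container, hence the 404 on lookup; and the "Cleaned up orphaned pod volumes dir" lines show the kubelet removing /var/lib/kubelet/pods/<uid>/volumes for the two pods deleted earlier, now that every volume beneath has been unmounted. A rough sketch of detecting such orphans (reporting only; the known-UID set and the decision to merely print are illustrative):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        // UIDs the kubelet still tracks (illustrative set).
        known := map[string]bool{"1e0a1ce6-0cec-47d3-9082-8630337ae033": true}

        entries, err := os.ReadDir("/var/lib/kubelet/pods")
        if err != nil {
            panic(err)
        }
        for _, e := range entries {
            if !e.IsDir() || known[e.Name()] {
                continue
            }
            // The real kubelet only removes a dir once all volumes under
            // it are unmounted; this sketch just reports candidates.
            fmt.Println("orphaned pod volumes dir:",
                filepath.Join("/var/lib/kubelet/pods", e.Name(), "volumes"))
        }
    }
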
event={"ID":"1e0a1ce6-0cec-47d3-9082-8630337ae033","Type":"ContainerStarted","Data":"de7d7145a898bf1cfc8de1683526a07a40b29f8a4b0ced80624f626a12d62e99"} Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.336274 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq" event={"ID":"1e0a1ce6-0cec-47d3-9082-8630337ae033","Type":"ContainerStarted","Data":"d0bdf9451fa415b6f6e98e44097627cf8e16f9b2e08fa5b8274d975fc9438a9a"} Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.336841 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.337681 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n" event={"ID":"af3980f7-9337-48f4-84f1-c311b36f47b0","Type":"ContainerStarted","Data":"6f030229611deafd7b4128b36586774314009b265b8cfbfd08a9a3a216b7d85b"} Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.337710 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n" event={"ID":"af3980f7-9337-48f4-84f1-c311b36f47b0","Type":"ContainerStarted","Data":"387771f3cfe35aaed1b9909f48de8a163ac3400dad17c1bd3c0761a0cb3a124b"} Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.338100 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.341425 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.343973 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.357535 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq" podStartSLOduration=3.3575107109999998 podStartE2EDuration="3.357510711s" podCreationTimestamp="2026-03-09 13:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:03:59.353353417 +0000 UTC m=+313.367820987" watchObservedRunningTime="2026-03-09 13:03:59.357510711 +0000 UTC m=+313.371978251" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.373164 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n" podStartSLOduration=3.373138065 podStartE2EDuration="3.373138065s" podCreationTimestamp="2026-03-09 13:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:03:59.372202981 +0000 UTC m=+313.386670561" watchObservedRunningTime="2026-03-09 13:03:59.373138065 +0000 UTC m=+313.387605625" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.392965 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.413630 4723 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.455656 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.469969 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.519159 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.570020 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.570673 4723 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.570909 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://511d841fbabc2fb0e18cf0547f7df362d41f2c44260b4d9921913408a3dbc401" gracePeriod=5 Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.602094 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.612073 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.699834 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.873684 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.921820 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.949503 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 09 13:03:59 crc kubenswrapper[4723]: I0309 13:03:59.973202 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.148493 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.161456 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551024-flwht"] Mar 09 13:04:00 crc kubenswrapper[4723]: E0309 13:04:00.161696 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.161712 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.161830 4723 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.162247 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551024-flwht" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.164800 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.165063 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.165268 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.176989 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551024-flwht"] Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.247699 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.313586 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzv4h\" (UniqueName: \"kubernetes.io/projected/1c990608-7d03-402a-a042-9db3b406ca16-kube-api-access-hzv4h\") pod \"auto-csr-approver-29551024-flwht\" (UID: \"1c990608-7d03-402a-a042-9db3b406ca16\") " pod="openshift-infra/auto-csr-approver-29551024-flwht" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.415269 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzv4h\" (UniqueName: \"kubernetes.io/projected/1c990608-7d03-402a-a042-9db3b406ca16-kube-api-access-hzv4h\") pod \"auto-csr-approver-29551024-flwht\" (UID: \"1c990608-7d03-402a-a042-9db3b406ca16\") " pod="openshift-infra/auto-csr-approver-29551024-flwht" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.444214 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzv4h\" (UniqueName: \"kubernetes.io/projected/1c990608-7d03-402a-a042-9db3b406ca16-kube-api-access-hzv4h\") pod \"auto-csr-approver-29551024-flwht\" (UID: \"1c990608-7d03-402a-a042-9db3b406ca16\") " pod="openshift-infra/auto-csr-approver-29551024-flwht" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.445743 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.515764 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551024-flwht" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.568416 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.657448 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.907404 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.922332 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.951946 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551024-flwht"] Mar 09 13:04:00 crc kubenswrapper[4723]: I0309 13:04:00.978472 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 09 13:04:01 crc kubenswrapper[4723]: I0309 13:04:01.107419 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 09 13:04:01 crc kubenswrapper[4723]: I0309 13:04:01.147683 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 09 13:04:01 crc kubenswrapper[4723]: I0309 13:04:01.292221 4723 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 09 13:04:01 crc kubenswrapper[4723]: I0309 13:04:01.344587 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 09 13:04:01 crc kubenswrapper[4723]: I0309 13:04:01.352270 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551024-flwht" event={"ID":"1c990608-7d03-402a-a042-9db3b406ca16","Type":"ContainerStarted","Data":"6693bf1c6425158b6ed23c2d3e67e79af379ee08a8355e33c49bff963948a8a9"} Mar 09 13:04:01 crc kubenswrapper[4723]: I0309 13:04:01.402464 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 13:04:01 crc kubenswrapper[4723]: I0309 13:04:01.517552 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 09 13:04:01 crc kubenswrapper[4723]: I0309 13:04:01.566172 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 09 13:04:01 crc kubenswrapper[4723]: I0309 13:04:01.716599 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 09 13:04:01 crc kubenswrapper[4723]: I0309 13:04:01.774307 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 09 13:04:01 crc kubenswrapper[4723]: I0309 13:04:01.822225 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 09 13:04:02 crc kubenswrapper[4723]: I0309 13:04:02.175644 4723 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 09 13:04:02 crc kubenswrapper[4723]: I0309 13:04:02.263661 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 13:04:02 crc kubenswrapper[4723]: I0309 13:04:02.346548 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 09 13:04:02 crc kubenswrapper[4723]: I0309 13:04:02.427478 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 13:04:02 crc kubenswrapper[4723]: I0309 13:04:02.537825 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 09 13:04:02 crc kubenswrapper[4723]: I0309 13:04:02.579072 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 09 13:04:02 crc kubenswrapper[4723]: I0309 13:04:02.626397 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 13:04:02 crc kubenswrapper[4723]: I0309 13:04:02.730327 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 09 13:04:02 crc kubenswrapper[4723]: I0309 13:04:02.807391 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 09 13:04:02 crc kubenswrapper[4723]: I0309 13:04:02.834480 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 09 13:04:03 crc kubenswrapper[4723]: I0309 13:04:03.129841 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 13:04:03 crc kubenswrapper[4723]: I0309 13:04:03.146792 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 09 13:04:03 crc kubenswrapper[4723]: I0309 13:04:03.341406 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 09 13:04:03 crc kubenswrapper[4723]: I0309 13:04:03.369934 4723 generic.go:334] "Generic (PLEG): container finished" podID="1c990608-7d03-402a-a042-9db3b406ca16" containerID="8a4b67848d5f5b3107170fd3a79e6d8f7dea820bfb2924ffc63d2832a8e2c6c1" exitCode=0 Mar 09 13:04:03 crc kubenswrapper[4723]: I0309 13:04:03.369970 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551024-flwht" event={"ID":"1c990608-7d03-402a-a042-9db3b406ca16","Type":"ContainerDied","Data":"8a4b67848d5f5b3107170fd3a79e6d8f7dea820bfb2924ffc63d2832a8e2c6c1"} Mar 09 13:04:03 crc kubenswrapper[4723]: I0309 13:04:03.435546 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 09 13:04:03 crc kubenswrapper[4723]: I0309 13:04:03.468332 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 13:04:03 crc kubenswrapper[4723]: I0309 13:04:03.947007 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:04:03 crc kubenswrapper[4723]: I0309 13:04:03.948277 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:04:03 crc kubenswrapper[4723]: I0309 13:04:03.948514 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:04:03 crc kubenswrapper[4723]: I0309 13:04:03.949785 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:04:03 crc kubenswrapper[4723]: I0309 13:04:03.950140 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1" gracePeriod=600 Mar 09 13:04:04 crc kubenswrapper[4723]: I0309 13:04:04.053434 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 09 13:04:04 crc kubenswrapper[4723]: I0309 13:04:04.224452 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 09 13:04:04 crc kubenswrapper[4723]: I0309 13:04:04.356962 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 09 13:04:04 crc kubenswrapper[4723]: I0309 13:04:04.380414 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1" exitCode=0 Mar 09 13:04:04 crc kubenswrapper[4723]: I0309 13:04:04.380637 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1"} Mar 09 13:04:04 crc kubenswrapper[4723]: I0309 13:04:04.380695 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"d054e197559b4879e59df42a68d1c798a7c319b81a2cd49030fdbc518b252634"} Mar 09 13:04:04 crc kubenswrapper[4723]: I0309 13:04:04.776686 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551024-flwht" Mar 09 13:04:04 crc kubenswrapper[4723]: I0309 13:04:04.886328 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzv4h\" (UniqueName: \"kubernetes.io/projected/1c990608-7d03-402a-a042-9db3b406ca16-kube-api-access-hzv4h\") pod \"1c990608-7d03-402a-a042-9db3b406ca16\" (UID: \"1c990608-7d03-402a-a042-9db3b406ca16\") " Mar 09 13:04:04 crc kubenswrapper[4723]: I0309 13:04:04.893020 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c990608-7d03-402a-a042-9db3b406ca16-kube-api-access-hzv4h" (OuterVolumeSpecName: "kube-api-access-hzv4h") pod "1c990608-7d03-402a-a042-9db3b406ca16" (UID: "1c990608-7d03-402a-a042-9db3b406ca16"). InnerVolumeSpecName "kube-api-access-hzv4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:04:04 crc kubenswrapper[4723]: I0309 13:04:04.988073 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzv4h\" (UniqueName: \"kubernetes.io/projected/1c990608-7d03-402a-a042-9db3b406ca16-kube-api-access-hzv4h\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.140575 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.140649 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.292053 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.292154 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.292183 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.292268 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.292187 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.292455 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.292482 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.292530 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.292626 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.292967 4723 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.292990 4723 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.293007 4723 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.293021 4723 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.299109 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.394196 4723 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.396627 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551024-flwht" event={"ID":"1c990608-7d03-402a-a042-9db3b406ca16","Type":"ContainerDied","Data":"6693bf1c6425158b6ed23c2d3e67e79af379ee08a8355e33c49bff963948a8a9"} Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.396687 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6693bf1c6425158b6ed23c2d3e67e79af379ee08a8355e33c49bff963948a8a9" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.396657 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551024-flwht" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.398434 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.398498 4723 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="511d841fbabc2fb0e18cf0547f7df362d41f2c44260b4d9921913408a3dbc401" exitCode=137 Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.398544 4723 scope.go:117] "RemoveContainer" containerID="511d841fbabc2fb0e18cf0547f7df362d41f2c44260b4d9921913408a3dbc401" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.398639 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.423541 4723 scope.go:117] "RemoveContainer" containerID="511d841fbabc2fb0e18cf0547f7df362d41f2c44260b4d9921913408a3dbc401" Mar 09 13:04:05 crc kubenswrapper[4723]: E0309 13:04:05.424325 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"511d841fbabc2fb0e18cf0547f7df362d41f2c44260b4d9921913408a3dbc401\": container with ID starting with 511d841fbabc2fb0e18cf0547f7df362d41f2c44260b4d9921913408a3dbc401 not found: ID does not exist" containerID="511d841fbabc2fb0e18cf0547f7df362d41f2c44260b4d9921913408a3dbc401" Mar 09 13:04:05 crc kubenswrapper[4723]: I0309 13:04:05.424384 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"511d841fbabc2fb0e18cf0547f7df362d41f2c44260b4d9921913408a3dbc401"} err="failed to get container status \"511d841fbabc2fb0e18cf0547f7df362d41f2c44260b4d9921913408a3dbc401\": rpc error: code = NotFound desc = could not find container \"511d841fbabc2fb0e18cf0547f7df362d41f2c44260b4d9921913408a3dbc401\": container with ID starting with 511d841fbabc2fb0e18cf0547f7df362d41f2c44260b4d9921913408a3dbc401 not found: ID does not exist" Mar 09 13:04:06 crc kubenswrapper[4723]: I0309 13:04:06.888778 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 09 13:04:16 crc kubenswrapper[4723]: I0309 13:04:16.402151 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774cb675cc-zxwjq"] Mar 09 13:04:16 crc kubenswrapper[4723]: I0309 13:04:16.409098 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq" podUID="1e0a1ce6-0cec-47d3-9082-8630337ae033" containerName="controller-manager" containerID="cri-o://de7d7145a898bf1cfc8de1683526a07a40b29f8a4b0ced80624f626a12d62e99" gracePeriod=30 Mar 09 13:04:16 crc kubenswrapper[4723]: I0309 13:04:16.418492 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"] Mar 09 13:04:16 crc kubenswrapper[4723]: I0309 13:04:16.419033 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n" podUID="af3980f7-9337-48f4-84f1-c311b36f47b0" containerName="route-controller-manager" containerID="cri-o://6f030229611deafd7b4128b36586774314009b265b8cfbfd08a9a3a216b7d85b" gracePeriod=30 Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.053611 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.148846 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m52xm\" (UniqueName: \"kubernetes.io/projected/af3980f7-9337-48f4-84f1-c311b36f47b0-kube-api-access-m52xm\") pod \"af3980f7-9337-48f4-84f1-c311b36f47b0\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.148889 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af3980f7-9337-48f4-84f1-c311b36f47b0-client-ca\") pod \"af3980f7-9337-48f4-84f1-c311b36f47b0\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.148911 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af3980f7-9337-48f4-84f1-c311b36f47b0-serving-cert\") pod \"af3980f7-9337-48f4-84f1-c311b36f47b0\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.148971 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3980f7-9337-48f4-84f1-c311b36f47b0-config\") pod \"af3980f7-9337-48f4-84f1-c311b36f47b0\" (UID: \"af3980f7-9337-48f4-84f1-c311b36f47b0\") " Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.149718 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3980f7-9337-48f4-84f1-c311b36f47b0-config" (OuterVolumeSpecName: "config") pod "af3980f7-9337-48f4-84f1-c311b36f47b0" (UID: "af3980f7-9337-48f4-84f1-c311b36f47b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.149740 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3980f7-9337-48f4-84f1-c311b36f47b0-client-ca" (OuterVolumeSpecName: "client-ca") pod "af3980f7-9337-48f4-84f1-c311b36f47b0" (UID: "af3980f7-9337-48f4-84f1-c311b36f47b0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.149940 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3980f7-9337-48f4-84f1-c311b36f47b0-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.149989 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af3980f7-9337-48f4-84f1-c311b36f47b0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.151178 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.154473 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3980f7-9337-48f4-84f1-c311b36f47b0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "af3980f7-9337-48f4-84f1-c311b36f47b0" (UID: "af3980f7-9337-48f4-84f1-c311b36f47b0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.154504 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3980f7-9337-48f4-84f1-c311b36f47b0-kube-api-access-m52xm" (OuterVolumeSpecName: "kube-api-access-m52xm") pod "af3980f7-9337-48f4-84f1-c311b36f47b0" (UID: "af3980f7-9337-48f4-84f1-c311b36f47b0"). InnerVolumeSpecName "kube-api-access-m52xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.250637 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-config\") pod \"1e0a1ce6-0cec-47d3-9082-8630337ae033\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.250756 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-client-ca\") pod \"1e0a1ce6-0cec-47d3-9082-8630337ae033\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.250798 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e0a1ce6-0cec-47d3-9082-8630337ae033-serving-cert\") pod \"1e0a1ce6-0cec-47d3-9082-8630337ae033\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.250826 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbndp\" (UniqueName: \"kubernetes.io/projected/1e0a1ce6-0cec-47d3-9082-8630337ae033-kube-api-access-vbndp\") pod \"1e0a1ce6-0cec-47d3-9082-8630337ae033\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.250906 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-proxy-ca-bundles\") pod \"1e0a1ce6-0cec-47d3-9082-8630337ae033\" (UID: \"1e0a1ce6-0cec-47d3-9082-8630337ae033\") " Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.251150 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m52xm\" (UniqueName: \"kubernetes.io/projected/af3980f7-9337-48f4-84f1-c311b36f47b0-kube-api-access-m52xm\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.251173 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af3980f7-9337-48f4-84f1-c311b36f47b0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.251829 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1e0a1ce6-0cec-47d3-9082-8630337ae033" (UID: "1e0a1ce6-0cec-47d3-9082-8630337ae033"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.251907 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-client-ca" (OuterVolumeSpecName: "client-ca") pod "1e0a1ce6-0cec-47d3-9082-8630337ae033" (UID: "1e0a1ce6-0cec-47d3-9082-8630337ae033"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.252236 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-config" (OuterVolumeSpecName: "config") pod "1e0a1ce6-0cec-47d3-9082-8630337ae033" (UID: "1e0a1ce6-0cec-47d3-9082-8630337ae033"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.253995 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e0a1ce6-0cec-47d3-9082-8630337ae033-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1e0a1ce6-0cec-47d3-9082-8630337ae033" (UID: "1e0a1ce6-0cec-47d3-9082-8630337ae033"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.254486 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e0a1ce6-0cec-47d3-9082-8630337ae033-kube-api-access-vbndp" (OuterVolumeSpecName: "kube-api-access-vbndp") pod "1e0a1ce6-0cec-47d3-9082-8630337ae033" (UID: "1e0a1ce6-0cec-47d3-9082-8630337ae033"). InnerVolumeSpecName "kube-api-access-vbndp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.352217 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.352247 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e0a1ce6-0cec-47d3-9082-8630337ae033-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.352256 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbndp\" (UniqueName: \"kubernetes.io/projected/1e0a1ce6-0cec-47d3-9082-8630337ae033-kube-api-access-vbndp\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.352266 4723 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.352274 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0a1ce6-0cec-47d3-9082-8630337ae033-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.476257 4723 generic.go:334] "Generic (PLEG): container finished" podID="1e0a1ce6-0cec-47d3-9082-8630337ae033" containerID="de7d7145a898bf1cfc8de1683526a07a40b29f8a4b0ced80624f626a12d62e99" exitCode=0 Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.476310 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq" event={"ID":"1e0a1ce6-0cec-47d3-9082-8630337ae033","Type":"ContainerDied","Data":"de7d7145a898bf1cfc8de1683526a07a40b29f8a4b0ced80624f626a12d62e99"} Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.476334 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq" event={"ID":"1e0a1ce6-0cec-47d3-9082-8630337ae033","Type":"ContainerDied","Data":"d0bdf9451fa415b6f6e98e44097627cf8e16f9b2e08fa5b8274d975fc9438a9a"} Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.476352 4723 scope.go:117] "RemoveContainer" containerID="de7d7145a898bf1cfc8de1683526a07a40b29f8a4b0ced80624f626a12d62e99" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.476440 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774cb675cc-zxwjq" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.487868 4723 generic.go:334] "Generic (PLEG): container finished" podID="af3980f7-9337-48f4-84f1-c311b36f47b0" containerID="6f030229611deafd7b4128b36586774314009b265b8cfbfd08a9a3a216b7d85b" exitCode=0 Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.487905 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n" event={"ID":"af3980f7-9337-48f4-84f1-c311b36f47b0","Type":"ContainerDied","Data":"6f030229611deafd7b4128b36586774314009b265b8cfbfd08a9a3a216b7d85b"} Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.487936 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n" event={"ID":"af3980f7-9337-48f4-84f1-c311b36f47b0","Type":"ContainerDied","Data":"387771f3cfe35aaed1b9909f48de8a163ac3400dad17c1bd3c0761a0cb3a124b"} Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.488064 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.500678 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774cb675cc-zxwjq"] Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.500824 4723 scope.go:117] "RemoveContainer" containerID="de7d7145a898bf1cfc8de1683526a07a40b29f8a4b0ced80624f626a12d62e99" Mar 09 13:04:17 crc kubenswrapper[4723]: E0309 13:04:17.501487 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de7d7145a898bf1cfc8de1683526a07a40b29f8a4b0ced80624f626a12d62e99\": container with ID starting with de7d7145a898bf1cfc8de1683526a07a40b29f8a4b0ced80624f626a12d62e99 not found: ID does not exist" containerID="de7d7145a898bf1cfc8de1683526a07a40b29f8a4b0ced80624f626a12d62e99" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.501522 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de7d7145a898bf1cfc8de1683526a07a40b29f8a4b0ced80624f626a12d62e99"} err="failed to get container status \"de7d7145a898bf1cfc8de1683526a07a40b29f8a4b0ced80624f626a12d62e99\": rpc error: code = NotFound desc = could not find container \"de7d7145a898bf1cfc8de1683526a07a40b29f8a4b0ced80624f626a12d62e99\": container with ID starting with de7d7145a898bf1cfc8de1683526a07a40b29f8a4b0ced80624f626a12d62e99 not found: ID does not exist" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.501545 4723 scope.go:117] "RemoveContainer" containerID="6f030229611deafd7b4128b36586774314009b265b8cfbfd08a9a3a216b7d85b" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.507560 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-774cb675cc-zxwjq"] Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.526773 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dqh66"] Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.527070 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dqh66" podUID="b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" containerName="registry-server" containerID="cri-o://76be5bc3f51500f73609fb3b480c784c3e14eaad6c550a5c5e15ecb66497c7c6" gracePeriod=30 Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.529454 4723 scope.go:117] "RemoveContainer" containerID="6f030229611deafd7b4128b36586774314009b265b8cfbfd08a9a3a216b7d85b" Mar 09 13:04:17 crc kubenswrapper[4723]: E0309 13:04:17.530111 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f030229611deafd7b4128b36586774314009b265b8cfbfd08a9a3a216b7d85b\": container with ID starting with 6f030229611deafd7b4128b36586774314009b265b8cfbfd08a9a3a216b7d85b not found: ID does not exist" containerID="6f030229611deafd7b4128b36586774314009b265b8cfbfd08a9a3a216b7d85b" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.530181 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f030229611deafd7b4128b36586774314009b265b8cfbfd08a9a3a216b7d85b"} err="failed to get container status \"6f030229611deafd7b4128b36586774314009b265b8cfbfd08a9a3a216b7d85b\": rpc error: code = NotFound desc = could not find container 
\"6f030229611deafd7b4128b36586774314009b265b8cfbfd08a9a3a216b7d85b\": container with ID starting with 6f030229611deafd7b4128b36586774314009b265b8cfbfd08a9a3a216b7d85b not found: ID does not exist" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.539204 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"] Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.548227 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4x6zm"] Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.548622 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4x6zm" podUID="a7d103aa-232e-4705-a061-8ad7025339cf" containerName="registry-server" containerID="cri-o://3ad121701ecbed45714230b6796cc948294ac9e586bad1fc30461db0f68f2e2f" gracePeriod=30 Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.552993 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7757f9dd75-k996n"] Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.554555 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8pxb"] Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.554767 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" podUID="c6d61e80-043f-4ece-a6a6-eed6357749f5" containerName="marketplace-operator" containerID="cri-o://8c8e2b68a7e5d4c3484d0f64943ccbf1d2fb0a14d4c8d1f1ae6f774121072bc7" gracePeriod=30 Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.558792 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvf4v"] Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.561965 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pvf4v" podUID="8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" containerName="registry-server" containerID="cri-o://004448f6e17452bc7c76fca92db1a588abea82406b34f99452568f10d3ddf560" gracePeriod=30 Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.567637 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2lhjt"] Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.567961 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2lhjt" podUID="5adbe8b6-fabd-4e21-8507-84df16004837" containerName="registry-server" containerID="cri-o://900ce8517dbf500ee93275d9ddfc72f283c72c0678a51c4eb9ea89f798a07a04" gracePeriod=30 Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.576726 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7nz4v"] Mar 09 13:04:17 crc kubenswrapper[4723]: E0309 13:04:17.577091 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3980f7-9337-48f4-84f1-c311b36f47b0" containerName="route-controller-manager" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.577120 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3980f7-9337-48f4-84f1-c311b36f47b0" containerName="route-controller-manager" Mar 09 13:04:17 crc kubenswrapper[4723]: E0309 13:04:17.577147 4723 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1e0a1ce6-0cec-47d3-9082-8630337ae033" containerName="controller-manager" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.577160 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0a1ce6-0cec-47d3-9082-8630337ae033" containerName="controller-manager" Mar 09 13:04:17 crc kubenswrapper[4723]: E0309 13:04:17.577186 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c990608-7d03-402a-a042-9db3b406ca16" containerName="oc" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.577196 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c990608-7d03-402a-a042-9db3b406ca16" containerName="oc" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.577349 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e0a1ce6-0cec-47d3-9082-8630337ae033" containerName="controller-manager" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.577381 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c990608-7d03-402a-a042-9db3b406ca16" containerName="oc" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.577403 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3980f7-9337-48f4-84f1-c311b36f47b0" containerName="route-controller-manager" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.578013 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.580064 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7nz4v"] Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.623315 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8574658c57-bbjt8"] Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.624129 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.626649 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.626939 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.627235 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.627346 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.627448 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.627544 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.629293 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r"] Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.630023 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.634969 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.635016 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.635092 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.635266 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.635276 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.635444 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.636891 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8574658c57-bbjt8"] Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.638320 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.644112 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r"] Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.655996 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjqgd\" (UniqueName: \"kubernetes.io/projected/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-kube-api-access-zjqgd\") pod \"route-controller-manager-7cf876879c-jn49r\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.656038 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-serving-cert\") pod \"route-controller-manager-7cf876879c-jn49r\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.656072 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0c45ecd0-a916-4ef0-80aa-cfe88212d0ed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7nz4v\" (UID: \"0c45ecd0-a916-4ef0-80aa-cfe88212d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.656105 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c45ecd0-a916-4ef0-80aa-cfe88212d0ed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7nz4v\" 
(UID: \"0c45ecd0-a916-4ef0-80aa-cfe88212d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.656130 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-config\") pod \"route-controller-manager-7cf876879c-jn49r\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.656331 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-client-ca\") pod \"route-controller-manager-7cf876879c-jn49r\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.656414 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zctvr\" (UniqueName: \"kubernetes.io/projected/0c45ecd0-a916-4ef0-80aa-cfe88212d0ed-kube-api-access-zctvr\") pod \"marketplace-operator-79b997595-7nz4v\" (UID: \"0c45ecd0-a916-4ef0-80aa-cfe88212d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.757441 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0c45ecd0-a916-4ef0-80aa-cfe88212d0ed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7nz4v\" (UID: \"0c45ecd0-a916-4ef0-80aa-cfe88212d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.757515 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6be8b05e-d252-470d-bf25-5bd6102d08f7-serving-cert\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.757545 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c45ecd0-a916-4ef0-80aa-cfe88212d0ed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7nz4v\" (UID: \"0c45ecd0-a916-4ef0-80aa-cfe88212d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.757571 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-config\") pod \"route-controller-manager-7cf876879c-jn49r\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.757618 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-client-ca\") pod \"route-controller-manager-7cf876879c-jn49r\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " 
pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.757642 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-proxy-ca-bundles\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.757669 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjrsj\" (UniqueName: \"kubernetes.io/projected/6be8b05e-d252-470d-bf25-5bd6102d08f7-kube-api-access-vjrsj\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.757702 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zctvr\" (UniqueName: \"kubernetes.io/projected/0c45ecd0-a916-4ef0-80aa-cfe88212d0ed-kube-api-access-zctvr\") pod \"marketplace-operator-79b997595-7nz4v\" (UID: \"0c45ecd0-a916-4ef0-80aa-cfe88212d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.757737 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-client-ca\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.757754 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-config\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.757774 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjqgd\" (UniqueName: \"kubernetes.io/projected/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-kube-api-access-zjqgd\") pod \"route-controller-manager-7cf876879c-jn49r\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.757800 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-serving-cert\") pod \"route-controller-manager-7cf876879c-jn49r\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.758672 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-client-ca\") pod \"route-controller-manager-7cf876879c-jn49r\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " 
pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.759111 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-config\") pod \"route-controller-manager-7cf876879c-jn49r\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.760218 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c45ecd0-a916-4ef0-80aa-cfe88212d0ed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7nz4v\" (UID: \"0c45ecd0-a916-4ef0-80aa-cfe88212d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.761065 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-serving-cert\") pod \"route-controller-manager-7cf876879c-jn49r\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.770496 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0c45ecd0-a916-4ef0-80aa-cfe88212d0ed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7nz4v\" (UID: \"0c45ecd0-a916-4ef0-80aa-cfe88212d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.772450 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjqgd\" (UniqueName: \"kubernetes.io/projected/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-kube-api-access-zjqgd\") pod \"route-controller-manager-7cf876879c-jn49r\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.773267 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zctvr\" (UniqueName: \"kubernetes.io/projected/0c45ecd0-a916-4ef0-80aa-cfe88212d0ed-kube-api-access-zctvr\") pod \"marketplace-operator-79b997595-7nz4v\" (UID: \"0c45ecd0-a916-4ef0-80aa-cfe88212d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.858818 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-proxy-ca-bundles\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.858880 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjrsj\" (UniqueName: \"kubernetes.io/projected/6be8b05e-d252-470d-bf25-5bd6102d08f7-kube-api-access-vjrsj\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: 
I0309 13:04:17.858912 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-client-ca\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.858932 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-config\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.858982 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6be8b05e-d252-470d-bf25-5bd6102d08f7-serving-cert\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.860002 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-client-ca\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.860012 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-proxy-ca-bundles\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.860413 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-config\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.862059 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6be8b05e-d252-470d-bf25-5bd6102d08f7-serving-cert\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.874123 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjrsj\" (UniqueName: \"kubernetes.io/projected/6be8b05e-d252-470d-bf25-5bd6102d08f7-kube-api-access-vjrsj\") pod \"controller-manager-8574658c57-bbjt8\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.895079 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.938842 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:17 crc kubenswrapper[4723]: I0309 13:04:17.947465 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:18 crc kubenswrapper[4723]: E0309 13:04:18.162436 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 900ce8517dbf500ee93275d9ddfc72f283c72c0678a51c4eb9ea89f798a07a04 is running failed: container process not found" containerID="900ce8517dbf500ee93275d9ddfc72f283c72c0678a51c4eb9ea89f798a07a04" cmd=["grpc_health_probe","-addr=:50051"] Mar 09 13:04:18 crc kubenswrapper[4723]: E0309 13:04:18.162831 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 900ce8517dbf500ee93275d9ddfc72f283c72c0678a51c4eb9ea89f798a07a04 is running failed: container process not found" containerID="900ce8517dbf500ee93275d9ddfc72f283c72c0678a51c4eb9ea89f798a07a04" cmd=["grpc_health_probe","-addr=:50051"] Mar 09 13:04:18 crc kubenswrapper[4723]: E0309 13:04:18.163130 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 900ce8517dbf500ee93275d9ddfc72f283c72c0678a51c4eb9ea89f798a07a04 is running failed: container process not found" containerID="900ce8517dbf500ee93275d9ddfc72f283c72c0678a51c4eb9ea89f798a07a04" cmd=["grpc_health_probe","-addr=:50051"] Mar 09 13:04:18 crc kubenswrapper[4723]: E0309 13:04:18.163162 4723 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 900ce8517dbf500ee93275d9ddfc72f283c72c0678a51c4eb9ea89f798a07a04 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-2lhjt" podUID="5adbe8b6-fabd-4e21-8507-84df16004837" containerName="registry-server" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.330018 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7nz4v"] Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.429644 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8574658c57-bbjt8"] Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.448165 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r"] Mar 09 13:04:18 crc kubenswrapper[4723]: W0309 13:04:18.463090 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6be8b05e_d252_470d_bf25_5bd6102d08f7.slice/crio-33a21e1e9e773393661594d23b24dc59b4318e2cde821b192b1ba9b76ab97920 WatchSource:0}: Error finding container 33a21e1e9e773393661594d23b24dc59b4318e2cde821b192b1ba9b76ab97920: Status 404 returned error can't find the container with id 33a21e1e9e773393661594d23b24dc59b4318e2cde821b192b1ba9b76ab97920 Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.471765 4723 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqh66" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.520520 4723 generic.go:334] "Generic (PLEG): container finished" podID="8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" containerID="004448f6e17452bc7c76fca92db1a588abea82406b34f99452568f10d3ddf560" exitCode=0 Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.520624 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvf4v" event={"ID":"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2","Type":"ContainerDied","Data":"004448f6e17452bc7c76fca92db1a588abea82406b34f99452568f10d3ddf560"} Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.523968 4723 generic.go:334] "Generic (PLEG): container finished" podID="b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" containerID="76be5bc3f51500f73609fb3b480c784c3e14eaad6c550a5c5e15ecb66497c7c6" exitCode=0 Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.524022 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqh66" event={"ID":"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709","Type":"ContainerDied","Data":"76be5bc3f51500f73609fb3b480c784c3e14eaad6c550a5c5e15ecb66497c7c6"} Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.524044 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqh66" event={"ID":"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709","Type":"ContainerDied","Data":"24094a17a6fbfe474fc5d590964344bc77b407f77e662c38c3fe583ec34d7876"} Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.524044 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqh66" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.524080 4723 scope.go:117] "RemoveContainer" containerID="76be5bc3f51500f73609fb3b480c784c3e14eaad6c550a5c5e15ecb66497c7c6" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.526088 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" event={"ID":"6be8b05e-d252-470d-bf25-5bd6102d08f7","Type":"ContainerStarted","Data":"33a21e1e9e773393661594d23b24dc59b4318e2cde821b192b1ba9b76ab97920"} Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.527903 4723 generic.go:334] "Generic (PLEG): container finished" podID="c6d61e80-043f-4ece-a6a6-eed6357749f5" containerID="8c8e2b68a7e5d4c3484d0f64943ccbf1d2fb0a14d4c8d1f1ae6f774121072bc7" exitCode=0 Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.527962 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" event={"ID":"c6d61e80-043f-4ece-a6a6-eed6357749f5","Type":"ContainerDied","Data":"8c8e2b68a7e5d4c3484d0f64943ccbf1d2fb0a14d4c8d1f1ae6f774121072bc7"} Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.529534 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" event={"ID":"ede99d46-5d03-42b4-bb3f-76a952e4e9e1","Type":"ContainerStarted","Data":"a0623ff2a79e14297c47673d7d272cc3fc0ba362611714e5338bec158e235860"} Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.531931 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" 
event={"ID":"0c45ecd0-a916-4ef0-80aa-cfe88212d0ed","Type":"ContainerStarted","Data":"7e80ec29523bbb99b4b53914f2d432a14006eaf4a7af82f43c0e5d5a056b3654"} Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.532083 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.534806 4723 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7nz4v container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.70:8080/healthz\": dial tcp 10.217.0.70:8080: connect: connection refused" start-of-body= Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.534952 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" podUID="0c45ecd0-a916-4ef0-80aa-cfe88212d0ed" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.70:8080/healthz\": dial tcp 10.217.0.70:8080: connect: connection refused" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.536940 4723 generic.go:334] "Generic (PLEG): container finished" podID="a7d103aa-232e-4705-a061-8ad7025339cf" containerID="3ad121701ecbed45714230b6796cc948294ac9e586bad1fc30461db0f68f2e2f" exitCode=0 Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.536992 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4x6zm" event={"ID":"a7d103aa-232e-4705-a061-8ad7025339cf","Type":"ContainerDied","Data":"3ad121701ecbed45714230b6796cc948294ac9e586bad1fc30461db0f68f2e2f"} Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.539091 4723 generic.go:334] "Generic (PLEG): container finished" podID="5adbe8b6-fabd-4e21-8507-84df16004837" containerID="900ce8517dbf500ee93275d9ddfc72f283c72c0678a51c4eb9ea89f798a07a04" exitCode=0 Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.539117 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lhjt" event={"ID":"5adbe8b6-fabd-4e21-8507-84df16004837","Type":"ContainerDied","Data":"900ce8517dbf500ee93275d9ddfc72f283c72c0678a51c4eb9ea89f798a07a04"} Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.549506 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" podStartSLOduration=1.549487624 podStartE2EDuration="1.549487624s" podCreationTimestamp="2026-03-09 13:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:04:18.546340321 +0000 UTC m=+332.560807861" watchObservedRunningTime="2026-03-09 13:04:18.549487624 +0000 UTC m=+332.563955164" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.578165 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-utilities\") pod \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\" (UID: \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\") " Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.578230 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-catalog-content\") pod \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\" (UID: \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\") " Mar 09 13:04:18 
crc kubenswrapper[4723]: I0309 13:04:18.578310 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgmzt\" (UniqueName: \"kubernetes.io/projected/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-kube-api-access-rgmzt\") pod \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\" (UID: \"b5cafa5f-a4bc-4029-b136-ba9b3e2b6709\") " Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.579378 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-utilities" (OuterVolumeSpecName: "utilities") pod "b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" (UID: "b5cafa5f-a4bc-4029-b136-ba9b3e2b6709"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.585551 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-kube-api-access-rgmzt" (OuterVolumeSpecName: "kube-api-access-rgmzt") pod "b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" (UID: "b5cafa5f-a4bc-4029-b136-ba9b3e2b6709"). InnerVolumeSpecName "kube-api-access-rgmzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.635978 4723 scope.go:117] "RemoveContainer" containerID="d9976a961885dc5909fdd0cee988c86676cb86d132777f7ab1c4745afa436e1d" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.644532 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" (UID: "b5cafa5f-a4bc-4029-b136-ba9b3e2b6709"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.654464 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvf4v" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.658797 4723 scope.go:117] "RemoveContainer" containerID="263b436902a04df4cb817bfe6083969a396c37ce40294eacd3745a9fc76d9942" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.672243 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.678172 4723 scope.go:117] "RemoveContainer" containerID="76be5bc3f51500f73609fb3b480c784c3e14eaad6c550a5c5e15ecb66497c7c6" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.679457 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgmzt\" (UniqueName: \"kubernetes.io/projected/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-kube-api-access-rgmzt\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.679515 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.679530 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:18 crc kubenswrapper[4723]: E0309 13:04:18.680726 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76be5bc3f51500f73609fb3b480c784c3e14eaad6c550a5c5e15ecb66497c7c6\": container with ID starting with 76be5bc3f51500f73609fb3b480c784c3e14eaad6c550a5c5e15ecb66497c7c6 not found: ID does not exist" containerID="76be5bc3f51500f73609fb3b480c784c3e14eaad6c550a5c5e15ecb66497c7c6" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.680763 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76be5bc3f51500f73609fb3b480c784c3e14eaad6c550a5c5e15ecb66497c7c6"} err="failed to get container status \"76be5bc3f51500f73609fb3b480c784c3e14eaad6c550a5c5e15ecb66497c7c6\": rpc error: code = NotFound desc = could not find container \"76be5bc3f51500f73609fb3b480c784c3e14eaad6c550a5c5e15ecb66497c7c6\": container with ID starting with 76be5bc3f51500f73609fb3b480c784c3e14eaad6c550a5c5e15ecb66497c7c6 not found: ID does not exist" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.680792 4723 scope.go:117] "RemoveContainer" containerID="d9976a961885dc5909fdd0cee988c86676cb86d132777f7ab1c4745afa436e1d" Mar 09 13:04:18 crc kubenswrapper[4723]: E0309 13:04:18.682548 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9976a961885dc5909fdd0cee988c86676cb86d132777f7ab1c4745afa436e1d\": container with ID starting with d9976a961885dc5909fdd0cee988c86676cb86d132777f7ab1c4745afa436e1d not found: ID does not exist" containerID="d9976a961885dc5909fdd0cee988c86676cb86d132777f7ab1c4745afa436e1d" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.682577 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9976a961885dc5909fdd0cee988c86676cb86d132777f7ab1c4745afa436e1d"} err="failed to get container status \"d9976a961885dc5909fdd0cee988c86676cb86d132777f7ab1c4745afa436e1d\": rpc error: code = NotFound desc = could not find container \"d9976a961885dc5909fdd0cee988c86676cb86d132777f7ab1c4745afa436e1d\": container with ID starting with d9976a961885dc5909fdd0cee988c86676cb86d132777f7ab1c4745afa436e1d not found: ID does not exist" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.682598 4723 scope.go:117] "RemoveContainer" containerID="263b436902a04df4cb817bfe6083969a396c37ce40294eacd3745a9fc76d9942" 
Mar 09 13:04:18 crc kubenswrapper[4723]: E0309 13:04:18.682960 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"263b436902a04df4cb817bfe6083969a396c37ce40294eacd3745a9fc76d9942\": container with ID starting with 263b436902a04df4cb817bfe6083969a396c37ce40294eacd3745a9fc76d9942 not found: ID does not exist" containerID="263b436902a04df4cb817bfe6083969a396c37ce40294eacd3745a9fc76d9942" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.682993 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"263b436902a04df4cb817bfe6083969a396c37ce40294eacd3745a9fc76d9942"} err="failed to get container status \"263b436902a04df4cb817bfe6083969a396c37ce40294eacd3745a9fc76d9942\": rpc error: code = NotFound desc = could not find container \"263b436902a04df4cb817bfe6083969a396c37ce40294eacd3745a9fc76d9942\": container with ID starting with 263b436902a04df4cb817bfe6083969a396c37ce40294eacd3745a9fc76d9942 not found: ID does not exist" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.705706 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lhjt" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.712277 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4x6zm" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.780424 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6d61e80-043f-4ece-a6a6-eed6357749f5-marketplace-operator-metrics\") pod \"c6d61e80-043f-4ece-a6a6-eed6357749f5\" (UID: \"c6d61e80-043f-4ece-a6a6-eed6357749f5\") " Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.780488 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjjls\" (UniqueName: \"kubernetes.io/projected/c6d61e80-043f-4ece-a6a6-eed6357749f5-kube-api-access-qjjls\") pod \"c6d61e80-043f-4ece-a6a6-eed6357749f5\" (UID: \"c6d61e80-043f-4ece-a6a6-eed6357749f5\") " Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.780524 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxs2b\" (UniqueName: \"kubernetes.io/projected/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-kube-api-access-cxs2b\") pod \"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\" (UID: \"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\") " Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.780563 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-utilities\") pod \"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\" (UID: \"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\") " Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.780597 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6d61e80-043f-4ece-a6a6-eed6357749f5-marketplace-trusted-ca\") pod \"c6d61e80-043f-4ece-a6a6-eed6357749f5\" (UID: \"c6d61e80-043f-4ece-a6a6-eed6357749f5\") " Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.780651 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-catalog-content\") pod 
\"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\" (UID: \"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2\") " Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.781506 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-utilities" (OuterVolumeSpecName: "utilities") pod "8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" (UID: "8ebe9a64-25f7-4d32-bdce-3a3942ba53a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.781666 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6d61e80-043f-4ece-a6a6-eed6357749f5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c6d61e80-043f-4ece-a6a6-eed6357749f5" (UID: "c6d61e80-043f-4ece-a6a6-eed6357749f5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.783953 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6d61e80-043f-4ece-a6a6-eed6357749f5-kube-api-access-qjjls" (OuterVolumeSpecName: "kube-api-access-qjjls") pod "c6d61e80-043f-4ece-a6a6-eed6357749f5" (UID: "c6d61e80-043f-4ece-a6a6-eed6357749f5"). InnerVolumeSpecName "kube-api-access-qjjls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.783962 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6d61e80-043f-4ece-a6a6-eed6357749f5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c6d61e80-043f-4ece-a6a6-eed6357749f5" (UID: "c6d61e80-043f-4ece-a6a6-eed6357749f5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.784461 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-kube-api-access-cxs2b" (OuterVolumeSpecName: "kube-api-access-cxs2b") pod "8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" (UID: "8ebe9a64-25f7-4d32-bdce-3a3942ba53a2"). InnerVolumeSpecName "kube-api-access-cxs2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.825893 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" (UID: "8ebe9a64-25f7-4d32-bdce-3a3942ba53a2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.850049 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dqh66"] Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.857879 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dqh66"] Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.881818 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d103aa-232e-4705-a061-8ad7025339cf-utilities\") pod \"a7d103aa-232e-4705-a061-8ad7025339cf\" (UID: \"a7d103aa-232e-4705-a061-8ad7025339cf\") " Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.881885 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5adbe8b6-fabd-4e21-8507-84df16004837-utilities\") pod \"5adbe8b6-fabd-4e21-8507-84df16004837\" (UID: \"5adbe8b6-fabd-4e21-8507-84df16004837\") " Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.881925 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5bxr\" (UniqueName: \"kubernetes.io/projected/5adbe8b6-fabd-4e21-8507-84df16004837-kube-api-access-m5bxr\") pod \"5adbe8b6-fabd-4e21-8507-84df16004837\" (UID: \"5adbe8b6-fabd-4e21-8507-84df16004837\") " Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.882005 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv9fj\" (UniqueName: \"kubernetes.io/projected/a7d103aa-232e-4705-a061-8ad7025339cf-kube-api-access-vv9fj\") pod \"a7d103aa-232e-4705-a061-8ad7025339cf\" (UID: \"a7d103aa-232e-4705-a061-8ad7025339cf\") " Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.882051 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d103aa-232e-4705-a061-8ad7025339cf-catalog-content\") pod \"a7d103aa-232e-4705-a061-8ad7025339cf\" (UID: \"a7d103aa-232e-4705-a061-8ad7025339cf\") " Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.882083 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5adbe8b6-fabd-4e21-8507-84df16004837-catalog-content\") pod \"5adbe8b6-fabd-4e21-8507-84df16004837\" (UID: \"5adbe8b6-fabd-4e21-8507-84df16004837\") " Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.882312 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjjls\" (UniqueName: \"kubernetes.io/projected/c6d61e80-043f-4ece-a6a6-eed6357749f5-kube-api-access-qjjls\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.882330 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxs2b\" (UniqueName: \"kubernetes.io/projected/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-kube-api-access-cxs2b\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.882342 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.882354 4723 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/c6d61e80-043f-4ece-a6a6-eed6357749f5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.882366 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.882378 4723 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6d61e80-043f-4ece-a6a6-eed6357749f5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.885152 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d103aa-232e-4705-a061-8ad7025339cf-kube-api-access-vv9fj" (OuterVolumeSpecName: "kube-api-access-vv9fj") pod "a7d103aa-232e-4705-a061-8ad7025339cf" (UID: "a7d103aa-232e-4705-a061-8ad7025339cf"). InnerVolumeSpecName "kube-api-access-vv9fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.885481 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5adbe8b6-fabd-4e21-8507-84df16004837-kube-api-access-m5bxr" (OuterVolumeSpecName: "kube-api-access-m5bxr") pod "5adbe8b6-fabd-4e21-8507-84df16004837" (UID: "5adbe8b6-fabd-4e21-8507-84df16004837"). InnerVolumeSpecName "kube-api-access-m5bxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.886338 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d103aa-232e-4705-a061-8ad7025339cf-utilities" (OuterVolumeSpecName: "utilities") pod "a7d103aa-232e-4705-a061-8ad7025339cf" (UID: "a7d103aa-232e-4705-a061-8ad7025339cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.887026 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5adbe8b6-fabd-4e21-8507-84df16004837-utilities" (OuterVolumeSpecName: "utilities") pod "5adbe8b6-fabd-4e21-8507-84df16004837" (UID: "5adbe8b6-fabd-4e21-8507-84df16004837"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.889786 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e0a1ce6-0cec-47d3-9082-8630337ae033" path="/var/lib/kubelet/pods/1e0a1ce6-0cec-47d3-9082-8630337ae033/volumes" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.890654 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3980f7-9337-48f4-84f1-c311b36f47b0" path="/var/lib/kubelet/pods/af3980f7-9337-48f4-84f1-c311b36f47b0/volumes" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.891301 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" path="/var/lib/kubelet/pods/b5cafa5f-a4bc-4029-b136-ba9b3e2b6709/volumes" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.941226 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d103aa-232e-4705-a061-8ad7025339cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7d103aa-232e-4705-a061-8ad7025339cf" (UID: "a7d103aa-232e-4705-a061-8ad7025339cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.983496 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d103aa-232e-4705-a061-8ad7025339cf-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.983537 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5adbe8b6-fabd-4e21-8507-84df16004837-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.983552 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5bxr\" (UniqueName: \"kubernetes.io/projected/5adbe8b6-fabd-4e21-8507-84df16004837-kube-api-access-m5bxr\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.983570 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv9fj\" (UniqueName: \"kubernetes.io/projected/a7d103aa-232e-4705-a061-8ad7025339cf-kube-api-access-vv9fj\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:18 crc kubenswrapper[4723]: I0309 13:04:18.983581 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d103aa-232e-4705-a061-8ad7025339cf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.057117 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5adbe8b6-fabd-4e21-8507-84df16004837-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5adbe8b6-fabd-4e21-8507-84df16004837" (UID: "5adbe8b6-fabd-4e21-8507-84df16004837"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.084776 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5adbe8b6-fabd-4e21-8507-84df16004837-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.549903 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" event={"ID":"c6d61e80-043f-4ece-a6a6-eed6357749f5","Type":"ContainerDied","Data":"9eeba3cd17532de8486e559017adf9d4e639ac9458c1a69ee6498e95be9e3265"} Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.550281 4723 scope.go:117] "RemoveContainer" containerID="8c8e2b68a7e5d4c3484d0f64943ccbf1d2fb0a14d4c8d1f1ae6f774121072bc7" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.549964 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8pxb" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.566024 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4x6zm" event={"ID":"a7d103aa-232e-4705-a061-8ad7025339cf","Type":"ContainerDied","Data":"267d32d38f4ac36ab26ff8a9fd90384b80a1153d16dc73fb0012983d27013595"} Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.566102 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4x6zm" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.572411 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2lhjt" event={"ID":"5adbe8b6-fabd-4e21-8507-84df16004837","Type":"ContainerDied","Data":"e65e0c5d60346ad68a3042a270ed19b462677a140d0fb88df19a328f6f77e82b"} Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.572553 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2lhjt" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.577171 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvf4v" event={"ID":"8ebe9a64-25f7-4d32-bdce-3a3942ba53a2","Type":"ContainerDied","Data":"6f1271a1ca8ffe2a3c61f046dc0babc22d8c025eb24407ba32a635aefeaf5fb0"} Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.577286 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvf4v" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.580990 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" event={"ID":"ede99d46-5d03-42b4-bb3f-76a952e4e9e1","Type":"ContainerStarted","Data":"3b57386b307fe14f2c250d7291c8986af7055c39429cea230334f2ec4a055b79"} Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.581558 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.582527 4723 scope.go:117] "RemoveContainer" containerID="3ad121701ecbed45714230b6796cc948294ac9e586bad1fc30461db0f68f2e2f" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.582700 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8pxb"] Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.583116 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" event={"ID":"0c45ecd0-a916-4ef0-80aa-cfe88212d0ed","Type":"ContainerStarted","Data":"001831c1b798665f067152d3c22a2112fdfbec75c7ae05447ad45ae8608feac3"} Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.586773 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" event={"ID":"6be8b05e-d252-470d-bf25-5bd6102d08f7","Type":"ContainerStarted","Data":"c0817dc9dd5a71dd6b26ae2f68f87d0dc40fbdcfbdc789ec8a46fcaa79e9c548"} Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.587145 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.588906 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.589178 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.592263 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.593338 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8pxb"] Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.604358 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" podStartSLOduration=3.6043239 podStartE2EDuration="3.6043239s" podCreationTimestamp="2026-03-09 13:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:04:19.596638627 +0000 UTC m=+333.611106177" watchObservedRunningTime="2026-03-09 13:04:19.6043239 +0000 UTC m=+333.618791430" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.619952 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" podStartSLOduration=3.619905832 
podStartE2EDuration="3.619905832s" podCreationTimestamp="2026-03-09 13:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:04:19.612757573 +0000 UTC m=+333.627225143" watchObservedRunningTime="2026-03-09 13:04:19.619905832 +0000 UTC m=+333.634373382" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.620027 4723 scope.go:117] "RemoveContainer" containerID="2be94a9168f5909d8cb3c4b00a1c68eeb14f3d718787ba16d310e8bb0580616b" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.661803 4723 scope.go:117] "RemoveContainer" containerID="1e9fbc13b683916412c35d60c517f1a2ec6392cc8226e266a1ca4fcb24256272" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.677455 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4x6zm"] Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.680992 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4x6zm"] Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.681421 4723 scope.go:117] "RemoveContainer" containerID="900ce8517dbf500ee93275d9ddfc72f283c72c0678a51c4eb9ea89f798a07a04" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.682975 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvf4v"] Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.687125 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvf4v"] Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.690278 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2lhjt"] Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.692374 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2lhjt"] Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.698130 4723 scope.go:117] "RemoveContainer" containerID="6264dfe2e8203b6bfc4f87d5b394f014d1e1e13f2094461c63e031cdad53f4c4" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.716121 4723 scope.go:117] "RemoveContainer" containerID="aad7f6d12e19f3f187c33be6859a22310f784401b9f27bf5cf65a192902f32cb" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.742392 4723 scope.go:117] "RemoveContainer" containerID="004448f6e17452bc7c76fca92db1a588abea82406b34f99452568f10d3ddf560" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.754357 4723 scope.go:117] "RemoveContainer" containerID="cafa6a34afdf28f7f477cb3f7691cbaec13491cafa840531f220a48448d63fff" Mar 09 13:04:19 crc kubenswrapper[4723]: I0309 13:04:19.764778 4723 scope.go:117] "RemoveContainer" containerID="0919fe583c9c158b8aef61c78369eca1ffe1b146fd4349657fbe61e5955df06a" Mar 09 13:04:20 crc kubenswrapper[4723]: I0309 13:04:20.887655 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5adbe8b6-fabd-4e21-8507-84df16004837" path="/var/lib/kubelet/pods/5adbe8b6-fabd-4e21-8507-84df16004837/volumes" Mar 09 13:04:20 crc kubenswrapper[4723]: I0309 13:04:20.888301 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" path="/var/lib/kubelet/pods/8ebe9a64-25f7-4d32-bdce-3a3942ba53a2/volumes" Mar 09 13:04:20 crc kubenswrapper[4723]: I0309 13:04:20.889282 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d103aa-232e-4705-a061-8ad7025339cf" 
path="/var/lib/kubelet/pods/a7d103aa-232e-4705-a061-8ad7025339cf/volumes" Mar 09 13:04:20 crc kubenswrapper[4723]: I0309 13:04:20.890748 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6d61e80-043f-4ece-a6a6-eed6357749f5" path="/var/lib/kubelet/pods/c6d61e80-043f-4ece-a6a6-eed6357749f5/volumes" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.152467 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r9xzz"] Mar 09 13:04:36 crc kubenswrapper[4723]: E0309 13:04:36.153017 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d103aa-232e-4705-a061-8ad7025339cf" containerName="extract-content" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153033 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d103aa-232e-4705-a061-8ad7025339cf" containerName="extract-content" Mar 09 13:04:36 crc kubenswrapper[4723]: E0309 13:04:36.153049 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d103aa-232e-4705-a061-8ad7025339cf" containerName="extract-utilities" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153058 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d103aa-232e-4705-a061-8ad7025339cf" containerName="extract-utilities" Mar 09 13:04:36 crc kubenswrapper[4723]: E0309 13:04:36.153066 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5adbe8b6-fabd-4e21-8507-84df16004837" containerName="extract-content" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153074 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="5adbe8b6-fabd-4e21-8507-84df16004837" containerName="extract-content" Mar 09 13:04:36 crc kubenswrapper[4723]: E0309 13:04:36.153086 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d103aa-232e-4705-a061-8ad7025339cf" containerName="registry-server" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153093 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d103aa-232e-4705-a061-8ad7025339cf" containerName="registry-server" Mar 09 13:04:36 crc kubenswrapper[4723]: E0309 13:04:36.153102 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" containerName="extract-content" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153110 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" containerName="extract-content" Mar 09 13:04:36 crc kubenswrapper[4723]: E0309 13:04:36.153124 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" containerName="extract-utilities" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153132 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" containerName="extract-utilities" Mar 09 13:04:36 crc kubenswrapper[4723]: E0309 13:04:36.153142 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5adbe8b6-fabd-4e21-8507-84df16004837" containerName="extract-utilities" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153149 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="5adbe8b6-fabd-4e21-8507-84df16004837" containerName="extract-utilities" Mar 09 13:04:36 crc kubenswrapper[4723]: E0309 13:04:36.153160 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" containerName="registry-server" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153169 4723 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" containerName="registry-server" Mar 09 13:04:36 crc kubenswrapper[4723]: E0309 13:04:36.153183 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" containerName="extract-utilities" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153191 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" containerName="extract-utilities" Mar 09 13:04:36 crc kubenswrapper[4723]: E0309 13:04:36.153201 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" containerName="registry-server" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153208 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" containerName="registry-server" Mar 09 13:04:36 crc kubenswrapper[4723]: E0309 13:04:36.153220 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" containerName="extract-content" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153228 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" containerName="extract-content" Mar 09 13:04:36 crc kubenswrapper[4723]: E0309 13:04:36.153238 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5adbe8b6-fabd-4e21-8507-84df16004837" containerName="registry-server" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153246 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="5adbe8b6-fabd-4e21-8507-84df16004837" containerName="registry-server" Mar 09 13:04:36 crc kubenswrapper[4723]: E0309 13:04:36.153258 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6d61e80-043f-4ece-a6a6-eed6357749f5" containerName="marketplace-operator" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153265 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6d61e80-043f-4ece-a6a6-eed6357749f5" containerName="marketplace-operator" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153385 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6d61e80-043f-4ece-a6a6-eed6357749f5" containerName="marketplace-operator" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153397 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="5adbe8b6-fabd-4e21-8507-84df16004837" containerName="registry-server" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153406 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ebe9a64-25f7-4d32-bdce-3a3942ba53a2" containerName="registry-server" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153421 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5cafa5f-a4bc-4029-b136-ba9b3e2b6709" containerName="registry-server" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.153435 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d103aa-232e-4705-a061-8ad7025339cf" containerName="registry-server" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.154246 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.156675 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.169561 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r9xzz"] Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.317200 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d85zx\" (UniqueName: \"kubernetes.io/projected/0b462749-ee4f-4661-8a3a-06e721ef51a8-kube-api-access-d85zx\") pod \"redhat-operators-r9xzz\" (UID: \"0b462749-ee4f-4661-8a3a-06e721ef51a8\") " pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.317335 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b462749-ee4f-4661-8a3a-06e721ef51a8-catalog-content\") pod \"redhat-operators-r9xzz\" (UID: \"0b462749-ee4f-4661-8a3a-06e721ef51a8\") " pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.317396 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b462749-ee4f-4661-8a3a-06e721ef51a8-utilities\") pod \"redhat-operators-r9xzz\" (UID: \"0b462749-ee4f-4661-8a3a-06e721ef51a8\") " pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.352331 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vwn46"] Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.353283 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.359680 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.380293 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwn46"] Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.418793 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d85zx\" (UniqueName: \"kubernetes.io/projected/0b462749-ee4f-4661-8a3a-06e721ef51a8-kube-api-access-d85zx\") pod \"redhat-operators-r9xzz\" (UID: \"0b462749-ee4f-4661-8a3a-06e721ef51a8\") " pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.418924 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b462749-ee4f-4661-8a3a-06e721ef51a8-catalog-content\") pod \"redhat-operators-r9xzz\" (UID: \"0b462749-ee4f-4661-8a3a-06e721ef51a8\") " pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.418960 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b462749-ee4f-4661-8a3a-06e721ef51a8-utilities\") pod \"redhat-operators-r9xzz\" (UID: \"0b462749-ee4f-4661-8a3a-06e721ef51a8\") " pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.419940 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b462749-ee4f-4661-8a3a-06e721ef51a8-catalog-content\") pod \"redhat-operators-r9xzz\" (UID: \"0b462749-ee4f-4661-8a3a-06e721ef51a8\") " pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.420209 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b462749-ee4f-4661-8a3a-06e721ef51a8-utilities\") pod \"redhat-operators-r9xzz\" (UID: \"0b462749-ee4f-4661-8a3a-06e721ef51a8\") " pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.440184 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d85zx\" (UniqueName: \"kubernetes.io/projected/0b462749-ee4f-4661-8a3a-06e721ef51a8-kube-api-access-d85zx\") pod \"redhat-operators-r9xzz\" (UID: \"0b462749-ee4f-4661-8a3a-06e721ef51a8\") " pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.471465 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.520386 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcq48\" (UniqueName: \"kubernetes.io/projected/98195455-05c0-408c-b3e2-728b991eee12-kube-api-access-mcq48\") pod \"redhat-marketplace-vwn46\" (UID: \"98195455-05c0-408c-b3e2-728b991eee12\") " pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.520486 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98195455-05c0-408c-b3e2-728b991eee12-utilities\") pod \"redhat-marketplace-vwn46\" (UID: \"98195455-05c0-408c-b3e2-728b991eee12\") " pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.520541 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98195455-05c0-408c-b3e2-728b991eee12-catalog-content\") pod \"redhat-marketplace-vwn46\" (UID: \"98195455-05c0-408c-b3e2-728b991eee12\") " pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.621768 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcq48\" (UniqueName: \"kubernetes.io/projected/98195455-05c0-408c-b3e2-728b991eee12-kube-api-access-mcq48\") pod \"redhat-marketplace-vwn46\" (UID: \"98195455-05c0-408c-b3e2-728b991eee12\") " pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.621905 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98195455-05c0-408c-b3e2-728b991eee12-utilities\") pod \"redhat-marketplace-vwn46\" (UID: \"98195455-05c0-408c-b3e2-728b991eee12\") " pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.621963 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98195455-05c0-408c-b3e2-728b991eee12-catalog-content\") pod \"redhat-marketplace-vwn46\" (UID: \"98195455-05c0-408c-b3e2-728b991eee12\") " pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.622540 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98195455-05c0-408c-b3e2-728b991eee12-catalog-content\") pod \"redhat-marketplace-vwn46\" (UID: \"98195455-05c0-408c-b3e2-728b991eee12\") " pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.622798 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98195455-05c0-408c-b3e2-728b991eee12-utilities\") pod \"redhat-marketplace-vwn46\" (UID: \"98195455-05c0-408c-b3e2-728b991eee12\") " pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.655964 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcq48\" (UniqueName: \"kubernetes.io/projected/98195455-05c0-408c-b3e2-728b991eee12-kube-api-access-mcq48\") pod 
\"redhat-marketplace-vwn46\" (UID: \"98195455-05c0-408c-b3e2-728b991eee12\") " pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.683813 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:36 crc kubenswrapper[4723]: I0309 13:04:36.849723 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r9xzz"] Mar 09 13:04:36 crc kubenswrapper[4723]: W0309 13:04:36.852528 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b462749_ee4f_4661_8a3a_06e721ef51a8.slice/crio-9eb9ab158119ea9cd39c665002ee0b3683b5e96d3392a10ab3746897b2b16550 WatchSource:0}: Error finding container 9eb9ab158119ea9cd39c665002ee0b3683b5e96d3392a10ab3746897b2b16550: Status 404 returned error can't find the container with id 9eb9ab158119ea9cd39c665002ee0b3683b5e96d3392a10ab3746897b2b16550 Mar 09 13:04:37 crc kubenswrapper[4723]: I0309 13:04:37.129622 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwn46"] Mar 09 13:04:37 crc kubenswrapper[4723]: W0309 13:04:37.134101 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98195455_05c0_408c_b3e2_728b991eee12.slice/crio-0bb50f94a2a258775cc2dac5b4002ecc3dfc3c586b2c2f1b1bf1f091e8becafe WatchSource:0}: Error finding container 0bb50f94a2a258775cc2dac5b4002ecc3dfc3c586b2c2f1b1bf1f091e8becafe: Status 404 returned error can't find the container with id 0bb50f94a2a258775cc2dac5b4002ecc3dfc3c586b2c2f1b1bf1f091e8becafe Mar 09 13:04:37 crc kubenswrapper[4723]: I0309 13:04:37.700563 4723 generic.go:334] "Generic (PLEG): container finished" podID="0b462749-ee4f-4661-8a3a-06e721ef51a8" containerID="f6bac3256d1d2258b8d8f7670a4077d3b9749b049009fe69a83d5d6a8dfa0c46" exitCode=0 Mar 09 13:04:37 crc kubenswrapper[4723]: I0309 13:04:37.700616 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9xzz" event={"ID":"0b462749-ee4f-4661-8a3a-06e721ef51a8","Type":"ContainerDied","Data":"f6bac3256d1d2258b8d8f7670a4077d3b9749b049009fe69a83d5d6a8dfa0c46"} Mar 09 13:04:37 crc kubenswrapper[4723]: I0309 13:04:37.700851 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9xzz" event={"ID":"0b462749-ee4f-4661-8a3a-06e721ef51a8","Type":"ContainerStarted","Data":"9eb9ab158119ea9cd39c665002ee0b3683b5e96d3392a10ab3746897b2b16550"} Mar 09 13:04:37 crc kubenswrapper[4723]: I0309 13:04:37.703347 4723 generic.go:334] "Generic (PLEG): container finished" podID="98195455-05c0-408c-b3e2-728b991eee12" containerID="a3e7bb4bfb32dcb4f51bcc85d27cb8b8ee0ddb5ba0e5eacce4d0136ae86538dc" exitCode=0 Mar 09 13:04:37 crc kubenswrapper[4723]: I0309 13:04:37.703380 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwn46" event={"ID":"98195455-05c0-408c-b3e2-728b991eee12","Type":"ContainerDied","Data":"a3e7bb4bfb32dcb4f51bcc85d27cb8b8ee0ddb5ba0e5eacce4d0136ae86538dc"} Mar 09 13:04:37 crc kubenswrapper[4723]: I0309 13:04:37.703400 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwn46" event={"ID":"98195455-05c0-408c-b3e2-728b991eee12","Type":"ContainerStarted","Data":"0bb50f94a2a258775cc2dac5b4002ecc3dfc3c586b2c2f1b1bf1f091e8becafe"} Mar 09 13:04:38 
crc kubenswrapper[4723]: I0309 13:04:38.552985 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wn4hg"] Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.553948 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.555802 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.562934 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wn4hg"] Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.710062 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9xzz" event={"ID":"0b462749-ee4f-4661-8a3a-06e721ef51a8","Type":"ContainerStarted","Data":"0636cc64c48395648daf951b3c3919fc570b6bb3e17c8a5dd064fdf56dac13ff"} Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.712255 4723 generic.go:334] "Generic (PLEG): container finished" podID="98195455-05c0-408c-b3e2-728b991eee12" containerID="f1ac67e249f857396d983f92bb3f7514d31a3606a844dd58b7b5398e051e5a60" exitCode=0 Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.712286 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwn46" event={"ID":"98195455-05c0-408c-b3e2-728b991eee12","Type":"ContainerDied","Data":"f1ac67e249f857396d983f92bb3f7514d31a3606a844dd58b7b5398e051e5a60"} Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.749946 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/901259b6-1c9d-49ca-9c13-4626d65c68fa-utilities\") pod \"certified-operators-wn4hg\" (UID: \"901259b6-1c9d-49ca-9c13-4626d65c68fa\") " pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.750331 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz5px\" (UniqueName: \"kubernetes.io/projected/901259b6-1c9d-49ca-9c13-4626d65c68fa-kube-api-access-mz5px\") pod \"certified-operators-wn4hg\" (UID: \"901259b6-1c9d-49ca-9c13-4626d65c68fa\") " pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.750389 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/901259b6-1c9d-49ca-9c13-4626d65c68fa-catalog-content\") pod \"certified-operators-wn4hg\" (UID: \"901259b6-1c9d-49ca-9c13-4626d65c68fa\") " pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.752790 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h7rg4"] Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.754135 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.756692 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.762043 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h7rg4"] Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.851558 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/901259b6-1c9d-49ca-9c13-4626d65c68fa-utilities\") pod \"certified-operators-wn4hg\" (UID: \"901259b6-1c9d-49ca-9c13-4626d65c68fa\") " pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.851615 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz5px\" (UniqueName: \"kubernetes.io/projected/901259b6-1c9d-49ca-9c13-4626d65c68fa-kube-api-access-mz5px\") pod \"certified-operators-wn4hg\" (UID: \"901259b6-1c9d-49ca-9c13-4626d65c68fa\") " pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.851666 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/901259b6-1c9d-49ca-9c13-4626d65c68fa-catalog-content\") pod \"certified-operators-wn4hg\" (UID: \"901259b6-1c9d-49ca-9c13-4626d65c68fa\") " pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.852401 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/901259b6-1c9d-49ca-9c13-4626d65c68fa-catalog-content\") pod \"certified-operators-wn4hg\" (UID: \"901259b6-1c9d-49ca-9c13-4626d65c68fa\") " pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.852437 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/901259b6-1c9d-49ca-9c13-4626d65c68fa-utilities\") pod \"certified-operators-wn4hg\" (UID: \"901259b6-1c9d-49ca-9c13-4626d65c68fa\") " pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.885453 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz5px\" (UniqueName: \"kubernetes.io/projected/901259b6-1c9d-49ca-9c13-4626d65c68fa-kube-api-access-mz5px\") pod \"certified-operators-wn4hg\" (UID: \"901259b6-1c9d-49ca-9c13-4626d65c68fa\") " pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.953080 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce90ce22-0632-4cec-bb6e-4c85b78b1833-utilities\") pod \"community-operators-h7rg4\" (UID: \"ce90ce22-0632-4cec-bb6e-4c85b78b1833\") " pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.953149 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5j6w\" (UniqueName: \"kubernetes.io/projected/ce90ce22-0632-4cec-bb6e-4c85b78b1833-kube-api-access-h5j6w\") pod \"community-operators-h7rg4\" (UID: 
\"ce90ce22-0632-4cec-bb6e-4c85b78b1833\") " pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:38 crc kubenswrapper[4723]: I0309 13:04:38.953204 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce90ce22-0632-4cec-bb6e-4c85b78b1833-catalog-content\") pod \"community-operators-h7rg4\" (UID: \"ce90ce22-0632-4cec-bb6e-4c85b78b1833\") " pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.055165 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce90ce22-0632-4cec-bb6e-4c85b78b1833-catalog-content\") pod \"community-operators-h7rg4\" (UID: \"ce90ce22-0632-4cec-bb6e-4c85b78b1833\") " pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.055516 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce90ce22-0632-4cec-bb6e-4c85b78b1833-utilities\") pod \"community-operators-h7rg4\" (UID: \"ce90ce22-0632-4cec-bb6e-4c85b78b1833\") " pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.055632 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5j6w\" (UniqueName: \"kubernetes.io/projected/ce90ce22-0632-4cec-bb6e-4c85b78b1833-kube-api-access-h5j6w\") pod \"community-operators-h7rg4\" (UID: \"ce90ce22-0632-4cec-bb6e-4c85b78b1833\") " pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.059167 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce90ce22-0632-4cec-bb6e-4c85b78b1833-catalog-content\") pod \"community-operators-h7rg4\" (UID: \"ce90ce22-0632-4cec-bb6e-4c85b78b1833\") " pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.060718 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce90ce22-0632-4cec-bb6e-4c85b78b1833-utilities\") pod \"community-operators-h7rg4\" (UID: \"ce90ce22-0632-4cec-bb6e-4c85b78b1833\") " pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.096031 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5j6w\" (UniqueName: \"kubernetes.io/projected/ce90ce22-0632-4cec-bb6e-4c85b78b1833-kube-api-access-h5j6w\") pod \"community-operators-h7rg4\" (UID: \"ce90ce22-0632-4cec-bb6e-4c85b78b1833\") " pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.173135 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.369602 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:39 crc kubenswrapper[4723]: W0309 13:04:39.614114 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod901259b6_1c9d_49ca_9c13_4626d65c68fa.slice/crio-11575cca6b1d501793eb4a6af9339ec3c81c6a66c94e0c2f7c75db973048cdd6 WatchSource:0}: Error finding container 11575cca6b1d501793eb4a6af9339ec3c81c6a66c94e0c2f7c75db973048cdd6: Status 404 returned error can't find the container with id 11575cca6b1d501793eb4a6af9339ec3c81c6a66c94e0c2f7c75db973048cdd6 Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.622533 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wn4hg"] Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.719222 4723 generic.go:334] "Generic (PLEG): container finished" podID="0b462749-ee4f-4661-8a3a-06e721ef51a8" containerID="0636cc64c48395648daf951b3c3919fc570b6bb3e17c8a5dd064fdf56dac13ff" exitCode=0 Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.719330 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9xzz" event={"ID":"0b462749-ee4f-4661-8a3a-06e721ef51a8","Type":"ContainerDied","Data":"0636cc64c48395648daf951b3c3919fc570b6bb3e17c8a5dd064fdf56dac13ff"} Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.723023 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwn46" event={"ID":"98195455-05c0-408c-b3e2-728b991eee12","Type":"ContainerStarted","Data":"01e7254714c2fabbb46348d2cc7bef03189ef811c3f876031c72e02e5d995c09"} Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.726947 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn4hg" event={"ID":"901259b6-1c9d-49ca-9c13-4626d65c68fa","Type":"ContainerStarted","Data":"11575cca6b1d501793eb4a6af9339ec3c81c6a66c94e0c2f7c75db973048cdd6"} Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.760691 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vwn46" podStartSLOduration=2.304163973 podStartE2EDuration="3.760671662s" podCreationTimestamp="2026-03-09 13:04:36 +0000 UTC" firstStartedPulling="2026-03-09 13:04:37.705510682 +0000 UTC m=+351.719978222" lastFinishedPulling="2026-03-09 13:04:39.162018371 +0000 UTC m=+353.176485911" observedRunningTime="2026-03-09 13:04:39.759437229 +0000 UTC m=+353.773904779" watchObservedRunningTime="2026-03-09 13:04:39.760671662 +0000 UTC m=+353.775139202" Mar 09 13:04:39 crc kubenswrapper[4723]: I0309 13:04:39.786398 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h7rg4"] Mar 09 13:04:39 crc kubenswrapper[4723]: E0309 13:04:39.825353 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod901259b6_1c9d_49ca_9c13_4626d65c68fa.slice/crio-e1acb38f561ced7d19841d5ace0478f2fe7ce6321a35ac27bf81d2e613eba9d3.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:04:39 crc kubenswrapper[4723]: W0309 13:04:39.826226 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce90ce22_0632_4cec_bb6e_4c85b78b1833.slice/crio-29b49e3b3007ff9232ae99819133c65150714311c5d179c00f47701e79a8fd64 WatchSource:0}: 
Error finding container 29b49e3b3007ff9232ae99819133c65150714311c5d179c00f47701e79a8fd64: Status 404 returned error can't find the container with id 29b49e3b3007ff9232ae99819133c65150714311c5d179c00f47701e79a8fd64 Mar 09 13:04:40 crc kubenswrapper[4723]: I0309 13:04:40.733281 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9xzz" event={"ID":"0b462749-ee4f-4661-8a3a-06e721ef51a8","Type":"ContainerStarted","Data":"a74279d7cc8e601aedb5491987542313acbe7a465b53548d3f299418ec1240a9"} Mar 09 13:04:40 crc kubenswrapper[4723]: I0309 13:04:40.734088 4723 generic.go:334] "Generic (PLEG): container finished" podID="901259b6-1c9d-49ca-9c13-4626d65c68fa" containerID="e1acb38f561ced7d19841d5ace0478f2fe7ce6321a35ac27bf81d2e613eba9d3" exitCode=0 Mar 09 13:04:40 crc kubenswrapper[4723]: I0309 13:04:40.734143 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn4hg" event={"ID":"901259b6-1c9d-49ca-9c13-4626d65c68fa","Type":"ContainerDied","Data":"e1acb38f561ced7d19841d5ace0478f2fe7ce6321a35ac27bf81d2e613eba9d3"} Mar 09 13:04:40 crc kubenswrapper[4723]: I0309 13:04:40.735260 4723 generic.go:334] "Generic (PLEG): container finished" podID="ce90ce22-0632-4cec-bb6e-4c85b78b1833" containerID="263726b0eac4cedf204b7a1163b19a2c29599ee5515969e81a400fdadea3c902" exitCode=0 Mar 09 13:04:40 crc kubenswrapper[4723]: I0309 13:04:40.735292 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7rg4" event={"ID":"ce90ce22-0632-4cec-bb6e-4c85b78b1833","Type":"ContainerDied","Data":"263726b0eac4cedf204b7a1163b19a2c29599ee5515969e81a400fdadea3c902"} Mar 09 13:04:40 crc kubenswrapper[4723]: I0309 13:04:40.735319 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7rg4" event={"ID":"ce90ce22-0632-4cec-bb6e-4c85b78b1833","Type":"ContainerStarted","Data":"29b49e3b3007ff9232ae99819133c65150714311c5d179c00f47701e79a8fd64"} Mar 09 13:04:40 crc kubenswrapper[4723]: I0309 13:04:40.752373 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r9xzz" podStartSLOduration=2.298407891 podStartE2EDuration="4.752352668s" podCreationTimestamp="2026-03-09 13:04:36 +0000 UTC" firstStartedPulling="2026-03-09 13:04:37.702529603 +0000 UTC m=+351.716997143" lastFinishedPulling="2026-03-09 13:04:40.15647438 +0000 UTC m=+354.170941920" observedRunningTime="2026-03-09 13:04:40.74750502 +0000 UTC m=+354.761972570" watchObservedRunningTime="2026-03-09 13:04:40.752352668 +0000 UTC m=+354.766820208" Mar 09 13:04:41 crc kubenswrapper[4723]: I0309 13:04:41.744085 4723 generic.go:334] "Generic (PLEG): container finished" podID="ce90ce22-0632-4cec-bb6e-4c85b78b1833" containerID="1c419abeee7b7b92e75d4fed67f72c0a0a3f91a80b58671fb1cdc5e79110264d" exitCode=0 Mar 09 13:04:41 crc kubenswrapper[4723]: I0309 13:04:41.744986 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7rg4" event={"ID":"ce90ce22-0632-4cec-bb6e-4c85b78b1833","Type":"ContainerDied","Data":"1c419abeee7b7b92e75d4fed67f72c0a0a3f91a80b58671fb1cdc5e79110264d"} Mar 09 13:04:42 crc kubenswrapper[4723]: I0309 13:04:42.752150 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7rg4" event={"ID":"ce90ce22-0632-4cec-bb6e-4c85b78b1833","Type":"ContainerStarted","Data":"0cdeed54b1694870ebcffa01f973a3ce5570f9e0c78750f958c981952b47dec2"} Mar 09 
13:04:42 crc kubenswrapper[4723]: I0309 13:04:42.754778 4723 generic.go:334] "Generic (PLEG): container finished" podID="901259b6-1c9d-49ca-9c13-4626d65c68fa" containerID="6a719e1de9cdf24ce33ecb992d036ac8e5aca6450ad4d81b5b1642ed1cb8f53e" exitCode=0 Mar 09 13:04:42 crc kubenswrapper[4723]: I0309 13:04:42.754818 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn4hg" event={"ID":"901259b6-1c9d-49ca-9c13-4626d65c68fa","Type":"ContainerDied","Data":"6a719e1de9cdf24ce33ecb992d036ac8e5aca6450ad4d81b5b1642ed1cb8f53e"} Mar 09 13:04:42 crc kubenswrapper[4723]: I0309 13:04:42.771904 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h7rg4" podStartSLOduration=3.34005105 podStartE2EDuration="4.771888977s" podCreationTimestamp="2026-03-09 13:04:38 +0000 UTC" firstStartedPulling="2026-03-09 13:04:40.736956691 +0000 UTC m=+354.751424221" lastFinishedPulling="2026-03-09 13:04:42.168794598 +0000 UTC m=+356.183262148" observedRunningTime="2026-03-09 13:04:42.768436416 +0000 UTC m=+356.782903956" watchObservedRunningTime="2026-03-09 13:04:42.771888977 +0000 UTC m=+356.786356517" Mar 09 13:04:44 crc kubenswrapper[4723]: I0309 13:04:44.699893 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 09 13:04:44 crc kubenswrapper[4723]: I0309 13:04:44.766115 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn4hg" event={"ID":"901259b6-1c9d-49ca-9c13-4626d65c68fa","Type":"ContainerStarted","Data":"5bc5a26a7410120617b9a6a0be48e3b7290a20d12768910995e1af991131e53d"} Mar 09 13:04:44 crc kubenswrapper[4723]: I0309 13:04:44.782895 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wn4hg" podStartSLOduration=3.803424465 podStartE2EDuration="6.782874359s" podCreationTimestamp="2026-03-09 13:04:38 +0000 UTC" firstStartedPulling="2026-03-09 13:04:40.736838698 +0000 UTC m=+354.751306238" lastFinishedPulling="2026-03-09 13:04:43.716288592 +0000 UTC m=+357.730756132" observedRunningTime="2026-03-09 13:04:44.782333125 +0000 UTC m=+358.796800675" watchObservedRunningTime="2026-03-09 13:04:44.782874359 +0000 UTC m=+358.797341899" Mar 09 13:04:46 crc kubenswrapper[4723]: I0309 13:04:46.471610 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:46 crc kubenswrapper[4723]: I0309 13:04:46.472615 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:46 crc kubenswrapper[4723]: I0309 13:04:46.684511 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:46 crc kubenswrapper[4723]: I0309 13:04:46.684562 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:46 crc kubenswrapper[4723]: I0309 13:04:46.735303 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:46 crc kubenswrapper[4723]: I0309 13:04:46.823724 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vwn46" Mar 09 13:04:47 crc kubenswrapper[4723]: I0309 13:04:47.521853 
4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r9xzz" podUID="0b462749-ee4f-4661-8a3a-06e721ef51a8" containerName="registry-server" probeResult="failure" output=< Mar 09 13:04:47 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 13:04:47 crc kubenswrapper[4723]: > Mar 09 13:04:49 crc kubenswrapper[4723]: I0309 13:04:49.173610 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:49 crc kubenswrapper[4723]: I0309 13:04:49.174181 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:49 crc kubenswrapper[4723]: I0309 13:04:49.220164 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:49 crc kubenswrapper[4723]: I0309 13:04:49.369892 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:49 crc kubenswrapper[4723]: I0309 13:04:49.371005 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:49 crc kubenswrapper[4723]: I0309 13:04:49.411935 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:49 crc kubenswrapper[4723]: I0309 13:04:49.829209 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wn4hg" Mar 09 13:04:49 crc kubenswrapper[4723]: I0309 13:04:49.829262 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h7rg4" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.407986 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8574658c57-bbjt8"] Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.408714 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" podUID="6be8b05e-d252-470d-bf25-5bd6102d08f7" containerName="controller-manager" containerID="cri-o://c0817dc9dd5a71dd6b26ae2f68f87d0dc40fbdcfbdc789ec8a46fcaa79e9c548" gracePeriod=30 Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.417245 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r"] Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.417793 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" podUID="ede99d46-5d03-42b4-bb3f-76a952e4e9e1" containerName="route-controller-manager" containerID="cri-o://3b57386b307fe14f2c250d7291c8986af7055c39429cea230334f2ec4a055b79" gracePeriod=30 Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.519588 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.563249 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r9xzz" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.836366 4723 generic.go:334] "Generic (PLEG): 
container finished" podID="ede99d46-5d03-42b4-bb3f-76a952e4e9e1" containerID="3b57386b307fe14f2c250d7291c8986af7055c39429cea230334f2ec4a055b79" exitCode=0 Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.836458 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" event={"ID":"ede99d46-5d03-42b4-bb3f-76a952e4e9e1","Type":"ContainerDied","Data":"3b57386b307fe14f2c250d7291c8986af7055c39429cea230334f2ec4a055b79"} Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.836699 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" event={"ID":"ede99d46-5d03-42b4-bb3f-76a952e4e9e1","Type":"ContainerDied","Data":"a0623ff2a79e14297c47673d7d272cc3fc0ba362611714e5338bec158e235860"} Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.836715 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0623ff2a79e14297c47673d7d272cc3fc0ba362611714e5338bec158e235860" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.838455 4723 generic.go:334] "Generic (PLEG): container finished" podID="6be8b05e-d252-470d-bf25-5bd6102d08f7" containerID="c0817dc9dd5a71dd6b26ae2f68f87d0dc40fbdcfbdc789ec8a46fcaa79e9c548" exitCode=0 Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.839185 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" event={"ID":"6be8b05e-d252-470d-bf25-5bd6102d08f7","Type":"ContainerDied","Data":"c0817dc9dd5a71dd6b26ae2f68f87d0dc40fbdcfbdc789ec8a46fcaa79e9c548"} Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.880905 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.885598 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.898272 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6be8b05e-d252-470d-bf25-5bd6102d08f7-serving-cert\") pod \"6be8b05e-d252-470d-bf25-5bd6102d08f7\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.900278 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-config\") pod \"6be8b05e-d252-470d-bf25-5bd6102d08f7\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.900389 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-config\") pod \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.900414 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjrsj\" (UniqueName: \"kubernetes.io/projected/6be8b05e-d252-470d-bf25-5bd6102d08f7-kube-api-access-vjrsj\") pod \"6be8b05e-d252-470d-bf25-5bd6102d08f7\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.900453 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjqgd\" (UniqueName: \"kubernetes.io/projected/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-kube-api-access-zjqgd\") pod \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.900480 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-serving-cert\") pod \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.900525 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-client-ca\") pod \"6be8b05e-d252-470d-bf25-5bd6102d08f7\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.902332 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-config" (OuterVolumeSpecName: "config") pod "ede99d46-5d03-42b4-bb3f-76a952e4e9e1" (UID: "ede99d46-5d03-42b4-bb3f-76a952e4e9e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.902435 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-config" (OuterVolumeSpecName: "config") pod "6be8b05e-d252-470d-bf25-5bd6102d08f7" (UID: "6be8b05e-d252-470d-bf25-5bd6102d08f7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.903096 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-client-ca" (OuterVolumeSpecName: "client-ca") pod "6be8b05e-d252-470d-bf25-5bd6102d08f7" (UID: "6be8b05e-d252-470d-bf25-5bd6102d08f7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.904027 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-proxy-ca-bundles\") pod \"6be8b05e-d252-470d-bf25-5bd6102d08f7\" (UID: \"6be8b05e-d252-470d-bf25-5bd6102d08f7\") " Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.904084 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-client-ca\") pod \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\" (UID: \"ede99d46-5d03-42b4-bb3f-76a952e4e9e1\") " Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.904553 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.904572 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.904584 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.905151 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6be8b05e-d252-470d-bf25-5bd6102d08f7" (UID: "6be8b05e-d252-470d-bf25-5bd6102d08f7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.905640 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-client-ca" (OuterVolumeSpecName: "client-ca") pod "ede99d46-5d03-42b4-bb3f-76a952e4e9e1" (UID: "ede99d46-5d03-42b4-bb3f-76a952e4e9e1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.905998 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6be8b05e-d252-470d-bf25-5bd6102d08f7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6be8b05e-d252-470d-bf25-5bd6102d08f7" (UID: "6be8b05e-d252-470d-bf25-5bd6102d08f7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.907472 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ede99d46-5d03-42b4-bb3f-76a952e4e9e1" (UID: "ede99d46-5d03-42b4-bb3f-76a952e4e9e1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.908821 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6be8b05e-d252-470d-bf25-5bd6102d08f7-kube-api-access-vjrsj" (OuterVolumeSpecName: "kube-api-access-vjrsj") pod "6be8b05e-d252-470d-bf25-5bd6102d08f7" (UID: "6be8b05e-d252-470d-bf25-5bd6102d08f7"). InnerVolumeSpecName "kube-api-access-vjrsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:04:56 crc kubenswrapper[4723]: I0309 13:04:56.912200 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-kube-api-access-zjqgd" (OuterVolumeSpecName: "kube-api-access-zjqgd") pod "ede99d46-5d03-42b4-bb3f-76a952e4e9e1" (UID: "ede99d46-5d03-42b4-bb3f-76a952e4e9e1"). InnerVolumeSpecName "kube-api-access-zjqgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.005649 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjqgd\" (UniqueName: \"kubernetes.io/projected/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-kube-api-access-zjqgd\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.005689 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.005699 4723 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6be8b05e-d252-470d-bf25-5bd6102d08f7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.005707 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede99d46-5d03-42b4-bb3f-76a952e4e9e1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.005715 4723 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6be8b05e-d252-470d-bf25-5bd6102d08f7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.005724 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjrsj\" (UniqueName: \"kubernetes.io/projected/6be8b05e-d252-470d-bf25-5bd6102d08f7-kube-api-access-vjrsj\") on node \"crc\" DevicePath \"\"" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.647003 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4"] Mar 09 13:04:57 crc kubenswrapper[4723]: E0309 13:04:57.648465 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ede99d46-5d03-42b4-bb3f-76a952e4e9e1" containerName="route-controller-manager" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.648501 4723 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ede99d46-5d03-42b4-bb3f-76a952e4e9e1" containerName="route-controller-manager" Mar 09 13:04:57 crc kubenswrapper[4723]: E0309 13:04:57.648533 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be8b05e-d252-470d-bf25-5bd6102d08f7" containerName="controller-manager" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.648543 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be8b05e-d252-470d-bf25-5bd6102d08f7" containerName="controller-manager" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.648958 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="6be8b05e-d252-470d-bf25-5bd6102d08f7" containerName="controller-manager" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.648984 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="ede99d46-5d03-42b4-bb3f-76a952e4e9e1" containerName="route-controller-manager" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.650541 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.665535 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-774cb675cc-hwvwx"] Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.669340 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.672395 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774cb675cc-hwvwx"] Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.687860 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4"] Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.714718 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f52a68c7-f2a2-4c16-a45b-b821debecd6d-proxy-ca-bundles\") pod \"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.714780 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bfc8cb7a-3df5-4dd3-8520-82316314e76b-client-ca\") pod \"route-controller-manager-7757f9dd75-n7jz4\" (UID: \"bfc8cb7a-3df5-4dd3-8520-82316314e76b\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.714905 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtfr7\" (UniqueName: \"kubernetes.io/projected/f52a68c7-f2a2-4c16-a45b-b821debecd6d-kube-api-access-gtfr7\") pod \"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.714954 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfc8cb7a-3df5-4dd3-8520-82316314e76b-serving-cert\") pod 
\"route-controller-manager-7757f9dd75-n7jz4\" (UID: \"bfc8cb7a-3df5-4dd3-8520-82316314e76b\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.714988 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfc8cb7a-3df5-4dd3-8520-82316314e76b-config\") pod \"route-controller-manager-7757f9dd75-n7jz4\" (UID: \"bfc8cb7a-3df5-4dd3-8520-82316314e76b\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.715035 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f52a68c7-f2a2-4c16-a45b-b821debecd6d-client-ca\") pod \"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.715057 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5gzg\" (UniqueName: \"kubernetes.io/projected/bfc8cb7a-3df5-4dd3-8520-82316314e76b-kube-api-access-q5gzg\") pod \"route-controller-manager-7757f9dd75-n7jz4\" (UID: \"bfc8cb7a-3df5-4dd3-8520-82316314e76b\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.715107 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f52a68c7-f2a2-4c16-a45b-b821debecd6d-serving-cert\") pod \"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.715164 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f52a68c7-f2a2-4c16-a45b-b821debecd6d-config\") pod \"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.816396 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f52a68c7-f2a2-4c16-a45b-b821debecd6d-proxy-ca-bundles\") pod \"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.816696 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bfc8cb7a-3df5-4dd3-8520-82316314e76b-client-ca\") pod \"route-controller-manager-7757f9dd75-n7jz4\" (UID: \"bfc8cb7a-3df5-4dd3-8520-82316314e76b\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.816860 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtfr7\" (UniqueName: \"kubernetes.io/projected/f52a68c7-f2a2-4c16-a45b-b821debecd6d-kube-api-access-gtfr7\") pod 
\"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.816980 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfc8cb7a-3df5-4dd3-8520-82316314e76b-serving-cert\") pod \"route-controller-manager-7757f9dd75-n7jz4\" (UID: \"bfc8cb7a-3df5-4dd3-8520-82316314e76b\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.817065 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfc8cb7a-3df5-4dd3-8520-82316314e76b-config\") pod \"route-controller-manager-7757f9dd75-n7jz4\" (UID: \"bfc8cb7a-3df5-4dd3-8520-82316314e76b\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.817191 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f52a68c7-f2a2-4c16-a45b-b821debecd6d-client-ca\") pod \"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.818151 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5gzg\" (UniqueName: \"kubernetes.io/projected/bfc8cb7a-3df5-4dd3-8520-82316314e76b-kube-api-access-q5gzg\") pod \"route-controller-manager-7757f9dd75-n7jz4\" (UID: \"bfc8cb7a-3df5-4dd3-8520-82316314e76b\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.818640 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f52a68c7-f2a2-4c16-a45b-b821debecd6d-serving-cert\") pod \"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.818786 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f52a68c7-f2a2-4c16-a45b-b821debecd6d-config\") pod \"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.817489 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bfc8cb7a-3df5-4dd3-8520-82316314e76b-client-ca\") pod \"route-controller-manager-7757f9dd75-n7jz4\" (UID: \"bfc8cb7a-3df5-4dd3-8520-82316314e76b\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.818421 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfc8cb7a-3df5-4dd3-8520-82316314e76b-config\") pod \"route-controller-manager-7757f9dd75-n7jz4\" (UID: \"bfc8cb7a-3df5-4dd3-8520-82316314e76b\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 
13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.817545 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f52a68c7-f2a2-4c16-a45b-b821debecd6d-proxy-ca-bundles\") pod \"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.818093 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f52a68c7-f2a2-4c16-a45b-b821debecd6d-client-ca\") pod \"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.820223 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfc8cb7a-3df5-4dd3-8520-82316314e76b-serving-cert\") pod \"route-controller-manager-7757f9dd75-n7jz4\" (UID: \"bfc8cb7a-3df5-4dd3-8520-82316314e76b\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.821230 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f52a68c7-f2a2-4c16-a45b-b821debecd6d-config\") pod \"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.821474 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f52a68c7-f2a2-4c16-a45b-b821debecd6d-serving-cert\") pod \"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.836536 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtfr7\" (UniqueName: \"kubernetes.io/projected/f52a68c7-f2a2-4c16-a45b-b821debecd6d-kube-api-access-gtfr7\") pod \"controller-manager-774cb675cc-hwvwx\" (UID: \"f52a68c7-f2a2-4c16-a45b-b821debecd6d\") " pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.844192 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.844494 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" event={"ID":"6be8b05e-d252-470d-bf25-5bd6102d08f7","Type":"ContainerDied","Data":"33a21e1e9e773393661594d23b24dc59b4318e2cde821b192b1ba9b76ab97920"} Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.844562 4723 scope.go:117] "RemoveContainer" containerID="c0817dc9dd5a71dd6b26ae2f68f87d0dc40fbdcfbdc789ec8a46fcaa79e9c548" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.845102 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8574658c57-bbjt8" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.847657 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5gzg\" (UniqueName: \"kubernetes.io/projected/bfc8cb7a-3df5-4dd3-8520-82316314e76b-kube-api-access-q5gzg\") pod \"route-controller-manager-7757f9dd75-n7jz4\" (UID: \"bfc8cb7a-3df5-4dd3-8520-82316314e76b\") " pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.909181 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r"] Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.913093 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cf876879c-jn49r"] Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.923660 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8574658c57-bbjt8"] Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.929062 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8574658c57-bbjt8"] Mar 09 13:04:57 crc kubenswrapper[4723]: I0309 13:04:57.991041 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:04:58 crc kubenswrapper[4723]: I0309 13:04:58.001099 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:58 crc kubenswrapper[4723]: I0309 13:04:58.378775 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774cb675cc-hwvwx"] Mar 09 13:04:58 crc kubenswrapper[4723]: W0309 13:04:58.383901 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf52a68c7_f2a2_4c16_a45b_b821debecd6d.slice/crio-1b4d9248f190fdbdac62a46e4246ac63244a8be5fc3d7b06f316e902dffc67a5 WatchSource:0}: Error finding container 1b4d9248f190fdbdac62a46e4246ac63244a8be5fc3d7b06f316e902dffc67a5: Status 404 returned error can't find the container with id 1b4d9248f190fdbdac62a46e4246ac63244a8be5fc3d7b06f316e902dffc67a5 Mar 09 13:04:58 crc kubenswrapper[4723]: I0309 13:04:58.421839 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4"] Mar 09 13:04:58 crc kubenswrapper[4723]: W0309 13:04:58.435433 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfc8cb7a_3df5_4dd3_8520_82316314e76b.slice/crio-f9d0189e2439c09883f6daea7b38e7ec725629d928a56740273d746c9f72bc73 WatchSource:0}: Error finding container f9d0189e2439c09883f6daea7b38e7ec725629d928a56740273d746c9f72bc73: Status 404 returned error can't find the container with id f9d0189e2439c09883f6daea7b38e7ec725629d928a56740273d746c9f72bc73 Mar 09 13:04:58 crc kubenswrapper[4723]: I0309 13:04:58.851011 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" 
event={"ID":"bfc8cb7a-3df5-4dd3-8520-82316314e76b","Type":"ContainerStarted","Data":"e8a4376002e9cdf09ffd0e2ee4e450b839cff7f5d907c91e523d2bbc5296bacc"} Mar 09 13:04:58 crc kubenswrapper[4723]: I0309 13:04:58.851052 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" event={"ID":"bfc8cb7a-3df5-4dd3-8520-82316314e76b","Type":"ContainerStarted","Data":"f9d0189e2439c09883f6daea7b38e7ec725629d928a56740273d746c9f72bc73"} Mar 09 13:04:58 crc kubenswrapper[4723]: I0309 13:04:58.851229 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:04:58 crc kubenswrapper[4723]: I0309 13:04:58.852835 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" event={"ID":"f52a68c7-f2a2-4c16-a45b-b821debecd6d","Type":"ContainerStarted","Data":"9926e9b2436ede65b865baf0c89a935a8a0f1e4c6e755d249aa36f187081c72d"} Mar 09 13:04:58 crc kubenswrapper[4723]: I0309 13:04:58.852885 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" event={"ID":"f52a68c7-f2a2-4c16-a45b-b821debecd6d","Type":"ContainerStarted","Data":"1b4d9248f190fdbdac62a46e4246ac63244a8be5fc3d7b06f316e902dffc67a5"} Mar 09 13:04:58 crc kubenswrapper[4723]: I0309 13:04:58.853035 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:58 crc kubenswrapper[4723]: I0309 13:04:58.868018 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" podStartSLOduration=2.867998353 podStartE2EDuration="2.867998353s" podCreationTimestamp="2026-03-09 13:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:04:58.864344316 +0000 UTC m=+372.878811856" watchObservedRunningTime="2026-03-09 13:04:58.867998353 +0000 UTC m=+372.882465893" Mar 09 13:04:58 crc kubenswrapper[4723]: I0309 13:04:58.868094 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 13:04:58 crc kubenswrapper[4723]: I0309 13:04:58.881412 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" podStartSLOduration=2.881394337 podStartE2EDuration="2.881394337s" podCreationTimestamp="2026-03-09 13:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:04:58.878021288 +0000 UTC m=+372.892488848" watchObservedRunningTime="2026-03-09 13:04:58.881394337 +0000 UTC m=+372.895861877" Mar 09 13:04:58 crc kubenswrapper[4723]: I0309 13:04:58.888094 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6be8b05e-d252-470d-bf25-5bd6102d08f7" path="/var/lib/kubelet/pods/6be8b05e-d252-470d-bf25-5bd6102d08f7/volumes" Mar 09 13:04:58 crc kubenswrapper[4723]: I0309 13:04:58.888746 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ede99d46-5d03-42b4-bb3f-76a952e4e9e1" path="/var/lib/kubelet/pods/ede99d46-5d03-42b4-bb3f-76a952e4e9e1/volumes" Mar 09 13:04:58 crc kubenswrapper[4723]: 
I0309 13:04:58.981079 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.129010 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj"] Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.130157 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.132188 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.133000 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.133045 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.133143 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.133656 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.147379 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj"] Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.213634 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nl7w\" (UniqueName: \"kubernetes.io/projected/8cbc27b5-1a55-41cc-a66f-f85a279c348e-kube-api-access-8nl7w\") pod \"cluster-monitoring-operator-6d5b84845-snmgj\" (UID: \"8cbc27b5-1a55-41cc-a66f-f85a279c348e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.213762 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cbc27b5-1a55-41cc-a66f-f85a279c348e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-snmgj\" (UID: \"8cbc27b5-1a55-41cc-a66f-f85a279c348e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.213841 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8cbc27b5-1a55-41cc-a66f-f85a279c348e-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-snmgj\" (UID: \"8cbc27b5-1a55-41cc-a66f-f85a279c348e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.315033 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nl7w\" (UniqueName: \"kubernetes.io/projected/8cbc27b5-1a55-41cc-a66f-f85a279c348e-kube-api-access-8nl7w\") pod \"cluster-monitoring-operator-6d5b84845-snmgj\" (UID: \"8cbc27b5-1a55-41cc-a66f-f85a279c348e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj" Mar 09 13:05:12 crc 
kubenswrapper[4723]: I0309 13:05:12.315097 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cbc27b5-1a55-41cc-a66f-f85a279c348e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-snmgj\" (UID: \"8cbc27b5-1a55-41cc-a66f-f85a279c348e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.315134 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8cbc27b5-1a55-41cc-a66f-f85a279c348e-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-snmgj\" (UID: \"8cbc27b5-1a55-41cc-a66f-f85a279c348e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.316111 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8cbc27b5-1a55-41cc-a66f-f85a279c348e-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-snmgj\" (UID: \"8cbc27b5-1a55-41cc-a66f-f85a279c348e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.328919 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cbc27b5-1a55-41cc-a66f-f85a279c348e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-snmgj\" (UID: \"8cbc27b5-1a55-41cc-a66f-f85a279c348e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.333832 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nl7w\" (UniqueName: \"kubernetes.io/projected/8cbc27b5-1a55-41cc-a66f-f85a279c348e-kube-api-access-8nl7w\") pod \"cluster-monitoring-operator-6d5b84845-snmgj\" (UID: \"8cbc27b5-1a55-41cc-a66f-f85a279c348e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj" Mar 09 13:05:12 crc kubenswrapper[4723]: I0309 13:05:12.491480 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj" Mar 09 13:05:13 crc kubenswrapper[4723]: I0309 13:05:13.452129 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj"] Mar 09 13:05:13 crc kubenswrapper[4723]: W0309 13:05:13.461925 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cbc27b5_1a55_41cc_a66f_f85a279c348e.slice/crio-c5b3642775d57081932ecb909e3865812ae02b745fc2ee903bb15c47bffb1072 WatchSource:0}: Error finding container c5b3642775d57081932ecb909e3865812ae02b745fc2ee903bb15c47bffb1072: Status 404 returned error can't find the container with id c5b3642775d57081932ecb909e3865812ae02b745fc2ee903bb15c47bffb1072 Mar 09 13:05:13 crc kubenswrapper[4723]: I0309 13:05:13.931690 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj" event={"ID":"8cbc27b5-1a55-41cc-a66f-f85a279c348e","Type":"ContainerStarted","Data":"c5b3642775d57081932ecb909e3865812ae02b745fc2ee903bb15c47bffb1072"} Mar 09 13:05:15 crc kubenswrapper[4723]: I0309 13:05:15.942743 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj" event={"ID":"8cbc27b5-1a55-41cc-a66f-f85a279c348e","Type":"ContainerStarted","Data":"3248a469b2d6cd39f85d0b5c432c8fca27613020306becf2987d73efcbb2a447"} Mar 09 13:05:15 crc kubenswrapper[4723]: I0309 13:05:15.958426 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-snmgj" podStartSLOduration=1.9033432019999998 podStartE2EDuration="3.95840824s" podCreationTimestamp="2026-03-09 13:05:12 +0000 UTC" firstStartedPulling="2026-03-09 13:05:13.465578599 +0000 UTC m=+387.480046149" lastFinishedPulling="2026-03-09 13:05:15.520643647 +0000 UTC m=+389.535111187" observedRunningTime="2026-03-09 13:05:15.956451424 +0000 UTC m=+389.970918984" watchObservedRunningTime="2026-03-09 13:05:15.95840824 +0000 UTC m=+389.972875790" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.120450 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nm8sh"] Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.121337 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.132919 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl"] Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.133853 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.139253 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.139295 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-jfxht" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.146088 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nm8sh"] Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.164586 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl"] Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.176594 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da5c4e91-8006-4b10-a03e-65e7b318345d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.176666 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da5c4e91-8006-4b10-a03e-65e7b318345d-bound-sa-token\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.176710 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvck8\" (UniqueName: \"kubernetes.io/projected/da5c4e91-8006-4b10-a03e-65e7b318345d-kube-api-access-fvck8\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.176728 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da5c4e91-8006-4b10-a03e-65e7b318345d-trusted-ca\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.176786 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.176811 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da5c4e91-8006-4b10-a03e-65e7b318345d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc 
kubenswrapper[4723]: I0309 13:05:16.176830 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da5c4e91-8006-4b10-a03e-65e7b318345d-registry-certificates\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.176872 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/2a058d13-df7c-45fc-9c82-83cd7d61ffbd-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-lfdgl\" (UID: \"2a058d13-df7c-45fc-9c82-83cd7d61ffbd\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.176893 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da5c4e91-8006-4b10-a03e-65e7b318345d-registry-tls\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.219505 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.278380 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da5c4e91-8006-4b10-a03e-65e7b318345d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.278452 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da5c4e91-8006-4b10-a03e-65e7b318345d-bound-sa-token\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.278482 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvck8\" (UniqueName: \"kubernetes.io/projected/da5c4e91-8006-4b10-a03e-65e7b318345d-kube-api-access-fvck8\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.278500 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da5c4e91-8006-4b10-a03e-65e7b318345d-trusted-ca\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.278530 4723 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da5c4e91-8006-4b10-a03e-65e7b318345d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.278551 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/2a058d13-df7c-45fc-9c82-83cd7d61ffbd-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-lfdgl\" (UID: \"2a058d13-df7c-45fc-9c82-83cd7d61ffbd\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.278566 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da5c4e91-8006-4b10-a03e-65e7b318345d-registry-certificates\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.278583 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da5c4e91-8006-4b10-a03e-65e7b318345d-registry-tls\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.279262 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da5c4e91-8006-4b10-a03e-65e7b318345d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.280282 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da5c4e91-8006-4b10-a03e-65e7b318345d-registry-certificates\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.280906 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da5c4e91-8006-4b10-a03e-65e7b318345d-trusted-ca\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.287750 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da5c4e91-8006-4b10-a03e-65e7b318345d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.296617 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/2a058d13-df7c-45fc-9c82-83cd7d61ffbd-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-lfdgl\" (UID: 
\"2a058d13-df7c-45fc-9c82-83cd7d61ffbd\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.298768 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da5c4e91-8006-4b10-a03e-65e7b318345d-registry-tls\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.303576 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvck8\" (UniqueName: \"kubernetes.io/projected/da5c4e91-8006-4b10-a03e-65e7b318345d-kube-api-access-fvck8\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.306904 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da5c4e91-8006-4b10-a03e-65e7b318345d-bound-sa-token\") pod \"image-registry-66df7c8f76-nm8sh\" (UID: \"da5c4e91-8006-4b10-a03e-65e7b318345d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.437460 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.458436 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl" Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.916800 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nm8sh"] Mar 09 13:05:16 crc kubenswrapper[4723]: W0309 13:05:16.933123 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda5c4e91_8006_4b10_a03e_65e7b318345d.slice/crio-8c7a9e3d7097582a52ddf2027510a7f542c138a8891b812735afd02a770fdffe WatchSource:0}: Error finding container 8c7a9e3d7097582a52ddf2027510a7f542c138a8891b812735afd02a770fdffe: Status 404 returned error can't find the container with id 8c7a9e3d7097582a52ddf2027510a7f542c138a8891b812735afd02a770fdffe Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.950019 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" event={"ID":"da5c4e91-8006-4b10-a03e-65e7b318345d","Type":"ContainerStarted","Data":"8c7a9e3d7097582a52ddf2027510a7f542c138a8891b812735afd02a770fdffe"} Mar 09 13:05:16 crc kubenswrapper[4723]: I0309 13:05:16.989757 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl"] Mar 09 13:05:16 crc kubenswrapper[4723]: W0309 13:05:16.996394 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a058d13_df7c_45fc_9c82_83cd7d61ffbd.slice/crio-2a4826e02a5f3b99676a506f2fd52485e8bf129589ca2256215cdf5191a553aa WatchSource:0}: Error finding container 2a4826e02a5f3b99676a506f2fd52485e8bf129589ca2256215cdf5191a553aa: Status 404 returned error can't find the container with id 
2a4826e02a5f3b99676a506f2fd52485e8bf129589ca2256215cdf5191a553aa Mar 09 13:05:17 crc kubenswrapper[4723]: I0309 13:05:17.963836 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" event={"ID":"da5c4e91-8006-4b10-a03e-65e7b318345d","Type":"ContainerStarted","Data":"7396c9736500c25bad5863e1de82b67592904d5cbbf47236de5ef4cf0e4eb75d"} Mar 09 13:05:17 crc kubenswrapper[4723]: I0309 13:05:17.964203 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" Mar 09 13:05:17 crc kubenswrapper[4723]: I0309 13:05:17.965118 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl" event={"ID":"2a058d13-df7c-45fc-9c82-83cd7d61ffbd","Type":"ContainerStarted","Data":"2a4826e02a5f3b99676a506f2fd52485e8bf129589ca2256215cdf5191a553aa"} Mar 09 13:05:17 crc kubenswrapper[4723]: I0309 13:05:17.989560 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh" podStartSLOduration=1.989542645 podStartE2EDuration="1.989542645s" podCreationTimestamp="2026-03-09 13:05:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:05:17.987360633 +0000 UTC m=+392.001828193" watchObservedRunningTime="2026-03-09 13:05:17.989542645 +0000 UTC m=+392.004010185" Mar 09 13:05:19 crc kubenswrapper[4723]: I0309 13:05:19.978810 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl" event={"ID":"2a058d13-df7c-45fc-9c82-83cd7d61ffbd","Type":"ContainerStarted","Data":"b124684f72380ad60f10edbf37fb903d68bb0492a1b8300816aa3664ba32e1d4"} Mar 09 13:05:19 crc kubenswrapper[4723]: I0309 13:05:19.979673 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl" Mar 09 13:05:19 crc kubenswrapper[4723]: I0309 13:05:19.984553 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.002409 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl" podStartSLOduration=1.897401265 podStartE2EDuration="4.002383655s" podCreationTimestamp="2026-03-09 13:05:16 +0000 UTC" firstStartedPulling="2026-03-09 13:05:16.999900986 +0000 UTC m=+391.014368526" lastFinishedPulling="2026-03-09 13:05:19.104883376 +0000 UTC m=+393.119350916" observedRunningTime="2026-03-09 13:05:19.998570246 +0000 UTC m=+394.013037786" watchObservedRunningTime="2026-03-09 13:05:20.002383655 +0000 UTC m=+394.016851205" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.194528 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-8nl7t"] Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.196417 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.204129 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.204275 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.204338 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-sn4xm" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.205215 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.208606 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-8nl7t"] Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.328702 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/03e74036-b752-4a9e-9cab-287e8161439b-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-8nl7t\" (UID: \"03e74036-b752-4a9e-9cab-287e8161439b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.328741 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03e74036-b752-4a9e-9cab-287e8161439b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-8nl7t\" (UID: \"03e74036-b752-4a9e-9cab-287e8161439b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.328813 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03e74036-b752-4a9e-9cab-287e8161439b-metrics-client-ca\") pod \"prometheus-operator-db54df47d-8nl7t\" (UID: \"03e74036-b752-4a9e-9cab-287e8161439b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.328849 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knvwj\" (UniqueName: \"kubernetes.io/projected/03e74036-b752-4a9e-9cab-287e8161439b-kube-api-access-knvwj\") pod \"prometheus-operator-db54df47d-8nl7t\" (UID: \"03e74036-b752-4a9e-9cab-287e8161439b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.429775 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/03e74036-b752-4a9e-9cab-287e8161439b-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-8nl7t\" (UID: \"03e74036-b752-4a9e-9cab-287e8161439b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.429888 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/03e74036-b752-4a9e-9cab-287e8161439b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-8nl7t\" (UID: \"03e74036-b752-4a9e-9cab-287e8161439b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.430042 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03e74036-b752-4a9e-9cab-287e8161439b-metrics-client-ca\") pod \"prometheus-operator-db54df47d-8nl7t\" (UID: \"03e74036-b752-4a9e-9cab-287e8161439b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.430108 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knvwj\" (UniqueName: \"kubernetes.io/projected/03e74036-b752-4a9e-9cab-287e8161439b-kube-api-access-knvwj\") pod \"prometheus-operator-db54df47d-8nl7t\" (UID: \"03e74036-b752-4a9e-9cab-287e8161439b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.431058 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03e74036-b752-4a9e-9cab-287e8161439b-metrics-client-ca\") pod \"prometheus-operator-db54df47d-8nl7t\" (UID: \"03e74036-b752-4a9e-9cab-287e8161439b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.438691 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/03e74036-b752-4a9e-9cab-287e8161439b-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-8nl7t\" (UID: \"03e74036-b752-4a9e-9cab-287e8161439b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.441306 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03e74036-b752-4a9e-9cab-287e8161439b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-8nl7t\" (UID: \"03e74036-b752-4a9e-9cab-287e8161439b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.447574 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knvwj\" (UniqueName: \"kubernetes.io/projected/03e74036-b752-4a9e-9cab-287e8161439b-kube-api-access-knvwj\") pod \"prometheus-operator-db54df47d-8nl7t\" (UID: \"03e74036-b752-4a9e-9cab-287e8161439b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.521985 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" Mar 09 13:05:20 crc kubenswrapper[4723]: I0309 13:05:20.981588 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-8nl7t"] Mar 09 13:05:20 crc kubenswrapper[4723]: W0309 13:05:20.982158 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03e74036_b752_4a9e_9cab_287e8161439b.slice/crio-5860b511eda54460718db9579cf68421af536a699627f949b0fa8c758e2a5d74 WatchSource:0}: Error finding container 5860b511eda54460718db9579cf68421af536a699627f949b0fa8c758e2a5d74: Status 404 returned error can't find the container with id 5860b511eda54460718db9579cf68421af536a699627f949b0fa8c758e2a5d74 Mar 09 13:05:21 crc kubenswrapper[4723]: I0309 13:05:21.994060 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" event={"ID":"03e74036-b752-4a9e-9cab-287e8161439b","Type":"ContainerStarted","Data":"5860b511eda54460718db9579cf68421af536a699627f949b0fa8c758e2a5d74"} Mar 09 13:05:24 crc kubenswrapper[4723]: I0309 13:05:24.010653 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" event={"ID":"03e74036-b752-4a9e-9cab-287e8161439b","Type":"ContainerStarted","Data":"96c873fdc2db92b7ccf0c37b5ee71d45dafeb7f0097cc5190a6486a3cfebaaf7"} Mar 09 13:05:24 crc kubenswrapper[4723]: I0309 13:05:24.011060 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" event={"ID":"03e74036-b752-4a9e-9cab-287e8161439b","Type":"ContainerStarted","Data":"16020baab5c7cec59dc4070f88f44dfa9fc85126ac40c5c027528fe97e303b76"} Mar 09 13:05:24 crc kubenswrapper[4723]: I0309 13:05:24.032637 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-8nl7t" podStartSLOduration=2.026939491 podStartE2EDuration="4.032603296s" podCreationTimestamp="2026-03-09 13:05:20 +0000 UTC" firstStartedPulling="2026-03-09 13:05:20.987482024 +0000 UTC m=+395.001949564" lastFinishedPulling="2026-03-09 13:05:22.993145829 +0000 UTC m=+397.007613369" observedRunningTime="2026-03-09 13:05:24.028035015 +0000 UTC m=+398.042502595" watchObservedRunningTime="2026-03-09 13:05:24.032603296 +0000 UTC m=+398.047070886" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.560566 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8rb7d"] Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.561953 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.564275 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.564312 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.564726 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-hr9x4" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.564966 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd"] Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.566035 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.569875 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.569952 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.570081 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-97szq" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.579701 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd"] Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.598619 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v"] Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.599636 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.603826 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.604253 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.604253 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.604271 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-8f8jz" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.623696 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-fqfbd\" (UID: \"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.623771 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx4gg\" (UniqueName: \"kubernetes.io/projected/6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78-kube-api-access-sx4gg\") pod \"openshift-state-metrics-566fddb674-fqfbd\" (UID: \"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.623792 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d8zp\" (UniqueName: \"kubernetes.io/projected/f1d223c3-c810-4695-89e6-a6d5c32a1622-kube-api-access-8d8zp\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.632993 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-root\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.633023 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1d223c3-c810-4695-89e6-a6d5c32a1622-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.633056 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f1d223c3-c810-4695-89e6-a6d5c32a1622-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 
13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.633075 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5krq6\" (UniqueName: \"kubernetes.io/projected/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-kube-api-access-5krq6\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.633102 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1d223c3-c810-4695-89e6-a6d5c32a1622-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.633135 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-fqfbd\" (UID: \"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.633161 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-node-exporter-wtmp\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.633192 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.633212 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-node-exporter-tls\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.633245 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-node-exporter-textfile\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.633262 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-metrics-client-ca\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.633293 4723 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-sys\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.633320 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f1d223c3-c810-4695-89e6-a6d5c32a1622-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.633367 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1d223c3-c810-4695-89e6-a6d5c32a1622-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.633402 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-fqfbd\" (UID: \"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.650649 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v"] Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.734874 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-fqfbd\" (UID: \"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.734930 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx4gg\" (UniqueName: \"kubernetes.io/projected/6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78-kube-api-access-sx4gg\") pod \"openshift-state-metrics-566fddb674-fqfbd\" (UID: \"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.734956 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d8zp\" (UniqueName: \"kubernetes.io/projected/f1d223c3-c810-4695-89e6-a6d5c32a1622-kube-api-access-8d8zp\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.735013 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-root\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " 
pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.735196 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-root\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.735280 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1d223c3-c810-4695-89e6-a6d5c32a1622-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.736098 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1d223c3-c810-4695-89e6-a6d5c32a1622-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.736155 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f1d223c3-c810-4695-89e6-a6d5c32a1622-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.736173 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5krq6\" (UniqueName: \"kubernetes.io/projected/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-kube-api-access-5krq6\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.736485 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f1d223c3-c810-4695-89e6-a6d5c32a1622-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.736680 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1d223c3-c810-4695-89e6-a6d5c32a1622-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.737181 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-fqfbd\" (UID: \"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.737272 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" 
(UniqueName: \"kubernetes.io/host-path/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-node-exporter-wtmp\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.737346 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-node-exporter-tls\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.737368 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.737401 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-node-exporter-textfile\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.737428 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-metrics-client-ca\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.737457 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-sys\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.737491 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f1d223c3-c810-4695-89e6-a6d5c32a1622-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.737533 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1d223c3-c810-4695-89e6-a6d5c32a1622-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.737572 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-fqfbd\" (UID: \"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" Mar 09 13:05:25 
crc kubenswrapper[4723]: I0309 13:05:25.737922 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-node-exporter-textfile\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.737981 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-sys\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.738621 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-node-exporter-wtmp\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.737982 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-fqfbd\" (UID: \"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.738933 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f1d223c3-c810-4695-89e6-a6d5c32a1622-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.739375 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-metrics-client-ca\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.743437 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.743524 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1d223c3-c810-4695-89e6-a6d5c32a1622-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.743626 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1d223c3-c810-4695-89e6-a6d5c32a1622-kube-state-metrics-kube-rbac-proxy-config\") pod 
\"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.747398 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-fqfbd\" (UID: \"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.748212 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-fqfbd\" (UID: \"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.752149 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-node-exporter-tls\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.760899 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5krq6\" (UniqueName: \"kubernetes.io/projected/dd4048da-7a95-4dd8-91c6-a4aa8ad13a13-kube-api-access-5krq6\") pod \"node-exporter-8rb7d\" (UID: \"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13\") " pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.761193 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d8zp\" (UniqueName: \"kubernetes.io/projected/f1d223c3-c810-4695-89e6-a6d5c32a1622-kube-api-access-8d8zp\") pod \"kube-state-metrics-777cb5bd5d-bm76v\" (UID: \"f1d223c3-c810-4695-89e6-a6d5c32a1622\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.761822 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx4gg\" (UniqueName: \"kubernetes.io/projected/6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78-kube-api-access-sx4gg\") pod \"openshift-state-metrics-566fddb674-fqfbd\" (UID: \"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.879295 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8rb7d" Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.886370 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" Mar 09 13:05:25 crc kubenswrapper[4723]: W0309 13:05:25.911109 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd4048da_7a95_4dd8_91c6_a4aa8ad13a13.slice/crio-bf5ac66981f961aee2a13966ed3b00733e6f69dcb26cb315444743a1cc45cda1 WatchSource:0}: Error finding container bf5ac66981f961aee2a13966ed3b00733e6f69dcb26cb315444743a1cc45cda1: Status 404 returned error can't find the container with id bf5ac66981f961aee2a13966ed3b00733e6f69dcb26cb315444743a1cc45cda1 Mar 09 13:05:25 crc kubenswrapper[4723]: I0309 13:05:25.914997 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.026027 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8rb7d" event={"ID":"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13","Type":"ContainerStarted","Data":"bf5ac66981f961aee2a13966ed3b00733e6f69dcb26cb315444743a1cc45cda1"} Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.325828 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd"] Mar 09 13:05:26 crc kubenswrapper[4723]: W0309 13:05:26.333955 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f3b28a7_093c_4d83_bf9e_aaf99a4c2e78.slice/crio-317d5504c9ced5bfde06122eb6658a903def6751959cc32ce07803704b0acf41 WatchSource:0}: Error finding container 317d5504c9ced5bfde06122eb6658a903def6751959cc32ce07803704b0acf41: Status 404 returned error can't find the container with id 317d5504c9ced5bfde06122eb6658a903def6751959cc32ce07803704b0acf41 Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.426007 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v"] Mar 09 13:05:26 crc kubenswrapper[4723]: W0309 13:05:26.431709 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d223c3_c810_4695_89e6_a6d5c32a1622.slice/crio-099823380a82b15c38a7d4294ca188a90aa0beeb592a4528925d3683fd6fd6b8 WatchSource:0}: Error finding container 099823380a82b15c38a7d4294ca188a90aa0beeb592a4528925d3683fd6fd6b8: Status 404 returned error can't find the container with id 099823380a82b15c38a7d4294ca188a90aa0beeb592a4528925d3683fd6fd6b8 Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.651518 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.653784 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.658268 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.658285 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.658306 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.658331 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.658492 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.658504 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.658648 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.659286 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-6qdf9" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.663295 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.677360 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.758608 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a789f366-e326-40e6-b705-6b24be86d982-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.758675 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.758785 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a789f366-e326-40e6-b705-6b24be86d982-config-out\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.758867 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.758895 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a789f366-e326-40e6-b705-6b24be86d982-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.758923 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-web-config\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.759038 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.759101 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfjjf\" (UniqueName: \"kubernetes.io/projected/a789f366-e326-40e6-b705-6b24be86d982-kube-api-access-lfjjf\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.759132 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.759153 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-config-volume\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.759175 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a789f366-e326-40e6-b705-6b24be86d982-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.759208 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a789f366-e326-40e6-b705-6b24be86d982-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.860427 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfjjf\" 
(UniqueName: \"kubernetes.io/projected/a789f366-e326-40e6-b705-6b24be86d982-kube-api-access-lfjjf\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.860518 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.860544 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-config-volume\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.860595 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a789f366-e326-40e6-b705-6b24be86d982-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.860684 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a789f366-e326-40e6-b705-6b24be86d982-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.861968 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a789f366-e326-40e6-b705-6b24be86d982-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.861897 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a789f366-e326-40e6-b705-6b24be86d982-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.861588 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a789f366-e326-40e6-b705-6b24be86d982-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.862299 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a789f366-e326-40e6-b705-6b24be86d982-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.862448 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.862873 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a789f366-e326-40e6-b705-6b24be86d982-config-out\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.862939 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.862969 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a789f366-e326-40e6-b705-6b24be86d982-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.863047 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-web-config\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.863103 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: E0309 13:05:26.864616 4723 secret.go:188] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Mar 09 13:05:26 crc kubenswrapper[4723]: E0309 13:05:26.864701 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-main-tls podName:a789f366-e326-40e6-b705-6b24be86d982 nodeName:}" failed. No retries permitted until 2026-03-09 13:05:27.364681494 +0000 UTC m=+401.379149094 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "a789f366-e326-40e6-b705-6b24be86d982") : secret "alertmanager-main-tls" not found Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.868686 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.868901 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.869355 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.869598 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-config-volume\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.870180 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a789f366-e326-40e6-b705-6b24be86d982-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.880233 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a789f366-e326-40e6-b705-6b24be86d982-config-out\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.880344 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-web-config\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:26 crc kubenswrapper[4723]: I0309 13:05:26.893023 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfjjf\" (UniqueName: \"kubernetes.io/projected/a789f366-e326-40e6-b705-6b24be86d982-kube-api-access-lfjjf\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.031944 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" event={"ID":"f1d223c3-c810-4695-89e6-a6d5c32a1622","Type":"ContainerStarted","Data":"099823380a82b15c38a7d4294ca188a90aa0beeb592a4528925d3683fd6fd6b8"} Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.033496 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" event={"ID":"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78","Type":"ContainerStarted","Data":"bd111ff91ddf552e7fa5b5464bebeb46b0ac444ef48c477f6cfe9e42dc864c59"} Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.033520 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" event={"ID":"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78","Type":"ContainerStarted","Data":"5f9c978311b3aff1d80d3beba19267528c4f3b0a067561b7b099deb76641e361"} Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.033529 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" event={"ID":"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78","Type":"ContainerStarted","Data":"317d5504c9ced5bfde06122eb6658a903def6751959cc32ce07803704b0acf41"} Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.369548 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:27 crc kubenswrapper[4723]: E0309 13:05:27.369712 4723 secret.go:188] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Mar 09 13:05:27 crc kubenswrapper[4723]: E0309 13:05:27.369927 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-main-tls podName:a789f366-e326-40e6-b705-6b24be86d982 nodeName:}" failed. No retries permitted until 2026-03-09 13:05:28.369913402 +0000 UTC m=+402.384380942 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "a789f366-e326-40e6-b705-6b24be86d982") : secret "alertmanager-main-tls" not found Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.665047 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-f994cb665-42jsl"] Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.667512 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.672195 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.672447 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.673589 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.673982 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-lz8hs" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.674483 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.674538 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-9aht4i07v7jqt" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.674794 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.687682 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-f994cb665-42jsl"] Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.792083 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-tls\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.792382 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.792725 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/338186cb-4546-4740-bba3-c1c430d8aacc-metrics-client-ca\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.793011 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-grpc-tls\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.793310 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" 
(UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.793494 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl4gf\" (UniqueName: \"kubernetes.io/projected/338186cb-4546-4740-bba3-c1c430d8aacc-kube-api-access-bl4gf\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.793627 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.793790 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.896442 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.898281 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.899723 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-tls\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.899763 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.899873 4723 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/338186cb-4546-4740-bba3-c1c430d8aacc-metrics-client-ca\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.900242 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-grpc-tls\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.900426 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.900461 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl4gf\" (UniqueName: \"kubernetes.io/projected/338186cb-4546-4740-bba3-c1c430d8aacc-kube-api-access-bl4gf\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.901179 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/338186cb-4546-4740-bba3-c1c430d8aacc-metrics-client-ca\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.903765 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.903765 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.904438 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-tls\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.908957 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.911598 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.922960 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl4gf\" (UniqueName: \"kubernetes.io/projected/338186cb-4546-4740-bba3-c1c430d8aacc-kube-api-access-bl4gf\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:27 crc kubenswrapper[4723]: I0309 13:05:27.932141 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/338186cb-4546-4740-bba3-c1c430d8aacc-secret-grpc-tls\") pod \"thanos-querier-f994cb665-42jsl\" (UID: \"338186cb-4546-4740-bba3-c1c430d8aacc\") " pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:28 crc kubenswrapper[4723]: I0309 13:05:28.003999 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" Mar 09 13:05:28 crc kubenswrapper[4723]: I0309 13:05:28.041772 4723 generic.go:334] "Generic (PLEG): container finished" podID="dd4048da-7a95-4dd8-91c6-a4aa8ad13a13" containerID="9d43f4ad98d8ed2d677b63da06d87c7b3845dc28a893678d00066877662e8aa7" exitCode=0 Mar 09 13:05:28 crc kubenswrapper[4723]: I0309 13:05:28.041830 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8rb7d" event={"ID":"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13","Type":"ContainerDied","Data":"9d43f4ad98d8ed2d677b63da06d87c7b3845dc28a893678d00066877662e8aa7"} Mar 09 13:05:28 crc kubenswrapper[4723]: I0309 13:05:28.410652 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:28 crc kubenswrapper[4723]: I0309 13:05:28.413644 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a789f366-e326-40e6-b705-6b24be86d982-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a789f366-e326-40e6-b705-6b24be86d982\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:28 crc kubenswrapper[4723]: I0309 13:05:28.472554 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 09 13:05:29 crc kubenswrapper[4723]: I0309 13:05:29.049831 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" event={"ID":"f1d223c3-c810-4695-89e6-a6d5c32a1622","Type":"ContainerStarted","Data":"83e22f5321f52271086ac9e1ee14db47020bf92a444cd5db047bb9fdd59edf09"} Mar 09 13:05:29 crc kubenswrapper[4723]: I0309 13:05:29.050390 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" event={"ID":"f1d223c3-c810-4695-89e6-a6d5c32a1622","Type":"ContainerStarted","Data":"5094578ef448d67a7fcac3b3fc939cb557060f377328438456b03d3cf2108a24"} Mar 09 13:05:29 crc kubenswrapper[4723]: I0309 13:05:29.058122 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8rb7d" event={"ID":"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13","Type":"ContainerStarted","Data":"39dad3d3c9cf7b9e269d50a0114e2151d9c7cde5fefa537cf32a5dd9d341fc1c"} Mar 09 13:05:29 crc kubenswrapper[4723]: I0309 13:05:29.058160 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8rb7d" event={"ID":"dd4048da-7a95-4dd8-91c6-a4aa8ad13a13","Type":"ContainerStarted","Data":"92f6effce391dc858bff7428661cc177148116a199002f1ac2e6746444db3543"} Mar 09 13:05:29 crc kubenswrapper[4723]: I0309 13:05:29.061180 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" event={"ID":"6f3b28a7-093c-4d83-bf9e-aaf99a4c2e78","Type":"ContainerStarted","Data":"92ae00cf30083d2886721eb98a49f425340523eac7633cdec635ed616b850136"} Mar 09 13:05:29 crc kubenswrapper[4723]: I0309 13:05:29.074100 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8rb7d" podStartSLOduration=2.8009262230000003 podStartE2EDuration="4.074082977s" podCreationTimestamp="2026-03-09 13:05:25 +0000 UTC" firstStartedPulling="2026-03-09 13:05:25.912607621 +0000 UTC m=+399.927075181" lastFinishedPulling="2026-03-09 13:05:27.185764395 +0000 UTC m=+401.200231935" observedRunningTime="2026-03-09 13:05:29.072956635 +0000 UTC m=+403.087424175" watchObservedRunningTime="2026-03-09 13:05:29.074082977 +0000 UTC m=+403.088550517" Mar 09 13:05:29 crc kubenswrapper[4723]: I0309 13:05:29.081791 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-f994cb665-42jsl"] Mar 09 13:05:29 crc kubenswrapper[4723]: W0309 13:05:29.096408 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod338186cb_4546_4740_bba3_c1c430d8aacc.slice/crio-82669eb80522958ce2276787c662a7c64d4af66dbad0c4513cf57bc28251df34 WatchSource:0}: Error finding container 82669eb80522958ce2276787c662a7c64d4af66dbad0c4513cf57bc28251df34: Status 404 returned error can't find the container with id 82669eb80522958ce2276787c662a7c64d4af66dbad0c4513cf57bc28251df34 Mar 09 13:05:29 crc kubenswrapper[4723]: I0309 13:05:29.102459 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-fqfbd" podStartSLOduration=2.095714504 podStartE2EDuration="4.10243966s" podCreationTimestamp="2026-03-09 13:05:25 +0000 UTC" firstStartedPulling="2026-03-09 13:05:26.594561733 +0000 UTC m=+400.609029273" lastFinishedPulling="2026-03-09 13:05:28.601286889 +0000 UTC m=+402.615754429" 
observedRunningTime="2026-03-09 13:05:29.099958449 +0000 UTC m=+403.114425989" watchObservedRunningTime="2026-03-09 13:05:29.10243966 +0000 UTC m=+403.116907200" Mar 09 13:05:29 crc kubenswrapper[4723]: I0309 13:05:29.224298 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 09 13:05:29 crc kubenswrapper[4723]: W0309 13:05:29.239969 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda789f366_e326_40e6_b705_6b24be86d982.slice/crio-ebcb8ba619ce228c217f00cdd2f6f29c06609844a5122c0c78fb9f71983df5a3 WatchSource:0}: Error finding container ebcb8ba619ce228c217f00cdd2f6f29c06609844a5122c0c78fb9f71983df5a3: Status 404 returned error can't find the container with id ebcb8ba619ce228c217f00cdd2f6f29c06609844a5122c0c78fb9f71983df5a3 Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.069176 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" event={"ID":"338186cb-4546-4740-bba3-c1c430d8aacc","Type":"ContainerStarted","Data":"82669eb80522958ce2276787c662a7c64d4af66dbad0c4513cf57bc28251df34"} Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.072309 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" event={"ID":"f1d223c3-c810-4695-89e6-a6d5c32a1622","Type":"ContainerStarted","Data":"53b6b172053472d81ea86068535f122d0c1e2ec3c855370ed21e52f901d5cef6"} Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.074083 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a789f366-e326-40e6-b705-6b24be86d982","Type":"ContainerStarted","Data":"ebcb8ba619ce228c217f00cdd2f6f29c06609844a5122c0c78fb9f71983df5a3"} Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.099830 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bm76v" podStartSLOduration=2.933265635 podStartE2EDuration="5.09980269s" podCreationTimestamp="2026-03-09 13:05:25 +0000 UTC" firstStartedPulling="2026-03-09 13:05:26.434028183 +0000 UTC m=+400.448495723" lastFinishedPulling="2026-03-09 13:05:28.600565238 +0000 UTC m=+402.615032778" observedRunningTime="2026-03-09 13:05:30.092090079 +0000 UTC m=+404.106557639" watchObservedRunningTime="2026-03-09 13:05:30.09980269 +0000 UTC m=+404.114270240" Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.367481 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-66b5df45c6-zvqgd"] Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.368465 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66b5df45c6-zvqgd" Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.379710 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66b5df45c6-zvqgd"] Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.448951 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-serving-cert\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd" Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.448982 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-oauth-config\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd" Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.449016 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-oauth-serving-cert\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd" Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.449040 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzwmz\" (UniqueName: \"kubernetes.io/projected/664f7870-22a7-4f17-b89f-5c9a9616a2d1-kube-api-access-pzwmz\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd" Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.449056 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-config\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd" Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.449081 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-trusted-ca-bundle\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd" Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.449108 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-service-ca\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd" Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.550966 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-serving-cert\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd" Mar 09 13:05:30 crc 
kubenswrapper[4723]: I0309 13:05:30.551320 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-oauth-config\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.551375 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-oauth-serving-cert\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.551407 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzwmz\" (UniqueName: \"kubernetes.io/projected/664f7870-22a7-4f17-b89f-5c9a9616a2d1-kube-api-access-pzwmz\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.551426 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-config\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.551473 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-trusted-ca-bundle\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.551505 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-service-ca\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.552895 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-oauth-serving-cert\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.553564 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-service-ca\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.553702 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-config\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.554630 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-trusted-ca-bundle\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.569282 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzwmz\" (UniqueName: \"kubernetes.io/projected/664f7870-22a7-4f17-b89f-5c9a9616a2d1-kube-api-access-pzwmz\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.574875 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-serving-cert\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.575832 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-oauth-config\") pod \"console-66b5df45c6-zvqgd\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") " pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.705747 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.955911 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-57fc8677f7-9hvt8"]
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.956924 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.959225 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-wkcp7"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.959265 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.959488 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.960590 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.960848 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.961005 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-ai8kqhul412rg"
Mar 09 13:05:30 crc kubenswrapper[4723]: I0309 13:05:30.963217 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57fc8677f7-9hvt8"]
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.059371 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.059436 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-secret-metrics-server-tls\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.059470 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-secret-metrics-client-certs\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.059489 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-audit-log\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.059590 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-client-ca-bundle\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.059672 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj4zb\" (UniqueName: \"kubernetes.io/projected/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-kube-api-access-rj4zb\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.059771 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-metrics-server-audit-profiles\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.161410 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-metrics-server-audit-profiles\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.161503 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.161553 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-secret-metrics-server-tls\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.161599 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-secret-metrics-client-certs\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.161621 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-audit-log\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.161645 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-client-ca-bundle\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.163846 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-audit-log\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.164016 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj4zb\" (UniqueName: \"kubernetes.io/projected/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-kube-api-access-rj4zb\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.164731 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.165229 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-metrics-server-audit-profiles\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.167098 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-client-ca-bundle\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.167977 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-secret-metrics-client-certs\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.170446 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-secret-metrics-server-tls\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.181366 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj4zb\" (UniqueName: \"kubernetes.io/projected/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-kube-api-access-rj4zb\") pod \"metrics-server-57fc8677f7-9hvt8\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") " pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.273732 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.900081 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.902254 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.904291 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.904583 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.904747 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-26t6n"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.904909 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.905514 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.905802 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.906016 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.907469 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.907573 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.908026 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.908335 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-8ctba7jkqmiui"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.912284 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.929501 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.949330 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.996396 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5b572550-466a-4fae-9334-0a471e7c39be-config-out\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.996472 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.996502 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.999528 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.999567 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5b572550-466a-4fae-9334-0a471e7c39be-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.999634 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.999736 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.999905 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4lxm\" (UniqueName: \"kubernetes.io/projected/5b572550-466a-4fae-9334-0a471e7c39be-kube-api-access-j4lxm\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.999957 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:31 crc kubenswrapper[4723]: I0309 13:05:31.999980 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.000009 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.000034 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.000078 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.000113 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.000156 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-config\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.000214 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5b572550-466a-4fae-9334-0a471e7c39be-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.000242 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-web-config\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.000266 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101191 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5b572550-466a-4fae-9334-0a471e7c39be-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101461 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-web-config\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101494 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101521 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5b572550-466a-4fae-9334-0a471e7c39be-config-out\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101536 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101560 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101578 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101597 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5b572550-466a-4fae-9334-0a471e7c39be-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101627 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101654 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101677 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4lxm\" (UniqueName: \"kubernetes.io/projected/5b572550-466a-4fae-9334-0a471e7c39be-kube-api-access-j4lxm\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101692 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101710 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101726 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101742 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101766 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101783 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.101808 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-config\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.102205 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5b572550-466a-4fae-9334-0a471e7c39be-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.104157 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.105304 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.105543 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.107702 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.108806 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-config\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.109516 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.110187 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.110307 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.110422 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.110794 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.110893 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5b572550-466a-4fae-9334-0a471e7c39be-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.116837 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5b572550-466a-4fae-9334-0a471e7c39be-config-out\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.117016 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-web-config\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.117057 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5b572550-466a-4fae-9334-0a471e7c39be-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.124317 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4lxm\" (UniqueName: \"kubernetes.io/projected/5b572550-466a-4fae-9334-0a471e7c39be-kube-api-access-j4lxm\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.125622 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.122912 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5b572550-466a-4fae-9334-0a471e7c39be-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5b572550-466a-4fae-9334-0a471e7c39be\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.334604 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-66b86d466-5tr4x"]
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.335573 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.340484 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.341411 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-66b86d466-5tr4x"]
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.341433 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.377108 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.405339 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66b5df45c6-zvqgd"]
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.427711 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57fc8677f7-9hvt8"]
Mar 09 13:05:32 crc kubenswrapper[4723]: W0309 13:05:32.435985 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0a0e755_4205_4573_b59a_07b4b6ec7ce2.slice/crio-d5c99913928ca9034f7308f97f4a22c938e681757780ef4ab4e92b0d3871fc38 WatchSource:0}: Error finding container d5c99913928ca9034f7308f97f4a22c938e681757780ef4ab4e92b0d3871fc38: Status 404 returned error can't find the container with id d5c99913928ca9034f7308f97f4a22c938e681757780ef4ab4e92b0d3871fc38
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.510518 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/289dc72b-221e-415a-8c97-3889de9ceaed-monitoring-plugin-cert\") pod \"monitoring-plugin-66b86d466-5tr4x\" (UID: \"289dc72b-221e-415a-8c97-3889de9ceaed\") " pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.611487 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/289dc72b-221e-415a-8c97-3889de9ceaed-monitoring-plugin-cert\") pod \"monitoring-plugin-66b86d466-5tr4x\" (UID: \"289dc72b-221e-415a-8c97-3889de9ceaed\") " pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.617609 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/289dc72b-221e-415a-8c97-3889de9ceaed-monitoring-plugin-cert\") pod \"monitoring-plugin-66b86d466-5tr4x\" (UID: \"289dc72b-221e-415a-8c97-3889de9ceaed\") " pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.657355 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x"
Mar 09 13:05:32 crc kubenswrapper[4723]: I0309 13:05:32.780549 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 09 13:05:33 crc kubenswrapper[4723]: I0309 13:05:33.105604 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66b5df45c6-zvqgd" event={"ID":"664f7870-22a7-4f17-b89f-5c9a9616a2d1","Type":"ContainerStarted","Data":"9d62557bc9a2f927d1152caaaae4978b5850a3c8e3f25a7bfbfc9e2006e3687c"}
Mar 09 13:05:33 crc kubenswrapper[4723]: I0309 13:05:33.105737 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66b5df45c6-zvqgd" event={"ID":"664f7870-22a7-4f17-b89f-5c9a9616a2d1","Type":"ContainerStarted","Data":"90f5d3fd6edc890142578d76c4bf7bf5eb7a4354ad3df316889c6d22a0a63e53"}
Mar 09 13:05:33 crc kubenswrapper[4723]: I0309 13:05:33.106992 4723 generic.go:334] "Generic (PLEG): container finished" podID="a789f366-e326-40e6-b705-6b24be86d982" containerID="8b47370fae7f236bc08543632a156c32434b624638d29b5fde892174a6158aa5" exitCode=0
Mar 09 13:05:33 crc kubenswrapper[4723]: I0309 13:05:33.107153 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a789f366-e326-40e6-b705-6b24be86d982","Type":"ContainerDied","Data":"8b47370fae7f236bc08543632a156c32434b624638d29b5fde892174a6158aa5"}
Mar 09 13:05:33 crc kubenswrapper[4723]: I0309 13:05:33.110477 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" event={"ID":"338186cb-4546-4740-bba3-c1c430d8aacc","Type":"ContainerStarted","Data":"6765ee3cdf7d95d6b681b02122dc72dbb6ccb9b1512f7e053bee0cad7265bfa6"}
Mar 09 13:05:33 crc kubenswrapper[4723]: I0309 13:05:33.110517 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" event={"ID":"338186cb-4546-4740-bba3-c1c430d8aacc","Type":"ContainerStarted","Data":"97f0f82b822e4938b33cbf23b35f3124b8424e0fe263901c4121ed5f1b6862fe"}
Mar 09 13:05:33 crc kubenswrapper[4723]: I0309 13:05:33.110527 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" event={"ID":"338186cb-4546-4740-bba3-c1c430d8aacc","Type":"ContainerStarted","Data":"95c2df1828ddd1c90e3b10f4188917e62c3a06e0394781c4f964a02e1adb9502"}
Mar 09 13:05:33 crc kubenswrapper[4723]: I0309 13:05:33.111558 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8" event={"ID":"d0a0e755-4205-4573-b59a-07b4b6ec7ce2","Type":"ContainerStarted","Data":"d5c99913928ca9034f7308f97f4a22c938e681757780ef4ab4e92b0d3871fc38"}
Mar 09 13:05:33 crc kubenswrapper[4723]: I0309 13:05:33.114739 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-66b86d466-5tr4x"]
Mar 09 13:05:33 crc kubenswrapper[4723]: I0309 13:05:33.117607 4723 generic.go:334] "Generic (PLEG): container finished" podID="5b572550-466a-4fae-9334-0a471e7c39be" containerID="d41eef78edd7f68076d13710fc37327025767bd29e28e48fe1294db2dfe29319" exitCode=0
Mar 09 13:05:33 crc kubenswrapper[4723]: I0309 13:05:33.117641 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5b572550-466a-4fae-9334-0a471e7c39be","Type":"ContainerDied","Data":"d41eef78edd7f68076d13710fc37327025767bd29e28e48fe1294db2dfe29319"}
Mar 09 13:05:33 crc kubenswrapper[4723]: I0309 13:05:33.117665 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5b572550-466a-4fae-9334-0a471e7c39be","Type":"ContainerStarted","Data":"485302148ff4c97d02136cae767712cbc4c97c926f0bb06db812a9ff83b1e3d9"}
Mar 09 13:05:33 crc kubenswrapper[4723]: I0309 13:05:33.143336 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66b5df45c6-zvqgd" podStartSLOduration=3.143316775 podStartE2EDuration="3.143316775s" podCreationTimestamp="2026-03-09 13:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:05:33.126558195 +0000 UTC m=+407.141025775" watchObservedRunningTime="2026-03-09 13:05:33.143316775 +0000 UTC m=+407.157784315"
Mar 09 13:05:34 crc kubenswrapper[4723]: I0309 13:05:34.129597 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" event={"ID":"338186cb-4546-4740-bba3-c1c430d8aacc","Type":"ContainerStarted","Data":"07f7875f5ac8ddcee149d0d399d41c14d561306eb49ce09ca334c78d9e839e71"}
Mar 09 13:05:34 crc kubenswrapper[4723]: I0309 13:05:34.129976 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" event={"ID":"338186cb-4546-4740-bba3-c1c430d8aacc","Type":"ContainerStarted","Data":"7b5363c940c0fe76c8244704e3187dc9dd30fcffa10e6aeefd26f432bbaf5473"}
Mar 09 13:05:34 crc kubenswrapper[4723]: I0309 13:05:34.132573 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x" event={"ID":"289dc72b-221e-415a-8c97-3889de9ceaed","Type":"ContainerStarted","Data":"235bb2257078cebbbbfedf258f2c98a255c93fbae9553a08c024720701081b0c"}
Mar 09 13:05:36 crc kubenswrapper[4723]: I0309 13:05:36.443589 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-nm8sh"
Mar 09 13:05:36 crc kubenswrapper[4723]: I0309 13:05:36.500433 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s6gh6"]
Mar 09 13:05:37 crc kubenswrapper[4723]: I0309 13:05:37.154233 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x" event={"ID":"289dc72b-221e-415a-8c97-3889de9ceaed","Type":"ContainerStarted","Data":"6bb6841f0916b0f3fdcf9a1f1cd5fccfeb6a12c7ee7449a4f8a43b773e79f08f"}
Mar 09 13:05:37 crc kubenswrapper[4723]: I0309 13:05:37.155120 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x"
Mar 09 13:05:37 crc kubenswrapper[4723]: I0309 13:05:37.158634 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5b572550-466a-4fae-9334-0a471e7c39be","Type":"ContainerStarted","Data":"0edba4139858709e0da1d11439321624c64b4c7a6c75f8ea7200acd3abea7816"}
Mar 09 13:05:37 crc kubenswrapper[4723]: I0309 13:05:37.158667 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5b572550-466a-4fae-9334-0a471e7c39be","Type":"ContainerStarted","Data":"a6bf27b940cd5724d120f33f060e88be7a50712b5553a845209c4814a50deb19"}
Mar 09 13:05:37 crc kubenswrapper[4723]: I0309 13:05:37.159768 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8" event={"ID":"d0a0e755-4205-4573-b59a-07b4b6ec7ce2","Type":"ContainerStarted","Data":"ffc0e2c1022a19fb78c8feb4c2d3e8f0516adc4451cbe64a93e86f2aa7f9af14"}
Mar 09 13:05:37 crc kubenswrapper[4723]: I0309 13:05:37.164201 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x"
Mar 09 13:05:37 crc kubenswrapper[4723]: I0309 13:05:37.166455 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a789f366-e326-40e6-b705-6b24be86d982","Type":"ContainerStarted","Data":"a097157d8b0850d6b55d52dfebdc88b25ec9d8701e0220f1485314652d6bef89"}
Mar 09 13:05:37 crc kubenswrapper[4723]: I0309 13:05:37.166481 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a789f366-e326-40e6-b705-6b24be86d982","Type":"ContainerStarted","Data":"8a044aa0a665c2042e0afd4475c065272ddaf906c519dce3be90f21ea7491bed"}
Mar 09 13:05:37 crc kubenswrapper[4723]: I0309 13:05:37.174420 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" event={"ID":"338186cb-4546-4740-bba3-c1c430d8aacc","Type":"ContainerStarted","Data":"271a61e630431a902f2874eaada16f8234bd708c11208a3ecca5a7abc6271cd6"}
Mar 09 13:05:37 crc kubenswrapper[4723]: I0309 13:05:37.175545 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl"
Mar 09 13:05:37 crc kubenswrapper[4723]: I0309 13:05:37.179530 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x" podStartSLOduration=2.8913187860000003 podStartE2EDuration="5.179511337s" podCreationTimestamp="2026-03-09 13:05:32 +0000 UTC" firstStartedPulling="2026-03-09 13:05:33.11802499 +0000 UTC m=+407.132492540" lastFinishedPulling="2026-03-09 13:05:35.406217521 +0000 UTC m=+409.420685091" observedRunningTime="2026-03-09 13:05:37.173111583 +0000 UTC m=+411.187579123" watchObservedRunningTime="2026-03-09 13:05:37.179511337 +0000 UTC m=+411.193978877"
Mar 09 13:05:37 crc kubenswrapper[4723]: I0309 13:05:37.192311 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl"
Mar 09 13:05:37 crc kubenswrapper[4723]: I0309 13:05:37.193695 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8" podStartSLOduration=4.236603075 podStartE2EDuration="7.193679993s" podCreationTimestamp="2026-03-09 13:05:30 +0000 UTC" firstStartedPulling="2026-03-09 13:05:32.437672515 +0000 UTC m=+406.452140055" lastFinishedPulling="2026-03-09 13:05:35.394749443 +0000 UTC m=+409.409216973" observedRunningTime="2026-03-09 13:05:37.192036346 +0000 UTC m=+411.206503886" watchObservedRunningTime="2026-03-09 13:05:37.193679993 +0000 UTC m=+411.208147533"
Mar 09 13:05:37 crc kubenswrapper[4723]: I0309 13:05:37.266663 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" podStartSLOduration=5.604138484 podStartE2EDuration="10.266641903s" podCreationTimestamp="2026-03-09 13:05:27 +0000 UTC" firstStartedPulling="2026-03-09 13:05:29.100215256 +0000 UTC m=+403.114682796" lastFinishedPulling="2026-03-09 13:05:33.762718675 +0000 UTC m=+407.777186215" observedRunningTime="2026-03-09 13:05:37.25221536 +0000 UTC m=+411.266682920" watchObservedRunningTime="2026-03-09 13:05:37.266641903 +0000 UTC m=+411.281109443"
Mar 09 13:05:38 crc kubenswrapper[4723]: I0309 13:05:38.185663 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a789f366-e326-40e6-b705-6b24be86d982","Type":"ContainerStarted","Data":"69e27b4b93399fe3f419032c187ec91c9e367aa613d3b16077cf36b52fa3bc08"}
Mar 09 13:05:38 crc kubenswrapper[4723]: I0309 13:05:38.185987 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a789f366-e326-40e6-b705-6b24be86d982","Type":"ContainerStarted","Data":"b5b4ce2d4c05dee4e37c9c92788e22acc27c52e3ea0135efa69010c066d97624"}
Mar 09 13:05:38 crc kubenswrapper[4723]: I0309 13:05:38.185998 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a789f366-e326-40e6-b705-6b24be86d982","Type":"ContainerStarted","Data":"d0b63f507047ef7a24ef2d50b57db8e0bfd2f8a14e7581dc69b2bc6e3691fe28"}
Mar 09 13:05:38 crc kubenswrapper[4723]: I0309 13:05:38.186009 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a789f366-e326-40e6-b705-6b24be86d982","Type":"ContainerStarted","Data":"cdef90db0965e6f138a7e8faa377236f5fc058305e88fb37c2766c4ee36eb94f"}
Mar 09 13:05:38 crc kubenswrapper[4723]: I0309 13:05:38.191308 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5b572550-466a-4fae-9334-0a471e7c39be","Type":"ContainerStarted","Data":"fd6684b844d2ba42592c6d544ea28b4f812f4d37cff2b31c7424acc3e5d4e29d"}
Mar 09 13:05:38 crc kubenswrapper[4723]: I0309 13:05:38.191342 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5b572550-466a-4fae-9334-0a471e7c39be","Type":"ContainerStarted","Data":"cc574463e96f9d92ceddc39c337618f5c72f58a7b716f2ff74a846113f5443ed"}
Mar 09 13:05:38 crc kubenswrapper[4723]: I0309 13:05:38.191352 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5b572550-466a-4fae-9334-0a471e7c39be","Type":"ContainerStarted","Data":"9c488a3cb693bd4a97b136c5cd01525ece85ed2ac37ca93ad24b172251c786f3"}
Mar 09 13:05:38 crc kubenswrapper[4723]: I0309 13:05:38.191362 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5b572550-466a-4fae-9334-0a471e7c39be","Type":"ContainerStarted","Data":"44bad1d6a53ac574d8f8ef90dc8ded2c97354f0cc435ee607261f4b04358396d"}
Mar 09 13:05:38 crc kubenswrapper[4723]: I0309 13:05:38.220949 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=6.052327582 podStartE2EDuration="12.22092661s" podCreationTimestamp="2026-03-09 13:05:26 +0000 UTC" firstStartedPulling="2026-03-09 13:05:29.241588037 +0000 UTC m=+403.256055577" lastFinishedPulling="2026-03-09 13:05:35.410187025 +0000 UTC m=+409.424654605" observedRunningTime="2026-03-09 13:05:38.216717879 +0000 UTC m=+412.231185459" watchObservedRunningTime="2026-03-09 13:05:38.22092661 +0000 UTC m=+412.235394150"
Mar 09 13:05:40 crc kubenswrapper[4723]: I0309 13:05:40.707204 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:40 crc kubenswrapper[4723]: I0309 13:05:40.707952 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:40 crc kubenswrapper[4723]: I0309 13:05:40.716709 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:40 crc kubenswrapper[4723]: I0309 13:05:40.757708 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=6.181772031 podStartE2EDuration="9.757685263s" podCreationTimestamp="2026-03-09 13:05:31 +0000 UTC" firstStartedPulling="2026-03-09 13:05:33.126024509 +0000 UTC m=+407.140492089" lastFinishedPulling="2026-03-09 13:05:36.701937791 +0000 UTC m=+410.716405321" observedRunningTime="2026-03-09 13:05:38.253945896 +0000 UTC m=+412.268413436" watchObservedRunningTime="2026-03-09 13:05:40.757685263 +0000 UTC m=+414.772152823"
Mar 09 13:05:41 crc kubenswrapper[4723]: I0309 13:05:41.222121 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:05:41 crc kubenswrapper[4723]: I0309 13:05:41.287223 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-6gtjl"]
Mar 09 13:05:42 crc kubenswrapper[4723]: I0309 13:05:42.377530 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 13:05:51 crc kubenswrapper[4723]: I0309 13:05:51.274116 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:05:51 crc kubenswrapper[4723]: I0309 13:05:51.274456 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:06:00 crc kubenswrapper[4723]: I0309 13:06:00.129909 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551026-vm6hr"]
Mar 09 13:06:00 crc kubenswrapper[4723]: I0309 13:06:00.134367 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551026-vm6hr"
Mar 09 13:06:00 crc kubenswrapper[4723]: I0309 13:06:00.134569 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551026-vm6hr"]
Mar 09 13:06:00 crc kubenswrapper[4723]: I0309 13:06:00.136563 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 13:06:00 crc kubenswrapper[4723]: I0309 13:06:00.137224 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:06:00 crc kubenswrapper[4723]: I0309 13:06:00.137737 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:06:00 crc kubenswrapper[4723]: I0309 13:06:00.240570 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26fvg\" (UniqueName: \"kubernetes.io/projected/77132bc2-3cff-42ca-b132-bca14aa41733-kube-api-access-26fvg\") pod \"auto-csr-approver-29551026-vm6hr\" (UID: \"77132bc2-3cff-42ca-b132-bca14aa41733\") " pod="openshift-infra/auto-csr-approver-29551026-vm6hr"
Mar 09 13:06:00 crc kubenswrapper[4723]: I0309 13:06:00.342080 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26fvg\" (UniqueName: \"kubernetes.io/projected/77132bc2-3cff-42ca-b132-bca14aa41733-kube-api-access-26fvg\") pod \"auto-csr-approver-29551026-vm6hr\" (UID: \"77132bc2-3cff-42ca-b132-bca14aa41733\") " pod="openshift-infra/auto-csr-approver-29551026-vm6hr"
Mar 09 13:06:00 crc kubenswrapper[4723]: I0309 13:06:00.377477 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26fvg\" (UniqueName: \"kubernetes.io/projected/77132bc2-3cff-42ca-b132-bca14aa41733-kube-api-access-26fvg\") pod \"auto-csr-approver-29551026-vm6hr\" (UID: \"77132bc2-3cff-42ca-b132-bca14aa41733\") " pod="openshift-infra/auto-csr-approver-29551026-vm6hr"
Mar 09 13:06:00 crc kubenswrapper[4723]: I0309 13:06:00.462573 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551026-vm6hr"
Mar 09 13:06:00 crc kubenswrapper[4723]: W0309 13:06:00.892875 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77132bc2_3cff_42ca_b132_bca14aa41733.slice/crio-e9a4248f2bea06cb309eda1653cdbb764683b4d7fa6182365be655edeb063c56 WatchSource:0}: Error finding container e9a4248f2bea06cb309eda1653cdbb764683b4d7fa6182365be655edeb063c56: Status 404 returned error can't find the container with id e9a4248f2bea06cb309eda1653cdbb764683b4d7fa6182365be655edeb063c56
Mar 09 13:06:00 crc kubenswrapper[4723]: I0309 13:06:00.894763 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551026-vm6hr"]
Mar 09 13:06:01 crc kubenswrapper[4723]: I0309 13:06:01.370164 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551026-vm6hr" event={"ID":"77132bc2-3cff-42ca-b132-bca14aa41733","Type":"ContainerStarted","Data":"e9a4248f2bea06cb309eda1653cdbb764683b4d7fa6182365be655edeb063c56"}
Mar 09 13:06:01 crc kubenswrapper[4723]: I0309 13:06:01.546145 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" podUID="6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef" containerName="registry" containerID="cri-o://6997df8f497c4ff917189e52e29cc2842ade8e6e2ca3ac5caf201f9cffa6296d" gracePeriod=30
Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.079783 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6"
Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.168375 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-trusted-ca\") pod \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") "
Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.168422 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-installation-pull-secrets\") pod \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") "
Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.168450 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-ca-trust-extracted\") pod \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") "
Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.168512 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-registry-tls\") pod \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") "
Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.168536 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-bound-sa-token\") pod \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") "
Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.168641 4723
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.168696 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd7gm\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-kube-api-access-dd7gm\") pod \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.168749 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-registry-certificates\") pod \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\" (UID: \"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef\") " Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.170803 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.170908 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.176974 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.179067 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.180878 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-kube-api-access-dd7gm" (OuterVolumeSpecName: "kube-api-access-dd7gm") pod "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef"). InnerVolumeSpecName "kube-api-access-dd7gm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.181773 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.186925 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.203306 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef" (UID: "6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.270217 4723 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.270257 4723 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.270269 4723 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.270277 4723 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.270285 4723 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.270294 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd7gm\" (UniqueName: \"kubernetes.io/projected/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-kube-api-access-dd7gm\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.270302 4723 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.380178 4723 generic.go:334] "Generic (PLEG): container finished" podID="77132bc2-3cff-42ca-b132-bca14aa41733" 
containerID="4e21a6980b85e71f90d18fd1058d71f241f1bb4828c79c7ce64e9be6f21a0add" exitCode=0 Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.380434 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551026-vm6hr" event={"ID":"77132bc2-3cff-42ca-b132-bca14aa41733","Type":"ContainerDied","Data":"4e21a6980b85e71f90d18fd1058d71f241f1bb4828c79c7ce64e9be6f21a0add"} Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.382130 4723 generic.go:334] "Generic (PLEG): container finished" podID="6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef" containerID="6997df8f497c4ff917189e52e29cc2842ade8e6e2ca3ac5caf201f9cffa6296d" exitCode=0 Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.382164 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" event={"ID":"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef","Type":"ContainerDied","Data":"6997df8f497c4ff917189e52e29cc2842ade8e6e2ca3ac5caf201f9cffa6296d"} Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.382185 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" event={"ID":"6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef","Type":"ContainerDied","Data":"a447c0c706b067fe924b41af78e9a1b9a2863adcee5f0dfe8e3c35e433051e7b"} Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.382203 4723 scope.go:117] "RemoveContainer" containerID="6997df8f497c4ff917189e52e29cc2842ade8e6e2ca3ac5caf201f9cffa6296d" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.382314 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s6gh6" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.446750 4723 scope.go:117] "RemoveContainer" containerID="6997df8f497c4ff917189e52e29cc2842ade8e6e2ca3ac5caf201f9cffa6296d" Mar 09 13:06:02 crc kubenswrapper[4723]: E0309 13:06:02.447418 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6997df8f497c4ff917189e52e29cc2842ade8e6e2ca3ac5caf201f9cffa6296d\": container with ID starting with 6997df8f497c4ff917189e52e29cc2842ade8e6e2ca3ac5caf201f9cffa6296d not found: ID does not exist" containerID="6997df8f497c4ff917189e52e29cc2842ade8e6e2ca3ac5caf201f9cffa6296d" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.447459 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6997df8f497c4ff917189e52e29cc2842ade8e6e2ca3ac5caf201f9cffa6296d"} err="failed to get container status \"6997df8f497c4ff917189e52e29cc2842ade8e6e2ca3ac5caf201f9cffa6296d\": rpc error: code = NotFound desc = could not find container \"6997df8f497c4ff917189e52e29cc2842ade8e6e2ca3ac5caf201f9cffa6296d\": container with ID starting with 6997df8f497c4ff917189e52e29cc2842ade8e6e2ca3ac5caf201f9cffa6296d not found: ID does not exist" Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.461091 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s6gh6"] Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.469803 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s6gh6"] Mar 09 13:06:02 crc kubenswrapper[4723]: I0309 13:06:02.891275 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef" 
path="/var/lib/kubelet/pods/6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef/volumes" Mar 09 13:06:03 crc kubenswrapper[4723]: I0309 13:06:03.684378 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551026-vm6hr" Mar 09 13:06:03 crc kubenswrapper[4723]: I0309 13:06:03.791818 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26fvg\" (UniqueName: \"kubernetes.io/projected/77132bc2-3cff-42ca-b132-bca14aa41733-kube-api-access-26fvg\") pod \"77132bc2-3cff-42ca-b132-bca14aa41733\" (UID: \"77132bc2-3cff-42ca-b132-bca14aa41733\") " Mar 09 13:06:03 crc kubenswrapper[4723]: I0309 13:06:03.798210 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77132bc2-3cff-42ca-b132-bca14aa41733-kube-api-access-26fvg" (OuterVolumeSpecName: "kube-api-access-26fvg") pod "77132bc2-3cff-42ca-b132-bca14aa41733" (UID: "77132bc2-3cff-42ca-b132-bca14aa41733"). InnerVolumeSpecName "kube-api-access-26fvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:06:03 crc kubenswrapper[4723]: I0309 13:06:03.893319 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26fvg\" (UniqueName: \"kubernetes.io/projected/77132bc2-3cff-42ca-b132-bca14aa41733-kube-api-access-26fvg\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:04 crc kubenswrapper[4723]: I0309 13:06:04.400655 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551026-vm6hr" event={"ID":"77132bc2-3cff-42ca-b132-bca14aa41733","Type":"ContainerDied","Data":"e9a4248f2bea06cb309eda1653cdbb764683b4d7fa6182365be655edeb063c56"} Mar 09 13:06:04 crc kubenswrapper[4723]: I0309 13:06:04.400705 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9a4248f2bea06cb309eda1653cdbb764683b4d7fa6182365be655edeb063c56" Mar 09 13:06:04 crc kubenswrapper[4723]: I0309 13:06:04.401416 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551026-vm6hr" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.339683 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-6gtjl" podUID="6775c6a2-49ba-48fb-9f8f-ff26a7155618" containerName="console" containerID="cri-o://3528b54bcf3c325faeb80344f900bd80b025b73afeaf6efeef5c37ad41e9beda" gracePeriod=15 Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.713435 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-6gtjl_6775c6a2-49ba-48fb-9f8f-ff26a7155618/console/0.log" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.713506 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.734458 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-service-ca\") pod \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.734523 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-config\") pod \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.734563 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-oauth-config\") pod \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.734600 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-trusted-ca-bundle\") pod \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.734637 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5lpg\" (UniqueName: \"kubernetes.io/projected/6775c6a2-49ba-48fb-9f8f-ff26a7155618-kube-api-access-m5lpg\") pod \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.734699 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-serving-cert\") pod \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.734748 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-oauth-serving-cert\") pod \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\" (UID: \"6775c6a2-49ba-48fb-9f8f-ff26a7155618\") " Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.736016 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-service-ca" (OuterVolumeSpecName: "service-ca") pod "6775c6a2-49ba-48fb-9f8f-ff26a7155618" (UID: "6775c6a2-49ba-48fb-9f8f-ff26a7155618"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.736035 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6775c6a2-49ba-48fb-9f8f-ff26a7155618" (UID: "6775c6a2-49ba-48fb-9f8f-ff26a7155618"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.740936 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-config" (OuterVolumeSpecName: "console-config") pod "6775c6a2-49ba-48fb-9f8f-ff26a7155618" (UID: "6775c6a2-49ba-48fb-9f8f-ff26a7155618"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.741313 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6775c6a2-49ba-48fb-9f8f-ff26a7155618" (UID: "6775c6a2-49ba-48fb-9f8f-ff26a7155618"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.752266 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6775c6a2-49ba-48fb-9f8f-ff26a7155618" (UID: "6775c6a2-49ba-48fb-9f8f-ff26a7155618"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.753427 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6775c6a2-49ba-48fb-9f8f-ff26a7155618" (UID: "6775c6a2-49ba-48fb-9f8f-ff26a7155618"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.755985 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6775c6a2-49ba-48fb-9f8f-ff26a7155618-kube-api-access-m5lpg" (OuterVolumeSpecName: "kube-api-access-m5lpg") pod "6775c6a2-49ba-48fb-9f8f-ff26a7155618" (UID: "6775c6a2-49ba-48fb-9f8f-ff26a7155618"). InnerVolumeSpecName "kube-api-access-m5lpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.836480 4723 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.836520 4723 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.836532 4723 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.836539 4723 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.836547 4723 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6775c6a2-49ba-48fb-9f8f-ff26a7155618-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.836556 4723 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6775c6a2-49ba-48fb-9f8f-ff26a7155618-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:06 crc kubenswrapper[4723]: I0309 13:06:06.836564 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5lpg\" (UniqueName: \"kubernetes.io/projected/6775c6a2-49ba-48fb-9f8f-ff26a7155618-kube-api-access-m5lpg\") on node \"crc\" DevicePath \"\"" Mar 09 13:06:07 crc kubenswrapper[4723]: I0309 13:06:07.452522 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-6gtjl_6775c6a2-49ba-48fb-9f8f-ff26a7155618/console/0.log" Mar 09 13:06:07 crc kubenswrapper[4723]: I0309 13:06:07.452624 4723 generic.go:334] "Generic (PLEG): container finished" podID="6775c6a2-49ba-48fb-9f8f-ff26a7155618" containerID="3528b54bcf3c325faeb80344f900bd80b025b73afeaf6efeef5c37ad41e9beda" exitCode=2 Mar 09 13:06:07 crc kubenswrapper[4723]: I0309 13:06:07.452673 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6gtjl" event={"ID":"6775c6a2-49ba-48fb-9f8f-ff26a7155618","Type":"ContainerDied","Data":"3528b54bcf3c325faeb80344f900bd80b025b73afeaf6efeef5c37ad41e9beda"} Mar 09 13:06:07 crc kubenswrapper[4723]: I0309 13:06:07.452705 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6gtjl" event={"ID":"6775c6a2-49ba-48fb-9f8f-ff26a7155618","Type":"ContainerDied","Data":"7627da4ef3ad5637aec325793c6d20eb62f3261ec552f8ae4c7afb1c042e35e9"} Mar 09 13:06:07 crc kubenswrapper[4723]: I0309 13:06:07.452758 4723 scope.go:117] "RemoveContainer" containerID="3528b54bcf3c325faeb80344f900bd80b025b73afeaf6efeef5c37ad41e9beda" Mar 09 13:06:07 crc kubenswrapper[4723]: I0309 13:06:07.452921 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-6gtjl" Mar 09 13:06:08 crc kubenswrapper[4723]: I0309 13:06:08.664016 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-6gtjl"] Mar 09 13:06:08 crc kubenswrapper[4723]: I0309 13:06:08.666827 4723 scope.go:117] "RemoveContainer" containerID="3528b54bcf3c325faeb80344f900bd80b025b73afeaf6efeef5c37ad41e9beda" Mar 09 13:06:08 crc kubenswrapper[4723]: E0309 13:06:08.672500 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3528b54bcf3c325faeb80344f900bd80b025b73afeaf6efeef5c37ad41e9beda\": container with ID starting with 3528b54bcf3c325faeb80344f900bd80b025b73afeaf6efeef5c37ad41e9beda not found: ID does not exist" containerID="3528b54bcf3c325faeb80344f900bd80b025b73afeaf6efeef5c37ad41e9beda" Mar 09 13:06:08 crc kubenswrapper[4723]: I0309 13:06:08.672568 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3528b54bcf3c325faeb80344f900bd80b025b73afeaf6efeef5c37ad41e9beda"} err="failed to get container status \"3528b54bcf3c325faeb80344f900bd80b025b73afeaf6efeef5c37ad41e9beda\": rpc error: code = NotFound desc = could not find container \"3528b54bcf3c325faeb80344f900bd80b025b73afeaf6efeef5c37ad41e9beda\": container with ID starting with 3528b54bcf3c325faeb80344f900bd80b025b73afeaf6efeef5c37ad41e9beda not found: ID does not exist" Mar 09 13:06:08 crc kubenswrapper[4723]: I0309 13:06:08.681744 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-6gtjl"] Mar 09 13:06:08 crc kubenswrapper[4723]: I0309 13:06:08.890016 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6775c6a2-49ba-48fb-9f8f-ff26a7155618" path="/var/lib/kubelet/pods/6775c6a2-49ba-48fb-9f8f-ff26a7155618/volumes" Mar 09 13:06:11 crc kubenswrapper[4723]: I0309 13:06:11.282622 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8" Mar 09 13:06:11 crc kubenswrapper[4723]: I0309 13:06:11.287523 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8" Mar 09 13:06:32 crc kubenswrapper[4723]: I0309 13:06:32.377621 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 09 13:06:32 crc kubenswrapper[4723]: I0309 13:06:32.429226 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 09 13:06:32 crc kubenswrapper[4723]: I0309 13:06:32.694100 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 09 13:06:33 crc kubenswrapper[4723]: I0309 13:06:33.947597 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:06:33 crc kubenswrapper[4723]: I0309 13:06:33.949044 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.040637 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7f69f56458-z9f7c"] Mar 09 13:06:50 crc kubenswrapper[4723]: E0309 13:06:50.041508 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6775c6a2-49ba-48fb-9f8f-ff26a7155618" containerName="console" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.041524 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="6775c6a2-49ba-48fb-9f8f-ff26a7155618" containerName="console" Mar 09 13:06:50 crc kubenswrapper[4723]: E0309 13:06:50.041555 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef" containerName="registry" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.041565 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef" containerName="registry" Mar 09 13:06:50 crc kubenswrapper[4723]: E0309 13:06:50.041583 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77132bc2-3cff-42ca-b132-bca14aa41733" containerName="oc" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.041591 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="77132bc2-3cff-42ca-b132-bca14aa41733" containerName="oc" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.041718 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="6775c6a2-49ba-48fb-9f8f-ff26a7155618" containerName="console" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.041739 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="77132bc2-3cff-42ca-b132-bca14aa41733" containerName="oc" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.041750 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d7eb7cc-ca52-4777-ac3b-e90d4aba9bef" containerName="registry" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.042239 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.045171 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-4a3b7v7njkfbr" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.052678 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-57fc8677f7-9hvt8"] Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.052978 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8" podUID="d0a0e755-4205-4573-b59a-07b4b6ec7ce2" containerName="metrics-server" containerID="cri-o://ffc0e2c1022a19fb78c8feb4c2d3e8f0516adc4451cbe64a93e86f2aa7f9af14" gracePeriod=170 Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.063045 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7f69f56458-z9f7c"] Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.163432 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4a344f-6f96-422b-9468-56c8e988ad3f-client-ca-bundle\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.163492 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9a4a344f-6f96-422b-9468-56c8e988ad3f-secret-metrics-client-certs\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.163513 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9a4a344f-6f96-422b-9468-56c8e988ad3f-audit-log\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.163548 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9a4a344f-6f96-422b-9468-56c8e988ad3f-metrics-server-audit-profiles\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.163982 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9a4a344f-6f96-422b-9468-56c8e988ad3f-secret-metrics-server-tls\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.164033 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a4a344f-6f96-422b-9468-56c8e988ad3f-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: 
\"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.164065 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-659p8\" (UniqueName: \"kubernetes.io/projected/9a4a344f-6f96-422b-9468-56c8e988ad3f-kube-api-access-659p8\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.265673 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9a4a344f-6f96-422b-9468-56c8e988ad3f-secret-metrics-server-tls\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.265729 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a4a344f-6f96-422b-9468-56c8e988ad3f-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.265753 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-659p8\" (UniqueName: \"kubernetes.io/projected/9a4a344f-6f96-422b-9468-56c8e988ad3f-kube-api-access-659p8\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.265785 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4a344f-6f96-422b-9468-56c8e988ad3f-client-ca-bundle\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.265810 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9a4a344f-6f96-422b-9468-56c8e988ad3f-secret-metrics-client-certs\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.265827 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9a4a344f-6f96-422b-9468-56c8e988ad3f-audit-log\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.265847 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9a4a344f-6f96-422b-9468-56c8e988ad3f-metrics-server-audit-profiles\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 
13:06:50.266932 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9a4a344f-6f96-422b-9468-56c8e988ad3f-audit-log\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.267255 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9a4a344f-6f96-422b-9468-56c8e988ad3f-metrics-server-audit-profiles\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.267298 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a4a344f-6f96-422b-9468-56c8e988ad3f-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.272342 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4a344f-6f96-422b-9468-56c8e988ad3f-client-ca-bundle\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.274653 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9a4a344f-6f96-422b-9468-56c8e988ad3f-secret-metrics-client-certs\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.275685 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9a4a344f-6f96-422b-9468-56c8e988ad3f-secret-metrics-server-tls\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.285885 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-659p8\" (UniqueName: \"kubernetes.io/projected/9a4a344f-6f96-422b-9468-56c8e988ad3f-kube-api-access-659p8\") pod \"metrics-server-7f69f56458-z9f7c\" (UID: \"9a4a344f-6f96-422b-9468-56c8e988ad3f\") " pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.373619 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:06:50 crc kubenswrapper[4723]: I0309 13:06:50.827934 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7f69f56458-z9f7c"] Mar 09 13:06:51 crc kubenswrapper[4723]: I0309 13:06:51.793257 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" event={"ID":"9a4a344f-6f96-422b-9468-56c8e988ad3f","Type":"ContainerStarted","Data":"1cf976ee2556a62cbe7223703cbe5dc18a2e0cef64591475186da4c9dd8de172"} Mar 09 13:06:51 crc kubenswrapper[4723]: I0309 13:06:51.793687 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" event={"ID":"9a4a344f-6f96-422b-9468-56c8e988ad3f","Type":"ContainerStarted","Data":"2afe4d940d06360c97d48f3e24870ed623ac56548174a40e10d1e3ebe044996e"} Mar 09 13:06:51 crc kubenswrapper[4723]: I0309 13:06:51.820044 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" podStartSLOduration=1.8200208249999998 podStartE2EDuration="1.820020825s" podCreationTimestamp="2026-03-09 13:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:06:51.817207244 +0000 UTC m=+485.831674824" watchObservedRunningTime="2026-03-09 13:06:51.820020825 +0000 UTC m=+485.834488405" Mar 09 13:07:03 crc kubenswrapper[4723]: I0309 13:07:03.947456 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:07:03 crc kubenswrapper[4723]: I0309 13:07:03.947847 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.310378 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6dc6c4d949-sfpmd"] Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.313933 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.335054 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dc6c4d949-sfpmd"] Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.418060 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-service-ca\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.418116 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-console-config\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.418187 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-oauth-serving-cert\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.418246 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wv76\" (UniqueName: \"kubernetes.io/projected/44c2f727-e5d0-4d0c-967c-6411c885db43-kube-api-access-9wv76\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.418294 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-trusted-ca-bundle\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.418336 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44c2f727-e5d0-4d0c-967c-6411c885db43-console-serving-cert\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.418352 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44c2f727-e5d0-4d0c-967c-6411c885db43-console-oauth-config\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.519931 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-oauth-serving-cert\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc 
kubenswrapper[4723]: I0309 13:07:04.519999 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wv76\" (UniqueName: \"kubernetes.io/projected/44c2f727-e5d0-4d0c-967c-6411c885db43-kube-api-access-9wv76\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.520027 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-trusted-ca-bundle\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.520070 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44c2f727-e5d0-4d0c-967c-6411c885db43-console-serving-cert\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.520084 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44c2f727-e5d0-4d0c-967c-6411c885db43-console-oauth-config\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.520123 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-service-ca\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.520485 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-console-config\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.521743 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-console-config\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.521975 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-service-ca\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.522004 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-oauth-serving-cert\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.522405 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-trusted-ca-bundle\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.527995 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44c2f727-e5d0-4d0c-967c-6411c885db43-console-oauth-config\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.534520 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44c2f727-e5d0-4d0c-967c-6411c885db43-console-serving-cert\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.537113 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wv76\" (UniqueName: \"kubernetes.io/projected/44c2f727-e5d0-4d0c-967c-6411c885db43-kube-api-access-9wv76\") pod \"console-6dc6c4d949-sfpmd\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:04 crc kubenswrapper[4723]: I0309 13:07:04.642057 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:05 crc kubenswrapper[4723]: I0309 13:07:05.002520 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dc6c4d949-sfpmd"] Mar 09 13:07:05 crc kubenswrapper[4723]: I0309 13:07:05.914937 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc6c4d949-sfpmd" event={"ID":"44c2f727-e5d0-4d0c-967c-6411c885db43","Type":"ContainerStarted","Data":"a687edafcebadcfac28d729d6e28ce380e2f6ca4c95e86fed961696bb95fb67f"} Mar 09 13:07:05 crc kubenswrapper[4723]: I0309 13:07:05.915336 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc6c4d949-sfpmd" event={"ID":"44c2f727-e5d0-4d0c-967c-6411c885db43","Type":"ContainerStarted","Data":"e52fc4bed4c03c48c4b4f23c841546827d815994ef080fceb59077ab535ca0b6"} Mar 09 13:07:05 crc kubenswrapper[4723]: I0309 13:07:05.936749 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6dc6c4d949-sfpmd" podStartSLOduration=1.936725401 podStartE2EDuration="1.936725401s" podCreationTimestamp="2026-03-09 13:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:07:05.933759584 +0000 UTC m=+499.948227144" watchObservedRunningTime="2026-03-09 13:07:05.936725401 +0000 UTC m=+499.951192951" Mar 09 13:07:10 crc kubenswrapper[4723]: I0309 13:07:10.374701 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:07:10 crc kubenswrapper[4723]: I0309 13:07:10.377141 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:07:10 crc kubenswrapper[4723]: I0309 13:07:10.382465 4723 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:07:10 crc kubenswrapper[4723]: I0309 13:07:10.964555 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 13:07:14 crc kubenswrapper[4723]: I0309 13:07:14.642336 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:14 crc kubenswrapper[4723]: I0309 13:07:14.642608 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:14 crc kubenswrapper[4723]: I0309 13:07:14.648654 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:14 crc kubenswrapper[4723]: I0309 13:07:14.993202 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:07:15 crc kubenswrapper[4723]: I0309 13:07:15.095019 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66b5df45c6-zvqgd"] Mar 09 13:07:33 crc kubenswrapper[4723]: I0309 13:07:33.947415 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:07:33 crc kubenswrapper[4723]: I0309 13:07:33.948182 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:07:33 crc kubenswrapper[4723]: I0309 13:07:33.948267 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:07:33 crc kubenswrapper[4723]: I0309 13:07:33.949497 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d054e197559b4879e59df42a68d1c798a7c319b81a2cd49030fdbc518b252634"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:07:33 crc kubenswrapper[4723]: I0309 13:07:33.949797 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://d054e197559b4879e59df42a68d1c798a7c319b81a2cd49030fdbc518b252634" gracePeriod=600 Mar 09 13:07:34 crc kubenswrapper[4723]: I0309 13:07:34.143936 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="d054e197559b4879e59df42a68d1c798a7c319b81a2cd49030fdbc518b252634" exitCode=0 Mar 09 13:07:34 crc kubenswrapper[4723]: I0309 13:07:34.144009 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" 
Mar 09 13:07:34 crc kubenswrapper[4723]: I0309 13:07:34.144056 4723 scope.go:117] "RemoveContainer" containerID="8b99b2d350bcf051156986ebffa7486b5fd35d0dcd70f0163bbbeb54050408e1"
Mar 09 13:07:35 crc kubenswrapper[4723]: I0309 13:07:35.155574 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"0fa72ca2b7e100c53424b0c6c728520cb30db8e9432e97e83e4d09f170a81438"}
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.141999 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-66b5df45c6-zvqgd" podUID="664f7870-22a7-4f17-b89f-5c9a9616a2d1" containerName="console" containerID="cri-o://9d62557bc9a2f927d1152caaaae4978b5850a3c8e3f25a7bfbfc9e2006e3687c" gracePeriod=15
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.489167 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66b5df45c6-zvqgd_664f7870-22a7-4f17-b89f-5c9a9616a2d1/console/0.log"
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.489422 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.568381 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-config\") pod \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") "
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.568428 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-oauth-config\") pod \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") "
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.568458 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-service-ca\") pod \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") "
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.568521 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-oauth-serving-cert\") pod \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") "
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.568545 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzwmz\" (UniqueName: \"kubernetes.io/projected/664f7870-22a7-4f17-b89f-5c9a9616a2d1-kube-api-access-pzwmz\") pod \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") "
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.568559 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-serving-cert\") pod \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") "
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.568608 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-trusted-ca-bundle\") pod \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\" (UID: \"664f7870-22a7-4f17-b89f-5c9a9616a2d1\") "
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.569388 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "664f7870-22a7-4f17-b89f-5c9a9616a2d1" (UID: "664f7870-22a7-4f17-b89f-5c9a9616a2d1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.569400 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-config" (OuterVolumeSpecName: "console-config") pod "664f7870-22a7-4f17-b89f-5c9a9616a2d1" (UID: "664f7870-22a7-4f17-b89f-5c9a9616a2d1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.569414 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "664f7870-22a7-4f17-b89f-5c9a9616a2d1" (UID: "664f7870-22a7-4f17-b89f-5c9a9616a2d1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.569511 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-service-ca" (OuterVolumeSpecName: "service-ca") pod "664f7870-22a7-4f17-b89f-5c9a9616a2d1" (UID: "664f7870-22a7-4f17-b89f-5c9a9616a2d1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.573384 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "664f7870-22a7-4f17-b89f-5c9a9616a2d1" (UID: "664f7870-22a7-4f17-b89f-5c9a9616a2d1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.573695 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "664f7870-22a7-4f17-b89f-5c9a9616a2d1" (UID: "664f7870-22a7-4f17-b89f-5c9a9616a2d1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.574066 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/664f7870-22a7-4f17-b89f-5c9a9616a2d1-kube-api-access-pzwmz" (OuterVolumeSpecName: "kube-api-access-pzwmz") pod "664f7870-22a7-4f17-b89f-5c9a9616a2d1" (UID: "664f7870-22a7-4f17-b89f-5c9a9616a2d1"). InnerVolumeSpecName "kube-api-access-pzwmz". PluginName "kubernetes.io/projected", VolumeGidValue ""
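The block above is one full teardown pass for the old console pod's volumes: an "UnmountVolume started" line per volume, then a matching "UnmountVolume.TearDown succeeded" line, and, in the next block, a "Volume detached" line once the actual state of the world is updated. A simplified sketch of the loop shape that produces these lines, assuming a desired-state/actual-state split like the kubelet volume manager's; this is not the real pkg/kubelet/volumemanager code:

    // Simplified desired-vs-actual reconcile pass for volume unmounts.
    package main

    import "fmt"

    type unmounter interface{ TearDown() error }

    type reconciler struct {
        desired map[string]bool      // volumes still required by some pod
        actual  map[string]unmounter // volumes currently mounted on the node
    }

    func (r *reconciler) reconcile() {
        for name, vol := range r.actual {
            if r.desired[name] {
                continue // pod still exists; leave the mount alone
            }
            // "operationExecutor.UnmountVolume started ..."
            if err := vol.TearDown(); err != nil {
                continue // stays in actual state; retried on the next sync
            }
            // "UnmountVolume.TearDown succeeded ..." then "Volume detached ..."
            delete(r.actual, name)
        }
    }

    type noopVolume struct{}

    func (noopVolume) TearDown() error { return nil }

    func main() {
        r := &reconciler{desired: map[string]bool{}, actual: map[string]unmounter{"console-config": noopVolume{}}}
        r.reconcile()
        fmt.Println(len(r.actual)) // 0: the stale mount was torn down
    }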
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.670312 4723 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.670342 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzwmz\" (UniqueName: \"kubernetes.io/projected/664f7870-22a7-4f17-b89f-5c9a9616a2d1-kube-api-access-pzwmz\") on node \"crc\" DevicePath \"\""
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.670353 4723 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.670362 4723 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.670371 4723 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.670380 4723 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/664f7870-22a7-4f17-b89f-5c9a9616a2d1-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:07:40 crc kubenswrapper[4723]: I0309 13:07:40.670389 4723 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/664f7870-22a7-4f17-b89f-5c9a9616a2d1-service-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:07:41 crc kubenswrapper[4723]: I0309 13:07:41.203200 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66b5df45c6-zvqgd_664f7870-22a7-4f17-b89f-5c9a9616a2d1/console/0.log"
Mar 09 13:07:41 crc kubenswrapper[4723]: I0309 13:07:41.203611 4723 generic.go:334] "Generic (PLEG): container finished" podID="664f7870-22a7-4f17-b89f-5c9a9616a2d1" containerID="9d62557bc9a2f927d1152caaaae4978b5850a3c8e3f25a7bfbfc9e2006e3687c" exitCode=2
Mar 09 13:07:41 crc kubenswrapper[4723]: I0309 13:07:41.203655 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66b5df45c6-zvqgd" event={"ID":"664f7870-22a7-4f17-b89f-5c9a9616a2d1","Type":"ContainerDied","Data":"9d62557bc9a2f927d1152caaaae4978b5850a3c8e3f25a7bfbfc9e2006e3687c"}
Mar 09 13:07:41 crc kubenswrapper[4723]: I0309 13:07:41.203698 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66b5df45c6-zvqgd" event={"ID":"664f7870-22a7-4f17-b89f-5c9a9616a2d1","Type":"ContainerDied","Data":"90f5d3fd6edc890142578d76c4bf7bf5eb7a4354ad3df316889c6d22a0a63e53"}
Mar 09 13:07:41 crc kubenswrapper[4723]: I0309 13:07:41.203727 4723 scope.go:117] "RemoveContainer" containerID="9d62557bc9a2f927d1152caaaae4978b5850a3c8e3f25a7bfbfc9e2006e3687c"
Mar 09 13:07:41 crc kubenswrapper[4723]: I0309 13:07:41.203739 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66b5df45c6-zvqgd"
Mar 09 13:07:41 crc kubenswrapper[4723]: I0309 13:07:41.234268 4723 scope.go:117] "RemoveContainer" containerID="9d62557bc9a2f927d1152caaaae4978b5850a3c8e3f25a7bfbfc9e2006e3687c"
Mar 09 13:07:41 crc kubenswrapper[4723]: E0309 13:07:41.235235 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d62557bc9a2f927d1152caaaae4978b5850a3c8e3f25a7bfbfc9e2006e3687c\": container with ID starting with 9d62557bc9a2f927d1152caaaae4978b5850a3c8e3f25a7bfbfc9e2006e3687c not found: ID does not exist" containerID="9d62557bc9a2f927d1152caaaae4978b5850a3c8e3f25a7bfbfc9e2006e3687c"
Mar 09 13:07:41 crc kubenswrapper[4723]: I0309 13:07:41.235297 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d62557bc9a2f927d1152caaaae4978b5850a3c8e3f25a7bfbfc9e2006e3687c"} err="failed to get container status \"9d62557bc9a2f927d1152caaaae4978b5850a3c8e3f25a7bfbfc9e2006e3687c\": rpc error: code = NotFound desc = could not find container \"9d62557bc9a2f927d1152caaaae4978b5850a3c8e3f25a7bfbfc9e2006e3687c\": container with ID starting with 9d62557bc9a2f927d1152caaaae4978b5850a3c8e3f25a7bfbfc9e2006e3687c not found: ID does not exist"
Mar 09 13:07:41 crc kubenswrapper[4723]: I0309 13:07:41.244272 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66b5df45c6-zvqgd"]
Mar 09 13:07:41 crc kubenswrapper[4723]: I0309 13:07:41.251687 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66b5df45c6-zvqgd"]
Mar 09 13:07:42 crc kubenswrapper[4723]: I0309 13:07:42.894009 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="664f7870-22a7-4f17-b89f-5c9a9616a2d1" path="/var/lib/kubelet/pods/664f7870-22a7-4f17-b89f-5c9a9616a2d1/volumes"
Mar 09 13:08:00 crc kubenswrapper[4723]: I0309 13:08:00.150578 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551028-dg5sf"]
Mar 09 13:08:00 crc kubenswrapper[4723]: E0309 13:08:00.151852 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="664f7870-22a7-4f17-b89f-5c9a9616a2d1" containerName="console"
Mar 09 13:08:00 crc kubenswrapper[4723]: I0309 13:08:00.151924 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="664f7870-22a7-4f17-b89f-5c9a9616a2d1" containerName="console"
Mar 09 13:08:00 crc kubenswrapper[4723]: I0309 13:08:00.152234 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="664f7870-22a7-4f17-b89f-5c9a9616a2d1" containerName="console"
Mar 09 13:08:00 crc kubenswrapper[4723]: I0309 13:08:00.153208 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551028-dg5sf"
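The E-level "ContainerStatus from runtime service failed" above is a benign race, not a real failure: the container was already removed by the earlier RemoveContainer at 13:07:41.203727, so the retried removal's status lookup gets a gRPC NotFound from CRI-O. A hedged sketch of how a CRI client can treat that as already-deleted; the helper is hypothetical, not the kubelet's actual code path:

    // Treating gRPC NotFound from the container runtime as "already gone".
    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    func alreadyGone(err error) bool {
        s, ok := status.FromError(err)
        return ok && s.Code() == codes.NotFound // "rpc error: code = NotFound ..."
    }

    func main() {
        err := status.Error(codes.NotFound, "could not find container")
        fmt.Println(alreadyGone(err)) // true: safe to count the delete as done
    }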
Mar 09 13:08:00 crc kubenswrapper[4723]: I0309 13:08:00.156682 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:08:00 crc kubenswrapper[4723]: I0309 13:08:00.156744 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:08:00 crc kubenswrapper[4723]: I0309 13:08:00.156811 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 13:08:00 crc kubenswrapper[4723]: I0309 13:08:00.167718 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551028-dg5sf"]
Mar 09 13:08:00 crc kubenswrapper[4723]: I0309 13:08:00.192172 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sqqn\" (UniqueName: \"kubernetes.io/projected/e526ef20-e343-48ac-8600-e647ac6996a4-kube-api-access-9sqqn\") pod \"auto-csr-approver-29551028-dg5sf\" (UID: \"e526ef20-e343-48ac-8600-e647ac6996a4\") " pod="openshift-infra/auto-csr-approver-29551028-dg5sf"
Mar 09 13:08:00 crc kubenswrapper[4723]: I0309 13:08:00.293792 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sqqn\" (UniqueName: \"kubernetes.io/projected/e526ef20-e343-48ac-8600-e647ac6996a4-kube-api-access-9sqqn\") pod \"auto-csr-approver-29551028-dg5sf\" (UID: \"e526ef20-e343-48ac-8600-e647ac6996a4\") " pod="openshift-infra/auto-csr-approver-29551028-dg5sf"
Mar 09 13:08:00 crc kubenswrapper[4723]: I0309 13:08:00.317316 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sqqn\" (UniqueName: \"kubernetes.io/projected/e526ef20-e343-48ac-8600-e647ac6996a4-kube-api-access-9sqqn\") pod \"auto-csr-approver-29551028-dg5sf\" (UID: \"e526ef20-e343-48ac-8600-e647ac6996a4\") " pod="openshift-infra/auto-csr-approver-29551028-dg5sf"
Mar 09 13:08:00 crc kubenswrapper[4723]: I0309 13:08:00.508228 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551028-dg5sf"
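kube-api-access-9sqqn is the standard generated service-account volume, and the reflector lines above show the kubelet warming watch caches for exactly the objects it projects (kube-root-ca.crt plus the service-account secret). A sketch of the usual three-source composition of such a projected volume, built with the k8s.io/api types; details such as the 3607s token lifetime are assumptions, not read from this cluster:

    // The conventional kube-api-access-* projected volume: bound SA token,
    // cluster CA bundle, and the pod's namespace via the downward API.
    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func kubeAPIAccessVolume(name string) corev1.Volume {
        expiry := int64(3607) // kubelet-rotated bound token, roughly one hour
        return corev1.Volume{
            Name: name, // e.g. "kube-api-access-9sqqn"
            VolumeSource: corev1.VolumeSource{
                Projected: &corev1.ProjectedVolumeSource{
                    Sources: []corev1.VolumeProjection{
                        {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
                            Path: "token", ExpirationSeconds: &expiry}},
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
                            Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}}}},
                        {DownwardAPI: &corev1.DownwardAPIProjection{
                            Items: []corev1.DownwardAPIVolumeFile{{
                                Path:     "namespace",
                                FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"}}}}},
                    },
                },
            },
        }
    }

    func main() {
        v := kubeAPIAccessVolume("kube-api-access-9sqqn")
        fmt.Println(v.Name, len(v.VolumeSource.Projected.Sources)) // kube-api-access-9sqqn 3
    }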
Mar 09 13:08:00 crc kubenswrapper[4723]: I0309 13:08:00.777617 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551028-dg5sf"]
Mar 09 13:08:00 crc kubenswrapper[4723]: W0309 13:08:00.790236 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode526ef20_e343_48ac_8600_e647ac6996a4.slice/crio-43b58b9f9168c422ba63730d0b02d871d554af82a14cf47a662f83e097f7c6c0 WatchSource:0}: Error finding container 43b58b9f9168c422ba63730d0b02d871d554af82a14cf47a662f83e097f7c6c0: Status 404 returned error can't find the container with id 43b58b9f9168c422ba63730d0b02d871d554af82a14cf47a662f83e097f7c6c0
Mar 09 13:08:00 crc kubenswrapper[4723]: I0309 13:08:00.793313 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 13:08:01 crc kubenswrapper[4723]: I0309 13:08:01.367297 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551028-dg5sf" event={"ID":"e526ef20-e343-48ac-8600-e647ac6996a4","Type":"ContainerStarted","Data":"43b58b9f9168c422ba63730d0b02d871d554af82a14cf47a662f83e097f7c6c0"}
Mar 09 13:08:02 crc kubenswrapper[4723]: I0309 13:08:02.385280 4723 generic.go:334] "Generic (PLEG): container finished" podID="e526ef20-e343-48ac-8600-e647ac6996a4" containerID="66e466e6f155f03120dfc4c8c99001e064213bfd19d0d5dbd92949dd103e501c" exitCode=0
Mar 09 13:08:02 crc kubenswrapper[4723]: I0309 13:08:02.385499 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551028-dg5sf" event={"ID":"e526ef20-e343-48ac-8600-e647ac6996a4","Type":"ContainerDied","Data":"66e466e6f155f03120dfc4c8c99001e064213bfd19d0d5dbd92949dd103e501c"}
Mar 09 13:08:03 crc kubenswrapper[4723]: I0309 13:08:03.676745 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551028-dg5sf"
Mar 09 13:08:03 crc kubenswrapper[4723]: I0309 13:08:03.853643 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sqqn\" (UniqueName: \"kubernetes.io/projected/e526ef20-e343-48ac-8600-e647ac6996a4-kube-api-access-9sqqn\") pod \"e526ef20-e343-48ac-8600-e647ac6996a4\" (UID: \"e526ef20-e343-48ac-8600-e647ac6996a4\") "
Mar 09 13:08:03 crc kubenswrapper[4723]: I0309 13:08:03.865116 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e526ef20-e343-48ac-8600-e647ac6996a4-kube-api-access-9sqqn" (OuterVolumeSpecName: "kube-api-access-9sqqn") pod "e526ef20-e343-48ac-8600-e647ac6996a4" (UID: "e526ef20-e343-48ac-8600-e647ac6996a4"). InnerVolumeSpecName "kube-api-access-9sqqn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:08:03 crc kubenswrapper[4723]: I0309 13:08:03.956739 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sqqn\" (UniqueName: \"kubernetes.io/projected/e526ef20-e343-48ac-8600-e647ac6996a4-kube-api-access-9sqqn\") on node \"crc\" DevicePath \"\""
Mar 09 13:08:04 crc kubenswrapper[4723]: I0309 13:08:04.408358 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551028-dg5sf" event={"ID":"e526ef20-e343-48ac-8600-e647ac6996a4","Type":"ContainerDied","Data":"43b58b9f9168c422ba63730d0b02d871d554af82a14cf47a662f83e097f7c6c0"}
Mar 09 13:08:04 crc kubenswrapper[4723]: I0309 13:08:04.408405 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43b58b9f9168c422ba63730d0b02d871d554af82a14cf47a662f83e097f7c6c0"
Mar 09 13:08:04 crc kubenswrapper[4723]: I0309 13:08:04.408443 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551028-dg5sf"
Mar 09 13:08:04 crc kubenswrapper[4723]: I0309 13:08:04.747689 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551022-xgzbd"]
Mar 09 13:08:04 crc kubenswrapper[4723]: I0309 13:08:04.755574 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551022-xgzbd"]
Mar 09 13:08:04 crc kubenswrapper[4723]: I0309 13:08:04.895275 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2586638-1604-4545-8203-6b89e38129e6" path="/var/lib/kubelet/pods/b2586638-1604-4545-8203-6b89e38129e6/volumes"
Mar 09 13:08:47 crc kubenswrapper[4723]: I0309 13:08:47.224232 4723 scope.go:117] "RemoveContainer" containerID="f057a186bca17facee44643cdcad3fa67ea23bd1739dd713755164310d050dfd"
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.447536 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.546103 4723 generic.go:334] "Generic (PLEG): container finished" podID="d0a0e755-4205-4573-b59a-07b4b6ec7ce2" containerID="ffc0e2c1022a19fb78c8feb4c2d3e8f0516adc4451cbe64a93e86f2aa7f9af14" exitCode=0
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.546152 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8" event={"ID":"d0a0e755-4205-4573-b59a-07b4b6ec7ce2","Type":"ContainerDied","Data":"ffc0e2c1022a19fb78c8feb4c2d3e8f0516adc4451cbe64a93e86f2aa7f9af14"}
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.546166 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8"
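The Job names above encode their schedule: the suffix the CronJob controller appends is the scheduled time in minutes since the Unix epoch, which is why auto-csr-approver-29551028 starts at 13:08 and the just-deleted auto-csr-approver-29551022 was the run from six minutes earlier, pruned as job history. A quick check:

    // CronJob job-name suffix = scheduled time, in minutes since the epoch.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        for _, m := range []int64{29551022, 29551028, 29551030} {
            fmt.Println(m, time.Unix(m*60, 0).UTC()) // 29551028 -> 2026-03-09 13:08:00 +0000 UTC
        }
    }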
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.546183 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57fc8677f7-9hvt8" event={"ID":"d0a0e755-4205-4573-b59a-07b4b6ec7ce2","Type":"ContainerDied","Data":"d5c99913928ca9034f7308f97f4a22c938e681757780ef4ab4e92b0d3871fc38"}
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.546204 4723 scope.go:117] "RemoveContainer" containerID="ffc0e2c1022a19fb78c8feb4c2d3e8f0516adc4451cbe64a93e86f2aa7f9af14"
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.568254 4723 scope.go:117] "RemoveContainer" containerID="ffc0e2c1022a19fb78c8feb4c2d3e8f0516adc4451cbe64a93e86f2aa7f9af14"
Mar 09 13:09:20 crc kubenswrapper[4723]: E0309 13:09:20.568756 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc0e2c1022a19fb78c8feb4c2d3e8f0516adc4451cbe64a93e86f2aa7f9af14\": container with ID starting with ffc0e2c1022a19fb78c8feb4c2d3e8f0516adc4451cbe64a93e86f2aa7f9af14 not found: ID does not exist" containerID="ffc0e2c1022a19fb78c8feb4c2d3e8f0516adc4451cbe64a93e86f2aa7f9af14"
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.568791 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc0e2c1022a19fb78c8feb4c2d3e8f0516adc4451cbe64a93e86f2aa7f9af14"} err="failed to get container status \"ffc0e2c1022a19fb78c8feb4c2d3e8f0516adc4451cbe64a93e86f2aa7f9af14\": rpc error: code = NotFound desc = could not find container \"ffc0e2c1022a19fb78c8feb4c2d3e8f0516adc4451cbe64a93e86f2aa7f9af14\": container with ID starting with ffc0e2c1022a19fb78c8feb4c2d3e8f0516adc4451cbe64a93e86f2aa7f9af14 not found: ID does not exist"
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.594586 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj4zb\" (UniqueName: \"kubernetes.io/projected/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-kube-api-access-rj4zb\") pod \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") "
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.594655 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-audit-log\") pod \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") "
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.594728 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-secret-metrics-server-tls\") pod \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") "
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.594761 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-secret-metrics-client-certs\") pod \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") "
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.594826 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-client-ca-bundle\") pod \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") "
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.594845 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-metrics-server-audit-profiles\") pod \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") "
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.594885 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-configmap-kubelet-serving-ca-bundle\") pod \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\" (UID: \"d0a0e755-4205-4573-b59a-07b4b6ec7ce2\") "
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.595877 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "d0a0e755-4205-4573-b59a-07b4b6ec7ce2" (UID: "d0a0e755-4205-4573-b59a-07b4b6ec7ce2"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.597237 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-audit-log" (OuterVolumeSpecName: "audit-log") pod "d0a0e755-4205-4573-b59a-07b4b6ec7ce2" (UID: "d0a0e755-4205-4573-b59a-07b4b6ec7ce2"). InnerVolumeSpecName "audit-log". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.597578 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-metrics-server-audit-profiles" (OuterVolumeSpecName: "metrics-server-audit-profiles") pod "d0a0e755-4205-4573-b59a-07b4b6ec7ce2" (UID: "d0a0e755-4205-4573-b59a-07b4b6ec7ce2"). InnerVolumeSpecName "metrics-server-audit-profiles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.600322 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-kube-api-access-rj4zb" (OuterVolumeSpecName: "kube-api-access-rj4zb") pod "d0a0e755-4205-4573-b59a-07b4b6ec7ce2" (UID: "d0a0e755-4205-4573-b59a-07b4b6ec7ce2"). InnerVolumeSpecName "kube-api-access-rj4zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.600675 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-secret-metrics-server-tls" (OuterVolumeSpecName: "secret-metrics-server-tls") pod "d0a0e755-4205-4573-b59a-07b4b6ec7ce2" (UID: "d0a0e755-4205-4573-b59a-07b4b6ec7ce2"). InnerVolumeSpecName "secret-metrics-server-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.602014 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-client-ca-bundle" (OuterVolumeSpecName: "client-ca-bundle") pod "d0a0e755-4205-4573-b59a-07b4b6ec7ce2" (UID: "d0a0e755-4205-4573-b59a-07b4b6ec7ce2"). InnerVolumeSpecName "client-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.607350 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "d0a0e755-4205-4573-b59a-07b4b6ec7ce2" (UID: "d0a0e755-4205-4573-b59a-07b4b6ec7ce2"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.696978 4723 reconciler_common.go:293] "Volume detached for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-client-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.697025 4723 reconciler_common.go:293] "Volume detached for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-metrics-server-audit-profiles\") on node \"crc\" DevicePath \"\""
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.697048 4723 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-configmap-kubelet-serving-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.697068 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj4zb\" (UniqueName: \"kubernetes.io/projected/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-kube-api-access-rj4zb\") on node \"crc\" DevicePath \"\""
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.697085 4723 reconciler_common.go:293] "Volume detached for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-audit-log\") on node \"crc\" DevicePath \"\""
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.697100 4723 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-secret-metrics-server-tls\") on node \"crc\" DevicePath \"\""
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.697114 4723 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d0a0e755-4205-4573-b59a-07b4b6ec7ce2-secret-metrics-client-certs\") on node \"crc\" DevicePath \"\""
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.902420 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-57fc8677f7-9hvt8"]
Mar 09 13:09:20 crc kubenswrapper[4723]: I0309 13:09:20.902787 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/metrics-server-57fc8677f7-9hvt8"]
Mar 09 13:09:22 crc kubenswrapper[4723]: I0309 13:09:22.891111 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a0e755-4205-4573-b59a-07b4b6ec7ce2" path="/var/lib/kubelet/pods/d0a0e755-4205-4573-b59a-07b4b6ec7ce2/volumes"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.653308 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"]
Mar 09 13:09:53 crc kubenswrapper[4723]: E0309 13:09:53.653984 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e526ef20-e343-48ac-8600-e647ac6996a4" containerName="oc"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.653996 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e526ef20-e343-48ac-8600-e647ac6996a4" containerName="oc"
Mar 09 13:09:53 crc kubenswrapper[4723]: E0309 13:09:53.654012 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a0e755-4205-4573-b59a-07b4b6ec7ce2" containerName="metrics-server"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.654017 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a0e755-4205-4573-b59a-07b4b6ec7ce2" containerName="metrics-server"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.654144 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="e526ef20-e343-48ac-8600-e647ac6996a4" containerName="oc"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.654159 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a0e755-4205-4573-b59a-07b4b6ec7ce2" containerName="metrics-server"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.655054 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.657481 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.704786 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"]
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.750392 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/618feaa1-6349-4b7e-b344-f750770dc970-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p\" (UID: \"618feaa1-6349-4b7e-b344-f750770dc970\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.750489 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/618feaa1-6349-4b7e-b344-f750770dc970-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p\" (UID: \"618feaa1-6349-4b7e-b344-f750770dc970\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.750519 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5l9j\" (UniqueName: \"kubernetes.io/projected/618feaa1-6349-4b7e-b344-f750770dc970-kube-api-access-t5l9j\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p\" (UID: \"618feaa1-6349-4b7e-b344-f750770dc970\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.852191 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/618feaa1-6349-4b7e-b344-f750770dc970-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p\" (UID: \"618feaa1-6349-4b7e-b344-f750770dc970\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.852235 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5l9j\" (UniqueName: \"kubernetes.io/projected/618feaa1-6349-4b7e-b344-f750770dc970-kube-api-access-t5l9j\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p\" (UID: \"618feaa1-6349-4b7e-b344-f750770dc970\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.852303 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/618feaa1-6349-4b7e-b344-f750770dc970-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p\" (UID: \"618feaa1-6349-4b7e-b344-f750770dc970\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.852728 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/618feaa1-6349-4b7e-b344-f750770dc970-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p\" (UID: \"618feaa1-6349-4b7e-b344-f750770dc970\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.852763 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/618feaa1-6349-4b7e-b344-f750770dc970-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p\" (UID: \"618feaa1-6349-4b7e-b344-f750770dc970\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.872328 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5l9j\" (UniqueName: \"kubernetes.io/projected/618feaa1-6349-4b7e-b344-f750770dc970-kube-api-access-t5l9j\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p\" (UID: \"618feaa1-6349-4b7e-b344-f750770dc970\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"
Mar 09 13:09:53 crc kubenswrapper[4723]: I0309 13:09:53.975050 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"
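The hash-named openshift-marketplace pod above, with its "util" and "bundle" emptyDir scratch volumes and the short-lived pull/extract/util containers seen in the RemoveStaleState lines further down, has the shape of an OLM bundle-unpack job: pull an operator bundle image and extract its manifests into the shared emptyDir. A rough, purely illustrative sketch of such a pod; every name and image below is made up:

    // Illustrative OLM-style bundle-unpack pod shape; not this cluster's spec.
    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func bundleUnpackPod() *corev1.Pod {
        return &corev1.Pod{
            Spec: corev1.PodSpec{
                RestartPolicy: corev1.RestartPolicyNever,
                Volumes: []corev1.Volume{
                    {Name: "util", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
                    {Name: "bundle", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
                },
                InitContainers: []corev1.Container{
                    {Name: "util", Image: "example/opm:latest", // copies the extraction helper into /util
                        VolumeMounts: []corev1.VolumeMount{{Name: "util", MountPath: "/util"}}},
                    {Name: "pull", Image: "example/operator-bundle:v1", // the bundle image itself
                        VolumeMounts: []corev1.VolumeMount{{Name: "bundle", MountPath: "/bundle"}}},
                },
                Containers: []corev1.Container{
                    {Name: "extract", Image: "example/opm:latest", // writes unpacked manifests from /bundle
                        VolumeMounts: []corev1.VolumeMount{
                            {Name: "util", MountPath: "/util"},
                            {Name: "bundle", MountPath: "/bundle"},
                        }},
                },
            },
        }
    }

    func main() {
        p := bundleUnpackPod()
        fmt.Println(len(p.Spec.Volumes), len(p.Spec.InitContainers)) // 2 2
    }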
Mar 09 13:09:54 crc kubenswrapper[4723]: I0309 13:09:54.184258 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"]
Mar 09 13:09:54 crc kubenswrapper[4723]: I0309 13:09:54.818431 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p" event={"ID":"618feaa1-6349-4b7e-b344-f750770dc970","Type":"ContainerDied","Data":"303c7a9131206be519b1fc64768e0912c4e2cf13e1310fba4705796ffbacbb31"}
Mar 09 13:09:54 crc kubenswrapper[4723]: I0309 13:09:54.818233 4723 generic.go:334] "Generic (PLEG): container finished" podID="618feaa1-6349-4b7e-b344-f750770dc970" containerID="303c7a9131206be519b1fc64768e0912c4e2cf13e1310fba4705796ffbacbb31" exitCode=0
Mar 09 13:09:54 crc kubenswrapper[4723]: I0309 13:09:54.819053 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p" event={"ID":"618feaa1-6349-4b7e-b344-f750770dc970","Type":"ContainerStarted","Data":"4bb37dbcd57614bfe8fb4150af708cbc5d97b2d7c94da9f1d874558ea16b1968"}
Mar 09 13:09:56 crc kubenswrapper[4723]: I0309 13:09:56.835246 4723 generic.go:334] "Generic (PLEG): container finished" podID="618feaa1-6349-4b7e-b344-f750770dc970" containerID="7206c8f0cca15c789109220d0ef74040febdc5ba6d85bba8abba6c769f635c11" exitCode=0
Mar 09 13:09:56 crc kubenswrapper[4723]: I0309 13:09:56.835374 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p" event={"ID":"618feaa1-6349-4b7e-b344-f750770dc970","Type":"ContainerDied","Data":"7206c8f0cca15c789109220d0ef74040febdc5ba6d85bba8abba6c769f635c11"}
Mar 09 13:09:57 crc kubenswrapper[4723]: I0309 13:09:57.846966 4723 generic.go:334] "Generic (PLEG): container finished" podID="618feaa1-6349-4b7e-b344-f750770dc970" containerID="9257f160be6e5f1d65f177f7b858bca043656f798548b6ff1e96445cbc3f02ce" exitCode=0
Mar 09 13:09:57 crc kubenswrapper[4723]: I0309 13:09:57.847060 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p" event={"ID":"618feaa1-6349-4b7e-b344-f750770dc970","Type":"ContainerDied","Data":"9257f160be6e5f1d65f177f7b858bca043656f798548b6ff1e96445cbc3f02ce"}
Mar 09 13:09:59 crc kubenswrapper[4723]: I0309 13:09:59.184493 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"
Mar 09 13:09:59 crc kubenswrapper[4723]: I0309 13:09:59.335072 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/618feaa1-6349-4b7e-b344-f750770dc970-bundle\") pod \"618feaa1-6349-4b7e-b344-f750770dc970\" (UID: \"618feaa1-6349-4b7e-b344-f750770dc970\") "
Mar 09 13:09:59 crc kubenswrapper[4723]: I0309 13:09:59.335172 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/618feaa1-6349-4b7e-b344-f750770dc970-util\") pod \"618feaa1-6349-4b7e-b344-f750770dc970\" (UID: \"618feaa1-6349-4b7e-b344-f750770dc970\") "
Mar 09 13:09:59 crc kubenswrapper[4723]: I0309 13:09:59.335286 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5l9j\" (UniqueName: \"kubernetes.io/projected/618feaa1-6349-4b7e-b344-f750770dc970-kube-api-access-t5l9j\") pod \"618feaa1-6349-4b7e-b344-f750770dc970\" (UID: \"618feaa1-6349-4b7e-b344-f750770dc970\") "
Mar 09 13:09:59 crc kubenswrapper[4723]: I0309 13:09:59.339106 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/618feaa1-6349-4b7e-b344-f750770dc970-bundle" (OuterVolumeSpecName: "bundle") pod "618feaa1-6349-4b7e-b344-f750770dc970" (UID: "618feaa1-6349-4b7e-b344-f750770dc970"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:09:59 crc kubenswrapper[4723]: I0309 13:09:59.350312 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/618feaa1-6349-4b7e-b344-f750770dc970-kube-api-access-t5l9j" (OuterVolumeSpecName: "kube-api-access-t5l9j") pod "618feaa1-6349-4b7e-b344-f750770dc970" (UID: "618feaa1-6349-4b7e-b344-f750770dc970"). InnerVolumeSpecName "kube-api-access-t5l9j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:09:59 crc kubenswrapper[4723]: I0309 13:09:59.380224 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/618feaa1-6349-4b7e-b344-f750770dc970-util" (OuterVolumeSpecName: "util") pod "618feaa1-6349-4b7e-b344-f750770dc970" (UID: "618feaa1-6349-4b7e-b344-f750770dc970"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:09:59 crc kubenswrapper[4723]: I0309 13:09:59.436843 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5l9j\" (UniqueName: \"kubernetes.io/projected/618feaa1-6349-4b7e-b344-f750770dc970-kube-api-access-t5l9j\") on node \"crc\" DevicePath \"\""
Mar 09 13:09:59 crc kubenswrapper[4723]: I0309 13:09:59.436895 4723 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/618feaa1-6349-4b7e-b344-f750770dc970-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:09:59 crc kubenswrapper[4723]: I0309 13:09:59.436904 4723 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/618feaa1-6349-4b7e-b344-f750770dc970-util\") on node \"crc\" DevicePath \"\""
Mar 09 13:09:59 crc kubenswrapper[4723]: I0309 13:09:59.865274 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p" event={"ID":"618feaa1-6349-4b7e-b344-f750770dc970","Type":"ContainerDied","Data":"4bb37dbcd57614bfe8fb4150af708cbc5d97b2d7c94da9f1d874558ea16b1968"}
Mar 09 13:09:59 crc kubenswrapper[4723]: I0309 13:09:59.865342 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bb37dbcd57614bfe8fb4150af708cbc5d97b2d7c94da9f1d874558ea16b1968"
Mar 09 13:09:59 crc kubenswrapper[4723]: I0309 13:09:59.865389 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p"
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.147048 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551030-lchm5"]
Mar 09 13:10:00 crc kubenswrapper[4723]: E0309 13:10:00.147969 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618feaa1-6349-4b7e-b344-f750770dc970" containerName="extract"
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.148090 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="618feaa1-6349-4b7e-b344-f750770dc970" containerName="extract"
Mar 09 13:10:00 crc kubenswrapper[4723]: E0309 13:10:00.148189 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618feaa1-6349-4b7e-b344-f750770dc970" containerName="util"
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.148284 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="618feaa1-6349-4b7e-b344-f750770dc970" containerName="util"
Mar 09 13:10:00 crc kubenswrapper[4723]: E0309 13:10:00.148400 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618feaa1-6349-4b7e-b344-f750770dc970" containerName="pull"
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.148506 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="618feaa1-6349-4b7e-b344-f750770dc970" containerName="pull"
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.148793 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="618feaa1-6349-4b7e-b344-f750770dc970" containerName="extract"
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.149776 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551030-lchm5"
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.152041 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.152097 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.154030 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.154472 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551030-lchm5"]
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.248593 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csl4b\" (UniqueName: \"kubernetes.io/projected/619d3345-763a-4a5b-b038-1f4443d8b0c8-kube-api-access-csl4b\") pod \"auto-csr-approver-29551030-lchm5\" (UID: \"619d3345-763a-4a5b-b038-1f4443d8b0c8\") " pod="openshift-infra/auto-csr-approver-29551030-lchm5"
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.349590 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csl4b\" (UniqueName: \"kubernetes.io/projected/619d3345-763a-4a5b-b038-1f4443d8b0c8-kube-api-access-csl4b\") pod \"auto-csr-approver-29551030-lchm5\" (UID: \"619d3345-763a-4a5b-b038-1f4443d8b0c8\") " pod="openshift-infra/auto-csr-approver-29551030-lchm5"
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.369878 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csl4b\" (UniqueName: \"kubernetes.io/projected/619d3345-763a-4a5b-b038-1f4443d8b0c8-kube-api-access-csl4b\") pod \"auto-csr-approver-29551030-lchm5\" (UID: \"619d3345-763a-4a5b-b038-1f4443d8b0c8\") " pod="openshift-infra/auto-csr-approver-29551030-lchm5"
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.473055 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551030-lchm5"
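The reflector.go "Caches populated" lines mark the kubelet's watch caches warming up for the ConfigMaps and Secret the new pod mounts. A minimal client-go analogue that builds the same kind of namespaced cache; illustrative only, this is not how the kubelet wires its internal reflectors:

    // Warm a namespaced ConfigMap informer cache and list what it holds.
    package main

    import (
        "fmt"
        "time"

        "k8s.io/client-go/informers"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/cache"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)
        factory := informers.NewSharedInformerFactoryWithOptions(cs, time.Minute,
            informers.WithNamespace("openshift-infra"))
        inf := factory.Core().V1().ConfigMaps().Informer()
        stop := make(chan struct{})
        factory.Start(stop)
        cache.WaitForCacheSync(stop, inf.HasSynced) // the "Caches populated" moment
        for _, key := range inf.GetStore().ListKeys() {
            fmt.Println("cached:", key) // e.g. openshift-infra/kube-root-ca.crt
        }
        close(stop)
    }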
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.707101 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551030-lchm5"]
Mar 09 13:10:00 crc kubenswrapper[4723]: I0309 13:10:00.870631 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551030-lchm5" event={"ID":"619d3345-763a-4a5b-b038-1f4443d8b0c8","Type":"ContainerStarted","Data":"869e2cb1e991ce4b7b71e957d177bb08f7d23f6f8120bc1de6fb0705810375e6"}
Mar 09 13:10:02 crc kubenswrapper[4723]: I0309 13:10:02.896484 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551030-lchm5" event={"ID":"619d3345-763a-4a5b-b038-1f4443d8b0c8","Type":"ContainerStarted","Data":"a4f3f7b89c97320355e6b1d445b6fe38c1eea5fbdd02b89e050f53194759a45f"}
Mar 09 13:10:02 crc kubenswrapper[4723]: I0309 13:10:02.935667 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551030-lchm5" podStartSLOduration=1.706618159 podStartE2EDuration="2.935646193s" podCreationTimestamp="2026-03-09 13:10:00 +0000 UTC" firstStartedPulling="2026-03-09 13:10:00.716038917 +0000 UTC m=+674.730506457" lastFinishedPulling="2026-03-09 13:10:01.945066941 +0000 UTC m=+675.959534491" observedRunningTime="2026-03-09 13:10:02.933635201 +0000 UTC m=+676.948102741" watchObservedRunningTime="2026-03-09 13:10:02.935646193 +0000 UTC m=+676.950113733"
Mar 09 13:10:03 crc kubenswrapper[4723]: I0309 13:10:03.896889 4723 generic.go:334] "Generic (PLEG): container finished" podID="619d3345-763a-4a5b-b038-1f4443d8b0c8" containerID="a4f3f7b89c97320355e6b1d445b6fe38c1eea5fbdd02b89e050f53194759a45f" exitCode=0
Mar 09 13:10:03 crc kubenswrapper[4723]: I0309 13:10:03.896958 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551030-lchm5" event={"ID":"619d3345-763a-4a5b-b038-1f4443d8b0c8","Type":"ContainerDied","Data":"a4f3f7b89c97320355e6b1d445b6fe38c1eea5fbdd02b89e050f53194759a45f"}
Mar 09 13:10:03 crc kubenswrapper[4723]: I0309 13:10:03.946643 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:10:03 crc kubenswrapper[4723]: I0309 13:10:03.946695 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.641724 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zngwx"]
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.642319 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovn-controller" containerID="cri-o://84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1" gracePeriod=30
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.642415 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="nbdb" containerID="cri-o://2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766" gracePeriod=30
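The podStartSLOduration above is not a typo for the E2E duration: the SLO figure excludes the image-pull window, and the subtraction is done on the monotonic clock (the m=+... offsets). For the console pod earlier the pull timestamps were the zero time, so SLO and E2E were equal; here they differ, and the numbers reproduce exactly:

    // podStartSLOduration = E2E startup time minus the image-pull window.
    package main

    import "fmt"

    func main() {
        e2e := 2.935646193                    // observedRunningTime - podCreationTimestamp
        pull := 675.959534491 - 674.730506457 // lastFinishedPulling - firstStartedPulling (m= offsets)
        fmt.Printf("podStartSLOduration = %.9f\n", e2e-pull) // 1.706618159
    }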
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.642481 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="sbdb" containerID="cri-o://ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461" gracePeriod=30
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.642554 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="northd" containerID="cri-o://093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e" gracePeriod=30
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.642636 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="kube-rbac-proxy-node" containerID="cri-o://db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74" gracePeriod=30
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.642637 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7" gracePeriod=30
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.642656 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovn-acl-logging" containerID="cri-o://0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9" gracePeriod=30
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.671368 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovnkube-controller" containerID="cri-o://9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e" gracePeriod=30
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.918565 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g92rf_242d3bf9-4462-4562-813a-f3548edc94fd/kube-multus/1.log"
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.918961 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g92rf_242d3bf9-4462-4562-813a-f3548edc94fd/kube-multus/0.log"
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.918993 4723 generic.go:334] "Generic (PLEG): container finished" podID="242d3bf9-4462-4562-813a-f3548edc94fd" containerID="9e3f00295ab5c8b08630d59915b6f04285bc0f618ea72db8e5954cd6b4a21bee" exitCode=2
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.919038 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g92rf" event={"ID":"242d3bf9-4462-4562-813a-f3548edc94fd","Type":"ContainerDied","Data":"9e3f00295ab5c8b08630d59915b6f04285bc0f618ea72db8e5954cd6b4a21bee"}
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.919070 4723 scope.go:117] "RemoveContainer" containerID="ca482bd84eb9a872cae7dd27e7606909c24675c47b5def6c7d3b0a947fe198a5"
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.919558 4723 scope.go:117] "RemoveContainer" containerID="9e3f00295ab5c8b08630d59915b6f04285bc0f618ea72db8e5954cd6b4a21bee"
Mar 09 13:10:04 crc kubenswrapper[4723]: E0309 13:10:04.919721 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-g92rf_openshift-multus(242d3bf9-4462-4562-813a-f3548edc94fd)\"" pod="openshift-multus/multus-g92rf" podUID="242d3bf9-4462-4562-813a-f3548edc94fd"
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.923910 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovnkube-controller/2.log"
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.931598 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovn-acl-logging/0.log"
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.932224 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovn-controller/0.log"
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.932590 4723 generic.go:334] "Generic (PLEG): container finished" podID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerID="9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e" exitCode=0
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.932613 4723 generic.go:334] "Generic (PLEG): container finished" podID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerID="2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766" exitCode=0
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.932621 4723 generic.go:334] "Generic (PLEG): container finished" podID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerID="093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e" exitCode=0
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.932629 4723 generic.go:334] "Generic (PLEG): container finished" podID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerID="0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9" exitCode=143
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.932636 4723 generic.go:334] "Generic (PLEG): container finished" podID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerID="84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1" exitCode=143
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.932669 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerDied","Data":"9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e"}
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.932708 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerDied","Data":"2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766"}
Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.932718 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerDied","Data":"093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e"}
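Decoding the exit codes in this block: 143 follows the 128+signal convention, meaning those ovnkube containers died from the SIGTERM sent by the "Killing container with a grace period" lines (gracePeriod=30), while exit 0 means the process shut down cleanly. Multus's exit 2 is a genuine failure, so pod_workers puts it in CrashLoopBackOff, which starts at 10s and doubles per restart up to a 5m cap:

    // 128 + signal number is the conventional exit code for signal death.
    package main

    import (
        "fmt"
        "syscall"
    )

    func main() {
        fmt.Println(128 + int(syscall.SIGTERM)) // 143, the graceful-kill case above
        fmt.Println(128 + int(syscall.SIGKILL)) // 137, seen when a grace period expires
    }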
event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerDied","Data":"093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e"} Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.932729 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerDied","Data":"0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9"} Mar 09 13:10:04 crc kubenswrapper[4723]: I0309 13:10:04.932738 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerDied","Data":"84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1"} Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.050440 4723 scope.go:117] "RemoveContainer" containerID="ed3c87a39e89b3d451419241ae60097d87d31eb4e982e9937611c9d1acd2d95f" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.056741 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551030-lchm5" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.238844 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csl4b\" (UniqueName: \"kubernetes.io/projected/619d3345-763a-4a5b-b038-1f4443d8b0c8-kube-api-access-csl4b\") pod \"619d3345-763a-4a5b-b038-1f4443d8b0c8\" (UID: \"619d3345-763a-4a5b-b038-1f4443d8b0c8\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.245047 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619d3345-763a-4a5b-b038-1f4443d8b0c8-kube-api-access-csl4b" (OuterVolumeSpecName: "kube-api-access-csl4b") pod "619d3345-763a-4a5b-b038-1f4443d8b0c8" (UID: "619d3345-763a-4a5b-b038-1f4443d8b0c8"). InnerVolumeSpecName "kube-api-access-csl4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.340695 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csl4b\" (UniqueName: \"kubernetes.io/projected/619d3345-763a-4a5b-b038-1f4443d8b0c8-kube-api-access-csl4b\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.821986 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovn-acl-logging/0.log" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.822699 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovn-controller/0.log" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.823077 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.876696 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wvmv2"] Mar 09 13:10:05 crc kubenswrapper[4723]: E0309 13:10:05.876919 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovnkube-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.876938 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovnkube-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: E0309 13:10:05.876947 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovn-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.876954 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovn-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: E0309 13:10:05.876960 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="kubecfg-setup" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.876967 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="kubecfg-setup" Mar 09 13:10:05 crc kubenswrapper[4723]: E0309 13:10:05.876976 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovnkube-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.876981 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovnkube-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: E0309 13:10:05.876988 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="sbdb" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.876994 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="sbdb" Mar 09 13:10:05 crc kubenswrapper[4723]: E0309 13:10:05.877008 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877014 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 13:10:05 crc kubenswrapper[4723]: E0309 13:10:05.877026 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="northd" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877034 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="northd" Mar 09 13:10:05 crc kubenswrapper[4723]: E0309 13:10:05.877045 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619d3345-763a-4a5b-b038-1f4443d8b0c8" containerName="oc" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877051 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="619d3345-763a-4a5b-b038-1f4443d8b0c8" containerName="oc" Mar 09 13:10:05 crc kubenswrapper[4723]: E0309 13:10:05.877061 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" 
containerName="kube-rbac-proxy-node" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877067 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="kube-rbac-proxy-node" Mar 09 13:10:05 crc kubenswrapper[4723]: E0309 13:10:05.877077 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovn-acl-logging" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877082 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovn-acl-logging" Mar 09 13:10:05 crc kubenswrapper[4723]: E0309 13:10:05.877093 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="nbdb" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877099 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="nbdb" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877189 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="northd" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877197 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="619d3345-763a-4a5b-b038-1f4443d8b0c8" containerName="oc" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877203 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovnkube-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877212 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="kube-rbac-proxy-node" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877222 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovnkube-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877231 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovnkube-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877238 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovn-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877247 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="nbdb" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877255 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="sbdb" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877264 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovn-acl-logging" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877272 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 13:10:05 crc kubenswrapper[4723]: E0309 13:10:05.877364 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovnkube-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877372 4723 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovnkube-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: E0309 13:10:05.877381 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovnkube-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877387 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovnkube-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.877502 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerName="ovnkube-controller" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.879145 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.938308 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551030-lchm5" event={"ID":"619d3345-763a-4a5b-b038-1f4443d8b0c8","Type":"ContainerDied","Data":"869e2cb1e991ce4b7b71e957d177bb08f7d23f6f8120bc1de6fb0705810375e6"} Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.938343 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="869e2cb1e991ce4b7b71e957d177bb08f7d23f6f8120bc1de6fb0705810375e6" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.938626 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551030-lchm5" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.939910 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g92rf_242d3bf9-4462-4562-813a-f3548edc94fd/kube-multus/1.log" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.943329 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovn-acl-logging/0.log" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.943998 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zngwx_edb23619-78b6-4d63-aacf-98d7ce86bc5b/ovn-controller/0.log" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.944320 4723 generic.go:334] "Generic (PLEG): container finished" podID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerID="ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461" exitCode=0 Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.944348 4723 generic.go:334] "Generic (PLEG): container finished" podID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerID="74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7" exitCode=0 Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.944359 4723 generic.go:334] "Generic (PLEG): container finished" podID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" containerID="db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74" exitCode=0 Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.944395 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerDied","Data":"ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461"} Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.944446 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" 
event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerDied","Data":"74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7"} Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.944462 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerDied","Data":"db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74"} Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.944475 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" event={"ID":"edb23619-78b6-4d63-aacf-98d7ce86bc5b","Type":"ContainerDied","Data":"2c7853925c23f3246ae068df12e4e648173c9cb519e0bdf339c2d4e4fc6c7aa8"} Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.944411 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zngwx" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.944495 4723 scope.go:117] "RemoveContainer" containerID="9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.947526 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovnkube-config\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.947572 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-systemd\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.947615 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjtfd\" (UniqueName: \"kubernetes.io/projected/edb23619-78b6-4d63-aacf-98d7ce86bc5b-kube-api-access-qjtfd\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.947646 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.947670 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-cni-bin\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.947702 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-etc-openvswitch\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.947741 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-env-overrides\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.947762 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-var-lib-openvswitch\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.947785 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-slash\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.947807 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-openvswitch\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.947834 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovn-node-metrics-cert\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.947885 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-ovn\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.947910 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-systemd-units\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948004 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-kubelet\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948575 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-node-log\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948601 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-cni-netd\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948619 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-log-socket\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948633 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-run-ovn-kubernetes\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948652 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovnkube-script-lib\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948666 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-run-netns\") pod \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\" (UID: \"edb23619-78b6-4d63-aacf-98d7ce86bc5b\") " Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948301 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948323 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948339 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948353 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948614 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949018 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948638 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948652 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948665 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-slash" (OuterVolumeSpecName: "host-slash") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948677 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948968 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-node-log" (OuterVolumeSpecName: "node-log") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.948991 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949046 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949068 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949087 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-log-socket" (OuterVolumeSpecName: "log-socket") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949103 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949189 4723 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949206 4723 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949219 4723 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949233 4723 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949245 4723 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949258 4723 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949269 4723 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949280 4723 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949291 4723 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-slash\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949301 4723 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.949341 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.960831 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.960998 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:10:05 crc kubenswrapper[4723]: I0309 13:10:05.962172 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edb23619-78b6-4d63-aacf-98d7ce86bc5b-kube-api-access-qjtfd" (OuterVolumeSpecName: "kube-api-access-qjtfd") pod "edb23619-78b6-4d63-aacf-98d7ce86bc5b" (UID: "edb23619-78b6-4d63-aacf-98d7ce86bc5b"). InnerVolumeSpecName "kube-api-access-qjtfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.002348 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551024-flwht"] Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.006392 4723 scope.go:117] "RemoveContainer" containerID="ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.009230 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551024-flwht"] Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.022149 4723 scope.go:117] "RemoveContainer" containerID="2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.051653 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-run-openvswitch\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.051700 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/babce048-2000-4f1c-af6b-4738e8fbb8a3-ovnkube-script-lib\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.051722 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqcjb\" (UniqueName: \"kubernetes.io/projected/babce048-2000-4f1c-af6b-4738e8fbb8a3-kube-api-access-vqcjb\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.051742 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-run-netns\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.051836 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.051971 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-run-ovn\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052069 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/babce048-2000-4f1c-af6b-4738e8fbb8a3-ovnkube-config\") pod \"ovnkube-node-wvmv2\" (UID: 
\"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052098 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-slash\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052169 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/babce048-2000-4f1c-af6b-4738e8fbb8a3-ovn-node-metrics-cert\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052195 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-systemd-units\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052234 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/babce048-2000-4f1c-af6b-4738e8fbb8a3-env-overrides\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052304 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-var-lib-openvswitch\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052354 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-node-log\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052405 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-run-systemd\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052482 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-log-socket\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052550 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-run-ovn-kubernetes\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052578 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-etc-openvswitch\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052655 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-cni-netd\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052709 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-cni-bin\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052756 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-kubelet\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052890 4723 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-node-log\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052906 4723 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052920 4723 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-log-socket\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052932 4723 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052942 4723 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052953 4723 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052963 4723 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-qjtfd\" (UniqueName: \"kubernetes.io/projected/edb23619-78b6-4d63-aacf-98d7ce86bc5b-kube-api-access-qjtfd\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052973 4723 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052984 4723 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/edb23619-78b6-4d63-aacf-98d7ce86bc5b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.052995 4723 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/edb23619-78b6-4d63-aacf-98d7ce86bc5b-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.055247 4723 scope.go:117] "RemoveContainer" containerID="093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.069800 4723 scope.go:117] "RemoveContainer" containerID="74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.086190 4723 scope.go:117] "RemoveContainer" containerID="db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.117813 4723 scope.go:117] "RemoveContainer" containerID="0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.153880 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.153921 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-run-ovn\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.153943 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/babce048-2000-4f1c-af6b-4738e8fbb8a3-ovnkube-config\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.153957 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-slash\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.153973 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/babce048-2000-4f1c-af6b-4738e8fbb8a3-ovn-node-metrics-cert\") pod \"ovnkube-node-wvmv2\" 
(UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.153991 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-systemd-units\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.154022 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-run-ovn\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.154070 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/babce048-2000-4f1c-af6b-4738e8fbb8a3-env-overrides\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.154821 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-systemd-units\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.154846 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.154881 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-slash\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.154935 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-var-lib-openvswitch\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155046 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-var-lib-openvswitch\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155097 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-node-log\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 
13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155116 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-run-systemd\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155133 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-log-socket\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155153 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-run-ovn-kubernetes\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155172 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-etc-openvswitch\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155193 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-cni-netd\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155209 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-cni-bin\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155211 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-node-log\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155232 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-log-socket\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155239 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-kubelet\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155265 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-etc-openvswitch\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155272 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-run-openvswitch\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155271 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-kubelet\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155286 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-run-ovn-kubernetes\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155315 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-cni-netd\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155344 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-cni-bin\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155359 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-run-systemd\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155373 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-run-openvswitch\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155373 4723 scope.go:117] "RemoveContainer" containerID="84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155433 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/babce048-2000-4f1c-af6b-4738e8fbb8a3-ovnkube-script-lib\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155452 4723 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vqcjb\" (UniqueName: \"kubernetes.io/projected/babce048-2000-4f1c-af6b-4738e8fbb8a3-kube-api-access-vqcjb\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155468 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-run-netns\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155498 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/babce048-2000-4f1c-af6b-4738e8fbb8a3-env-overrides\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155571 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/babce048-2000-4f1c-af6b-4738e8fbb8a3-host-run-netns\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.155620 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/babce048-2000-4f1c-af6b-4738e8fbb8a3-ovnkube-config\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.156026 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/babce048-2000-4f1c-af6b-4738e8fbb8a3-ovnkube-script-lib\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.159569 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/babce048-2000-4f1c-af6b-4738e8fbb8a3-ovn-node-metrics-cert\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.182337 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqcjb\" (UniqueName: \"kubernetes.io/projected/babce048-2000-4f1c-af6b-4738e8fbb8a3-kube-api-access-vqcjb\") pod \"ovnkube-node-wvmv2\" (UID: \"babce048-2000-4f1c-af6b-4738e8fbb8a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.185125 4723 scope.go:117] "RemoveContainer" containerID="b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.192295 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.227603 4723 scope.go:117] "RemoveContainer" containerID="9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e" Mar 09 13:10:06 crc kubenswrapper[4723]: E0309 13:10:06.233386 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e\": container with ID starting with 9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e not found: ID does not exist" containerID="9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.233438 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e"} err="failed to get container status \"9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e\": rpc error: code = NotFound desc = could not find container \"9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e\": container with ID starting with 9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.233467 4723 scope.go:117] "RemoveContainer" containerID="ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461" Mar 09 13:10:06 crc kubenswrapper[4723]: E0309 13:10:06.236301 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\": container with ID starting with ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461 not found: ID does not exist" containerID="ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.236345 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461"} err="failed to get container status \"ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\": rpc error: code = NotFound desc = could not find container \"ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\": container with ID starting with ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.236372 4723 scope.go:117] "RemoveContainer" containerID="2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766" Mar 09 13:10:06 crc kubenswrapper[4723]: E0309 13:10:06.236822 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\": container with ID starting with 2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766 not found: ID does not exist" containerID="2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.236854 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766"} err="failed to get container status \"2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\": rpc error: code = 
NotFound desc = could not find container \"2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\": container with ID starting with 2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.236887 4723 scope.go:117] "RemoveContainer" containerID="093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e" Mar 09 13:10:06 crc kubenswrapper[4723]: E0309 13:10:06.237152 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\": container with ID starting with 093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e not found: ID does not exist" containerID="093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.237177 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e"} err="failed to get container status \"093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\": rpc error: code = NotFound desc = could not find container \"093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\": container with ID starting with 093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.237194 4723 scope.go:117] "RemoveContainer" containerID="74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7" Mar 09 13:10:06 crc kubenswrapper[4723]: E0309 13:10:06.237420 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\": container with ID starting with 74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7 not found: ID does not exist" containerID="74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.237450 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7"} err="failed to get container status \"74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\": rpc error: code = NotFound desc = could not find container \"74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\": container with ID starting with 74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.237467 4723 scope.go:117] "RemoveContainer" containerID="db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74" Mar 09 13:10:06 crc kubenswrapper[4723]: E0309 13:10:06.237707 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\": container with ID starting with db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74 not found: ID does not exist" containerID="db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.237731 4723 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74"} err="failed to get container status \"db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\": rpc error: code = NotFound desc = could not find container \"db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\": container with ID starting with db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.237750 4723 scope.go:117] "RemoveContainer" containerID="0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9" Mar 09 13:10:06 crc kubenswrapper[4723]: E0309 13:10:06.237999 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\": container with ID starting with 0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9 not found: ID does not exist" containerID="0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.238021 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9"} err="failed to get container status \"0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\": rpc error: code = NotFound desc = could not find container \"0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\": container with ID starting with 0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.238037 4723 scope.go:117] "RemoveContainer" containerID="84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1" Mar 09 13:10:06 crc kubenswrapper[4723]: E0309 13:10:06.238465 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\": container with ID starting with 84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1 not found: ID does not exist" containerID="84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.238494 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1"} err="failed to get container status \"84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\": rpc error: code = NotFound desc = could not find container \"84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\": container with ID starting with 84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.238516 4723 scope.go:117] "RemoveContainer" containerID="b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258" Mar 09 13:10:06 crc kubenswrapper[4723]: E0309 13:10:06.238744 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\": container with ID starting with b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258 not found: ID does not exist" 
containerID="b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.238785 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258"} err="failed to get container status \"b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\": rpc error: code = NotFound desc = could not find container \"b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\": container with ID starting with b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.238802 4723 scope.go:117] "RemoveContainer" containerID="9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.239031 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e"} err="failed to get container status \"9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e\": rpc error: code = NotFound desc = could not find container \"9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e\": container with ID starting with 9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.239053 4723 scope.go:117] "RemoveContainer" containerID="ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.239335 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461"} err="failed to get container status \"ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\": rpc error: code = NotFound desc = could not find container \"ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\": container with ID starting with ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.239366 4723 scope.go:117] "RemoveContainer" containerID="2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.239601 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766"} err="failed to get container status \"2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\": rpc error: code = NotFound desc = could not find container \"2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\": container with ID starting with 2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.239629 4723 scope.go:117] "RemoveContainer" containerID="093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.239986 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e"} err="failed to get container status \"093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\": rpc error: code = NotFound desc = could not find 
container \"093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\": container with ID starting with 093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.240014 4723 scope.go:117] "RemoveContainer" containerID="74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.240343 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7"} err="failed to get container status \"74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\": rpc error: code = NotFound desc = could not find container \"74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\": container with ID starting with 74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.240395 4723 scope.go:117] "RemoveContainer" containerID="db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.240705 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74"} err="failed to get container status \"db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\": rpc error: code = NotFound desc = could not find container \"db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\": container with ID starting with db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.240734 4723 scope.go:117] "RemoveContainer" containerID="0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.240953 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9"} err="failed to get container status \"0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\": rpc error: code = NotFound desc = could not find container \"0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\": container with ID starting with 0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.240980 4723 scope.go:117] "RemoveContainer" containerID="84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.241181 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1"} err="failed to get container status \"84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\": rpc error: code = NotFound desc = could not find container \"84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\": container with ID starting with 84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.241208 4723 scope.go:117] "RemoveContainer" containerID="b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.241397 4723 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258"} err="failed to get container status \"b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\": rpc error: code = NotFound desc = could not find container \"b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\": container with ID starting with b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.241431 4723 scope.go:117] "RemoveContainer" containerID="9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.242438 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e"} err="failed to get container status \"9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e\": rpc error: code = NotFound desc = could not find container \"9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e\": container with ID starting with 9a74506556c2b274ba3b56ae6f4ef88ce7ffffaf51be22738ce1491bcb6f426e not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.242464 4723 scope.go:117] "RemoveContainer" containerID="ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.242735 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461"} err="failed to get container status \"ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\": rpc error: code = NotFound desc = could not find container \"ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461\": container with ID starting with ca158acc1cc3ee4b2280f8d4a28c1ba412659c33412a3f887c38601163536461 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.242770 4723 scope.go:117] "RemoveContainer" containerID="2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.243143 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766"} err="failed to get container status \"2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\": rpc error: code = NotFound desc = could not find container \"2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766\": container with ID starting with 2d41e316c1f5e87b224e0037a94379033e22254097db3d2decf47a5fd335f766 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.243164 4723 scope.go:117] "RemoveContainer" containerID="093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.243549 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e"} err="failed to get container status \"093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\": rpc error: code = NotFound desc = could not find container \"093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e\": container with ID starting with 
093509abfd0f010ebce9a116475da8c898a65569861b646314023d4ca81fea7e not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.243575 4723 scope.go:117] "RemoveContainer" containerID="74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.243834 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7"} err="failed to get container status \"74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\": rpc error: code = NotFound desc = could not find container \"74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7\": container with ID starting with 74e157a2b8f0e011a72fd9bb3979bede5694bfab635bf7a6c4ee6780d35119e7 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.243853 4723 scope.go:117] "RemoveContainer" containerID="db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.244197 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74"} err="failed to get container status \"db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\": rpc error: code = NotFound desc = could not find container \"db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74\": container with ID starting with db49efb6d6e555b280f71c32194247ba51a121250abba598286800970b6bec74 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.244222 4723 scope.go:117] "RemoveContainer" containerID="0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.244537 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9"} err="failed to get container status \"0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\": rpc error: code = NotFound desc = could not find container \"0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9\": container with ID starting with 0f52ae34a25520c1f48c20a86ec3e00706385ab5dcfeb3bfd616422b454cf5c9 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.244563 4723 scope.go:117] "RemoveContainer" containerID="84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.244840 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1"} err="failed to get container status \"84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\": rpc error: code = NotFound desc = could not find container \"84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1\": container with ID starting with 84ae27509fbfe9c2f53f5de2a01cfc809ca8abe68dcd96a96963438f0924ddb1 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.244879 4723 scope.go:117] "RemoveContainer" containerID="b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.245458 4723 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258"} err="failed to get container status \"b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\": rpc error: code = NotFound desc = could not find container \"b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258\": container with ID starting with b34e5e54b6c3625b8cd0220e7197fcfa242ad6cb634b1819546dcc54aa983258 not found: ID does not exist" Mar 09 13:10:06 crc kubenswrapper[4723]: W0309 13:10:06.245807 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbabce048_2000_4f1c_af6b_4738e8fbb8a3.slice/crio-b6359bb5e6df2f7c75fa95099491d6936135b1ff69919d93cb933c5a157e490b WatchSource:0}: Error finding container b6359bb5e6df2f7c75fa95099491d6936135b1ff69919d93cb933c5a157e490b: Status 404 returned error can't find the container with id b6359bb5e6df2f7c75fa95099491d6936135b1ff69919d93cb933c5a157e490b Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.293844 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zngwx"] Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.301919 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zngwx"] Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.888976 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c990608-7d03-402a-a042-9db3b406ca16" path="/var/lib/kubelet/pods/1c990608-7d03-402a-a042-9db3b406ca16/volumes" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.889572 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edb23619-78b6-4d63-aacf-98d7ce86bc5b" path="/var/lib/kubelet/pods/edb23619-78b6-4d63-aacf-98d7ce86bc5b/volumes" Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.950299 4723 generic.go:334] "Generic (PLEG): container finished" podID="babce048-2000-4f1c-af6b-4738e8fbb8a3" containerID="8336bccf1f9cf3f86ab5a63b683975205dcd569c7d749066bcd08c2a4e5ccc9b" exitCode=0 Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.950365 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" event={"ID":"babce048-2000-4f1c-af6b-4738e8fbb8a3","Type":"ContainerDied","Data":"8336bccf1f9cf3f86ab5a63b683975205dcd569c7d749066bcd08c2a4e5ccc9b"} Mar 09 13:10:06 crc kubenswrapper[4723]: I0309 13:10:06.950398 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" event={"ID":"babce048-2000-4f1c-af6b-4738e8fbb8a3","Type":"ContainerStarted","Data":"b6359bb5e6df2f7c75fa95099491d6936135b1ff69919d93cb933c5a157e490b"} Mar 09 13:10:07 crc kubenswrapper[4723]: I0309 13:10:07.838821 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl"] Mar 09 13:10:07 crc kubenswrapper[4723]: I0309 13:10:07.839680 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl" Mar 09 13:10:07 crc kubenswrapper[4723]: I0309 13:10:07.841820 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-xjv8h" Mar 09 13:10:07 crc kubenswrapper[4723]: I0309 13:10:07.842053 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 09 13:10:07 crc kubenswrapper[4723]: I0309 13:10:07.842223 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 09 13:10:07 crc kubenswrapper[4723]: I0309 13:10:07.879117 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kzfk\" (UniqueName: \"kubernetes.io/projected/bcc48a4f-2d0e-4fb9-98d7-af5958403a01-kube-api-access-4kzfk\") pod \"obo-prometheus-operator-68bc856cb9-26ngl\" (UID: \"bcc48a4f-2d0e-4fb9-98d7-af5958403a01\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl" Mar 09 13:10:07 crc kubenswrapper[4723]: I0309 13:10:07.961075 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" event={"ID":"babce048-2000-4f1c-af6b-4738e8fbb8a3","Type":"ContainerStarted","Data":"5509b8ec72e17b63be575448644ea3839a76cb4348550a22b1fbf9e90448e870"} Mar 09 13:10:07 crc kubenswrapper[4723]: I0309 13:10:07.961125 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" event={"ID":"babce048-2000-4f1c-af6b-4738e8fbb8a3","Type":"ContainerStarted","Data":"c45cf634c84753ac986e7568e0821abf9fb448643fe1ccbf8e43017105eb1e2c"} Mar 09 13:10:07 crc kubenswrapper[4723]: I0309 13:10:07.961138 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" event={"ID":"babce048-2000-4f1c-af6b-4738e8fbb8a3","Type":"ContainerStarted","Data":"11391073f96bef959a2006697b3e4f39d6dedaf10e76e5d25b90f076791a71de"} Mar 09 13:10:07 crc kubenswrapper[4723]: I0309 13:10:07.961150 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" event={"ID":"babce048-2000-4f1c-af6b-4738e8fbb8a3","Type":"ContainerStarted","Data":"4b225aa2ed692f64de07a93e080194b4e899e034e2c757952d94a0046b884eae"} Mar 09 13:10:07 crc kubenswrapper[4723]: I0309 13:10:07.961159 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" event={"ID":"babce048-2000-4f1c-af6b-4738e8fbb8a3","Type":"ContainerStarted","Data":"89455066eeb1c706f70fb57e0a384d61d314d3f4024612a0150c4b3bc34a8281"} Mar 09 13:10:07 crc kubenswrapper[4723]: I0309 13:10:07.961167 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" event={"ID":"babce048-2000-4f1c-af6b-4738e8fbb8a3","Type":"ContainerStarted","Data":"2815a3198efd9b5bd4b100c1151d6e6cee14af02e5ca79da2e02454760b8841d"} Mar 09 13:10:07 crc kubenswrapper[4723]: I0309 13:10:07.980564 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kzfk\" (UniqueName: \"kubernetes.io/projected/bcc48a4f-2d0e-4fb9-98d7-af5958403a01-kube-api-access-4kzfk\") pod \"obo-prometheus-operator-68bc856cb9-26ngl\" (UID: \"bcc48a4f-2d0e-4fb9-98d7-af5958403a01\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.002589 4723 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j"] Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.003291 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.006127 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kzfk\" (UniqueName: \"kubernetes.io/projected/bcc48a4f-2d0e-4fb9-98d7-af5958403a01-kube-api-access-4kzfk\") pod \"obo-prometheus-operator-68bc856cb9-26ngl\" (UID: \"bcc48a4f-2d0e-4fb9-98d7-af5958403a01\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.006238 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-fk9nd" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.009330 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.018589 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x"] Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.019296 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.157308 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.173509 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2nqwq"] Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.174280 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.176558 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-4mhh8" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.176770 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.182461 4723 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-26ngl_openshift-operators_bcc48a4f-2d0e-4fb9-98d7-af5958403a01_0(f37e40a042e6f16f78db90d17bfd046958733ca01b15447d7346ed7a2b665b0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.182531 4723 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-26ngl_openshift-operators_bcc48a4f-2d0e-4fb9-98d7-af5958403a01_0(f37e40a042e6f16f78db90d17bfd046958733ca01b15447d7346ed7a2b665b0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl" Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.182570 4723 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-26ngl_openshift-operators_bcc48a4f-2d0e-4fb9-98d7-af5958403a01_0(f37e40a042e6f16f78db90d17bfd046958733ca01b15447d7346ed7a2b665b0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl" Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.182610 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-26ngl_openshift-operators(bcc48a4f-2d0e-4fb9-98d7-af5958403a01)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-26ngl_openshift-operators(bcc48a4f-2d0e-4fb9-98d7-af5958403a01)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-26ngl_openshift-operators_bcc48a4f-2d0e-4fb9-98d7-af5958403a01_0(f37e40a042e6f16f78db90d17bfd046958733ca01b15447d7346ed7a2b665b0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl" podUID="bcc48a4f-2d0e-4fb9-98d7-af5958403a01" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.188291 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/441fc6d3-ed2e-44b6-9e0d-d1925412eb23-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j\" (UID: \"441fc6d3-ed2e-44b6-9e0d-d1925412eb23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.188342 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wns7p\" (UniqueName: \"kubernetes.io/projected/2bc0446d-1f37-4214-bd0a-0f7c64f844a8-kube-api-access-wns7p\") pod \"observability-operator-59bdc8b94-2nqwq\" (UID: \"2bc0446d-1f37-4214-bd0a-0f7c64f844a8\") " pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.188365 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3774887-0abb-4692-a856-fb86baa11ba6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x\" (UID: \"b3774887-0abb-4692-a856-fb86baa11ba6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.188402 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3774887-0abb-4692-a856-fb86baa11ba6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x\" (UID: \"b3774887-0abb-4692-a856-fb86baa11ba6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.188451 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/441fc6d3-ed2e-44b6-9e0d-d1925412eb23-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j\" (UID: \"441fc6d3-ed2e-44b6-9e0d-d1925412eb23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.188472 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2bc0446d-1f37-4214-bd0a-0f7c64f844a8-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2nqwq\" (UID: \"2bc0446d-1f37-4214-bd0a-0f7c64f844a8\") " pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.269434 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-25fp4"] Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.270407 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-25fp4" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.272164 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-tldcv" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.289197 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3774887-0abb-4692-a856-fb86baa11ba6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x\" (UID: \"b3774887-0abb-4692-a856-fb86baa11ba6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.289264 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2bc0446d-1f37-4214-bd0a-0f7c64f844a8-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2nqwq\" (UID: \"2bc0446d-1f37-4214-bd0a-0f7c64f844a8\") " pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.289286 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/441fc6d3-ed2e-44b6-9e0d-d1925412eb23-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j\" (UID: \"441fc6d3-ed2e-44b6-9e0d-d1925412eb23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.289559 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/441fc6d3-ed2e-44b6-9e0d-d1925412eb23-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j\" (UID: \"441fc6d3-ed2e-44b6-9e0d-d1925412eb23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.289615 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wns7p\" (UniqueName: \"kubernetes.io/projected/2bc0446d-1f37-4214-bd0a-0f7c64f844a8-kube-api-access-wns7p\") pod \"observability-operator-59bdc8b94-2nqwq\" (UID: \"2bc0446d-1f37-4214-bd0a-0f7c64f844a8\") " pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" Mar 
09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.289650 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3774887-0abb-4692-a856-fb86baa11ba6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x\" (UID: \"b3774887-0abb-4692-a856-fb86baa11ba6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.292114 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/441fc6d3-ed2e-44b6-9e0d-d1925412eb23-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j\" (UID: \"441fc6d3-ed2e-44b6-9e0d-d1925412eb23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.292182 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/441fc6d3-ed2e-44b6-9e0d-d1925412eb23-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j\" (UID: \"441fc6d3-ed2e-44b6-9e0d-d1925412eb23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.295602 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3774887-0abb-4692-a856-fb86baa11ba6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x\" (UID: \"b3774887-0abb-4692-a856-fb86baa11ba6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.295934 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3774887-0abb-4692-a856-fb86baa11ba6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x\" (UID: \"b3774887-0abb-4692-a856-fb86baa11ba6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.298131 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2bc0446d-1f37-4214-bd0a-0f7c64f844a8-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2nqwq\" (UID: \"2bc0446d-1f37-4214-bd0a-0f7c64f844a8\") " pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.307773 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wns7p\" (UniqueName: \"kubernetes.io/projected/2bc0446d-1f37-4214-bd0a-0f7c64f844a8-kube-api-access-wns7p\") pod \"observability-operator-59bdc8b94-2nqwq\" (UID: \"2bc0446d-1f37-4214-bd0a-0f7c64f844a8\") " pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.357600 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.365015 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.391053 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdfqs\" (UniqueName: \"kubernetes.io/projected/5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6-kube-api-access-hdfqs\") pod \"perses-operator-5bf474d74f-25fp4\" (UID: \"5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6\") " pod="openshift-operators/perses-operator-5bf474d74f-25fp4" Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.391152 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6-openshift-service-ca\") pod \"perses-operator-5bf474d74f-25fp4\" (UID: \"5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6\") " pod="openshift-operators/perses-operator-5bf474d74f-25fp4" Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.391157 4723 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j_openshift-operators_441fc6d3-ed2e-44b6-9e0d-d1925412eb23_0(e42a090d353a39d930d4479cd03d690b51a92c14a6d149520908648f17780609): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.391243 4723 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j_openshift-operators_441fc6d3-ed2e-44b6-9e0d-d1925412eb23_0(e42a090d353a39d930d4479cd03d690b51a92c14a6d149520908648f17780609): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.391271 4723 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j_openshift-operators_441fc6d3-ed2e-44b6-9e0d-d1925412eb23_0(e42a090d353a39d930d4479cd03d690b51a92c14a6d149520908648f17780609): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.391329 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j_openshift-operators(441fc6d3-ed2e-44b6-9e0d-d1925412eb23)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j_openshift-operators(441fc6d3-ed2e-44b6-9e0d-d1925412eb23)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j_openshift-operators_441fc6d3-ed2e-44b6-9e0d-d1925412eb23_0(e42a090d353a39d930d4479cd03d690b51a92c14a6d149520908648f17780609): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" podUID="441fc6d3-ed2e-44b6-9e0d-d1925412eb23"
Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.403655 4723 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x_openshift-operators_b3774887-0abb-4692-a856-fb86baa11ba6_0(d4c4a9529d09bb5edb46e6233c29737c59dd3c77359a6dcd44e2cc46718bca05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.403732 4723 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x_openshift-operators_b3774887-0abb-4692-a856-fb86baa11ba6_0(d4c4a9529d09bb5edb46e6233c29737c59dd3c77359a6dcd44e2cc46718bca05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x"
Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.403757 4723 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x_openshift-operators_b3774887-0abb-4692-a856-fb86baa11ba6_0(d4c4a9529d09bb5edb46e6233c29737c59dd3c77359a6dcd44e2cc46718bca05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x"
Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.403821 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x_openshift-operators(b3774887-0abb-4692-a856-fb86baa11ba6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x_openshift-operators(b3774887-0abb-4692-a856-fb86baa11ba6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x_openshift-operators_b3774887-0abb-4692-a856-fb86baa11ba6_0(d4c4a9529d09bb5edb46e6233c29737c59dd3c77359a6dcd44e2cc46718bca05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x" podUID="b3774887-0abb-4692-a856-fb86baa11ba6"
Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.492776 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdfqs\" (UniqueName: \"kubernetes.io/projected/5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6-kube-api-access-hdfqs\") pod \"perses-operator-5bf474d74f-25fp4\" (UID: \"5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6\") " pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.492885 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6-openshift-service-ca\") pod \"perses-operator-5bf474d74f-25fp4\" (UID: \"5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6\") " pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.493773 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6-openshift-service-ca\") pod \"perses-operator-5bf474d74f-25fp4\" (UID: \"5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6\") " pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.509462 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdfqs\" (UniqueName: \"kubernetes.io/projected/5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6-kube-api-access-hdfqs\") pod \"perses-operator-5bf474d74f-25fp4\" (UID: \"5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6\") " pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.520081 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq"
Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.544732 4723 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-2nqwq_openshift-operators_2bc0446d-1f37-4214-bd0a-0f7c64f844a8_0(aa2c698919f002e38fb6bdf491df1c4fd340385f7a4dcaa0b39d40eb83f971ac): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.544841 4723 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-2nqwq_openshift-operators_2bc0446d-1f37-4214-bd0a-0f7c64f844a8_0(aa2c698919f002e38fb6bdf491df1c4fd340385f7a4dcaa0b39d40eb83f971ac): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq"
Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.544903 4723 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-2nqwq_openshift-operators_2bc0446d-1f37-4214-bd0a-0f7c64f844a8_0(aa2c698919f002e38fb6bdf491df1c4fd340385f7a4dcaa0b39d40eb83f971ac): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq"
Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.544972 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-2nqwq_openshift-operators(2bc0446d-1f37-4214-bd0a-0f7c64f844a8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-2nqwq_openshift-operators(2bc0446d-1f37-4214-bd0a-0f7c64f844a8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-2nqwq_openshift-operators_2bc0446d-1f37-4214-bd0a-0f7c64f844a8_0(aa2c698919f002e38fb6bdf491df1c4fd340385f7a4dcaa0b39d40eb83f971ac): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podUID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8"
Mar 09 13:10:08 crc kubenswrapper[4723]: I0309 13:10:08.583553 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.632920 4723 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-25fp4_openshift-operators_5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6_0(642daae0d6b263180f4bb4714e96baf39d6f3e3a856db81783e3565ab3032b51): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.632990 4723 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-25fp4_openshift-operators_5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6_0(642daae0d6b263180f4bb4714e96baf39d6f3e3a856db81783e3565ab3032b51): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.633024 4723 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-25fp4_openshift-operators_5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6_0(642daae0d6b263180f4bb4714e96baf39d6f3e3a856db81783e3565ab3032b51): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:08 crc kubenswrapper[4723]: E0309 13:10:08.633081 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-25fp4_openshift-operators(5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-25fp4_openshift-operators(5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-25fp4_openshift-operators_5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6_0(642daae0d6b263180f4bb4714e96baf39d6f3e3a856db81783e3565ab3032b51): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-25fp4" podUID="5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6"
Mar 09 13:10:09 crc kubenswrapper[4723]: I0309 13:10:09.974808 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" event={"ID":"babce048-2000-4f1c-af6b-4738e8fbb8a3","Type":"ContainerStarted","Data":"5ee696e1ff5acd76db3cb6d5cb01b7c568ecdb5bf07b9b7c73f2915a771830f4"}
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.008980 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" event={"ID":"babce048-2000-4f1c-af6b-4738e8fbb8a3","Type":"ContainerStarted","Data":"8161fb8f43664a4253e0d0bcc649351d47705780c860492f98affa6aa0d4dbe7"}
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.009636 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.009671 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.009682 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.032800 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.042658 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2" podStartSLOduration=8.042616344 podStartE2EDuration="8.042616344s" podCreationTimestamp="2026-03-09 13:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:10:13.036259628 +0000 UTC m=+687.050727168" watchObservedRunningTime="2026-03-09 13:10:13.042616344 +0000 UTC m=+687.057083884"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.048192 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.076930 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl"]
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.077061 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.077462 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.088911 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x"]
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.089050 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.089572 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.114592 4723 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-26ngl_openshift-operators_bcc48a4f-2d0e-4fb9-98d7-af5958403a01_0(aea336aa679a68a3833fe15bf8a1576b16d7f8db8d0d03419b0183ecd57f90b2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.114664 4723 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-26ngl_openshift-operators_bcc48a4f-2d0e-4fb9-98d7-af5958403a01_0(aea336aa679a68a3833fe15bf8a1576b16d7f8db8d0d03419b0183ecd57f90b2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.114690 4723 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-26ngl_openshift-operators_bcc48a4f-2d0e-4fb9-98d7-af5958403a01_0(aea336aa679a68a3833fe15bf8a1576b16d7f8db8d0d03419b0183ecd57f90b2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.114733 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-26ngl_openshift-operators(bcc48a4f-2d0e-4fb9-98d7-af5958403a01)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-26ngl_openshift-operators(bcc48a4f-2d0e-4fb9-98d7-af5958403a01)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-26ngl_openshift-operators_bcc48a4f-2d0e-4fb9-98d7-af5958403a01_0(aea336aa679a68a3833fe15bf8a1576b16d7f8db8d0d03419b0183ecd57f90b2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl" podUID="bcc48a4f-2d0e-4fb9-98d7-af5958403a01"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.115932 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2nqwq"]
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.116086 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.116607 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.123613 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j"]
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.123759 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.124188 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.148098 4723 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x_openshift-operators_b3774887-0abb-4692-a856-fb86baa11ba6_0(45a3b16ef66466eb79680e66704852f3d28a9592e46f645f336c9b40d7bc3288): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.148173 4723 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x_openshift-operators_b3774887-0abb-4692-a856-fb86baa11ba6_0(45a3b16ef66466eb79680e66704852f3d28a9592e46f645f336c9b40d7bc3288): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.148214 4723 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x_openshift-operators_b3774887-0abb-4692-a856-fb86baa11ba6_0(45a3b16ef66466eb79680e66704852f3d28a9592e46f645f336c9b40d7bc3288): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.148267 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x_openshift-operators(b3774887-0abb-4692-a856-fb86baa11ba6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x_openshift-operators(b3774887-0abb-4692-a856-fb86baa11ba6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x_openshift-operators_b3774887-0abb-4692-a856-fb86baa11ba6_0(45a3b16ef66466eb79680e66704852f3d28a9592e46f645f336c9b40d7bc3288): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x" podUID="b3774887-0abb-4692-a856-fb86baa11ba6"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.155778 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-25fp4"]
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.155918 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:13 crc kubenswrapper[4723]: I0309 13:10:13.156369 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.196365 4723 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-2nqwq_openshift-operators_2bc0446d-1f37-4214-bd0a-0f7c64f844a8_0(b0470f12d2dab034a10c187a47ba57d0ea03bccf0e0eca44297bdacc3b1d1120): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.196443 4723 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-2nqwq_openshift-operators_2bc0446d-1f37-4214-bd0a-0f7c64f844a8_0(b0470f12d2dab034a10c187a47ba57d0ea03bccf0e0eca44297bdacc3b1d1120): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.196487 4723 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-2nqwq_openshift-operators_2bc0446d-1f37-4214-bd0a-0f7c64f844a8_0(b0470f12d2dab034a10c187a47ba57d0ea03bccf0e0eca44297bdacc3b1d1120): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.196532 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-2nqwq_openshift-operators(2bc0446d-1f37-4214-bd0a-0f7c64f844a8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-2nqwq_openshift-operators(2bc0446d-1f37-4214-bd0a-0f7c64f844a8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-2nqwq_openshift-operators_2bc0446d-1f37-4214-bd0a-0f7c64f844a8_0(b0470f12d2dab034a10c187a47ba57d0ea03bccf0e0eca44297bdacc3b1d1120): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podUID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.197440 4723 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j_openshift-operators_441fc6d3-ed2e-44b6-9e0d-d1925412eb23_0(bef260414638a4e592e814a0757cc8a95dd7efbab30353df9b1966a8afa11c29): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.197465 4723 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j_openshift-operators_441fc6d3-ed2e-44b6-9e0d-d1925412eb23_0(bef260414638a4e592e814a0757cc8a95dd7efbab30353df9b1966a8afa11c29): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.197479 4723 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j_openshift-operators_441fc6d3-ed2e-44b6-9e0d-d1925412eb23_0(bef260414638a4e592e814a0757cc8a95dd7efbab30353df9b1966a8afa11c29): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.197513 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j_openshift-operators(441fc6d3-ed2e-44b6-9e0d-d1925412eb23)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j_openshift-operators(441fc6d3-ed2e-44b6-9e0d-d1925412eb23)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j_openshift-operators_441fc6d3-ed2e-44b6-9e0d-d1925412eb23_0(bef260414638a4e592e814a0757cc8a95dd7efbab30353df9b1966a8afa11c29): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" podUID="441fc6d3-ed2e-44b6-9e0d-d1925412eb23"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.203937 4723 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-25fp4_openshift-operators_5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6_0(c2df8e81f8c4eb77c5307c89b5016ed3b0e4c92a605d3622daa6fabd9f846008): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.204001 4723 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-25fp4_openshift-operators_5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6_0(c2df8e81f8c4eb77c5307c89b5016ed3b0e4c92a605d3622daa6fabd9f846008): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.204028 4723 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-25fp4_openshift-operators_5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6_0(c2df8e81f8c4eb77c5307c89b5016ed3b0e4c92a605d3622daa6fabd9f846008): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:13 crc kubenswrapper[4723]: E0309 13:10:13.204081 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-25fp4_openshift-operators(5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-25fp4_openshift-operators(5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-25fp4_openshift-operators_5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6_0(c2df8e81f8c4eb77c5307c89b5016ed3b0e4c92a605d3622daa6fabd9f846008): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-25fp4" podUID="5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6"
Mar 09 13:10:18 crc kubenswrapper[4723]: I0309 13:10:18.881582 4723 scope.go:117] "RemoveContainer" containerID="9e3f00295ab5c8b08630d59915b6f04285bc0f618ea72db8e5954cd6b4a21bee"
Mar 09 13:10:19 crc kubenswrapper[4723]: I0309 13:10:19.051255 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g92rf_242d3bf9-4462-4562-813a-f3548edc94fd/kube-multus/1.log"
Mar 09 13:10:19 crc kubenswrapper[4723]: I0309 13:10:19.051606 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g92rf" event={"ID":"242d3bf9-4462-4562-813a-f3548edc94fd","Type":"ContainerStarted","Data":"be7be6eee11ca3fd83214dc8eec486cea5e8c3be06ea97119120f3eb1c903982"}
Mar 09 13:10:23 crc kubenswrapper[4723]: I0309 13:10:23.880366 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq"
Mar 09 13:10:23 crc kubenswrapper[4723]: I0309 13:10:23.881404 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq"
Mar 09 13:10:24 crc kubenswrapper[4723]: I0309 13:10:24.322442 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2nqwq"]
Mar 09 13:10:24 crc kubenswrapper[4723]: W0309 13:10:24.337404 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bc0446d_1f37_4214_bd0a_0f7c64f844a8.slice/crio-9ce5b940c04446785212b8566034e3fbc2fc7fd37a0dbbb2e4b3f054cfed5196 WatchSource:0}: Error finding container 9ce5b940c04446785212b8566034e3fbc2fc7fd37a0dbbb2e4b3f054cfed5196: Status 404 returned error can't find the container with id 9ce5b940c04446785212b8566034e3fbc2fc7fd37a0dbbb2e4b3f054cfed5196
Mar 09 13:10:25 crc kubenswrapper[4723]: I0309 13:10:25.090900 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" event={"ID":"2bc0446d-1f37-4214-bd0a-0f7c64f844a8","Type":"ContainerStarted","Data":"9ce5b940c04446785212b8566034e3fbc2fc7fd37a0dbbb2e4b3f054cfed5196"}
Mar 09 13:10:25 crc kubenswrapper[4723]: I0309 13:10:25.880894 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j"
Mar 09 13:10:25 crc kubenswrapper[4723]: I0309 13:10:25.881772 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j"
Mar 09 13:10:26 crc kubenswrapper[4723]: I0309 13:10:26.420439 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j"]
Mar 09 13:10:26 crc kubenswrapper[4723]: W0309 13:10:26.428815 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod441fc6d3_ed2e_44b6_9e0d_d1925412eb23.slice/crio-602231b5c49b91058cbc44db2fe27b6be48c5f70d2c51b1f56b0daff943401d1 WatchSource:0}: Error finding container 602231b5c49b91058cbc44db2fe27b6be48c5f70d2c51b1f56b0daff943401d1: Status 404 returned error can't find the container with id 602231b5c49b91058cbc44db2fe27b6be48c5f70d2c51b1f56b0daff943401d1
Mar 09 13:10:26 crc kubenswrapper[4723]: I0309 13:10:26.886657 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl"
Mar 09 13:10:26 crc kubenswrapper[4723]: I0309 13:10:26.886986 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl"
Mar 09 13:10:27 crc kubenswrapper[4723]: I0309 13:10:27.116938 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" event={"ID":"441fc6d3-ed2e-44b6-9e0d-d1925412eb23","Type":"ContainerStarted","Data":"602231b5c49b91058cbc44db2fe27b6be48c5f70d2c51b1f56b0daff943401d1"}
Mar 09 13:10:27 crc kubenswrapper[4723]: I0309 13:10:27.314734 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl"]
Mar 09 13:10:27 crc kubenswrapper[4723]: W0309 13:10:27.322704 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc48a4f_2d0e_4fb9_98d7_af5958403a01.slice/crio-bde4e7ad877ba6625c810056fc4359de9061514e2e229133e5228e0c116c7255 WatchSource:0}: Error finding container bde4e7ad877ba6625c810056fc4359de9061514e2e229133e5228e0c116c7255: Status 404 returned error can't find the container with id bde4e7ad877ba6625c810056fc4359de9061514e2e229133e5228e0c116c7255
Mar 09 13:10:27 crc kubenswrapper[4723]: I0309 13:10:27.880732 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:27 crc kubenswrapper[4723]: I0309 13:10:27.882456 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:28 crc kubenswrapper[4723]: I0309 13:10:28.129088 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl" event={"ID":"bcc48a4f-2d0e-4fb9-98d7-af5958403a01","Type":"ContainerStarted","Data":"bde4e7ad877ba6625c810056fc4359de9061514e2e229133e5228e0c116c7255"}
Mar 09 13:10:28 crc kubenswrapper[4723]: I0309 13:10:28.362694 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-25fp4"]
Mar 09 13:10:28 crc kubenswrapper[4723]: I0309 13:10:28.882100 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x"
Mar 09 13:10:28 crc kubenswrapper[4723]: I0309 13:10:28.882523 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x"
Mar 09 13:10:32 crc kubenswrapper[4723]: I0309 13:10:32.171016 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-25fp4" event={"ID":"5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6","Type":"ContainerStarted","Data":"be54b29d8ff246ae395776609abf74db435d91aabc74ae0ff28e00ca07f09823"}
Mar 09 13:10:32 crc kubenswrapper[4723]: I0309 13:10:32.895796 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x"]
Mar 09 13:10:32 crc kubenswrapper[4723]: W0309 13:10:32.914680 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3774887_0abb_4692_a856_fb86baa11ba6.slice/crio-ed955b50fc4210a983292a6cb92d2d9d8b1ec17d89b7112de4015eaeef5b30a2 WatchSource:0}: Error finding container ed955b50fc4210a983292a6cb92d2d9d8b1ec17d89b7112de4015eaeef5b30a2: Status 404 returned error can't find the container with id ed955b50fc4210a983292a6cb92d2d9d8b1ec17d89b7112de4015eaeef5b30a2
Mar 09 13:10:33 crc kubenswrapper[4723]: I0309 13:10:33.191993 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" event={"ID":"441fc6d3-ed2e-44b6-9e0d-d1925412eb23","Type":"ContainerStarted","Data":"f2cd359ee2f8ff32284eb645035fa83938036edac7b34b26ebf310417fcb3866"}
Mar 09 13:10:33 crc kubenswrapper[4723]: I0309 13:10:33.196617 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" event={"ID":"2bc0446d-1f37-4214-bd0a-0f7c64f844a8","Type":"ContainerStarted","Data":"2c14553978a656aee48096498a7ef6cbe4de47d9224deed86c2c389a5cbe69aa"}
Mar 09 13:10:33 crc kubenswrapper[4723]: I0309 13:10:33.197079 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq"
Mar 09 13:10:33 crc kubenswrapper[4723]: I0309 13:10:33.198585 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x" event={"ID":"b3774887-0abb-4692-a856-fb86baa11ba6","Type":"ContainerStarted","Data":"826ee146868f5bcb33b2b2d30638c5d543b2dba02d85a5770be00b67092ed8c1"}
Mar 09 13:10:33 crc kubenswrapper[4723]: I0309 13:10:33.198608 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x" event={"ID":"b3774887-0abb-4692-a856-fb86baa11ba6","Type":"ContainerStarted","Data":"ed955b50fc4210a983292a6cb92d2d9d8b1ec17d89b7112de4015eaeef5b30a2"}
Mar 09 13:10:33 crc kubenswrapper[4723]: I0309 13:10:33.202956 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq"
Mar 09 13:10:33 crc kubenswrapper[4723]: I0309 13:10:33.244421 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j" podStartSLOduration=20.036123774 podStartE2EDuration="26.244404916s" podCreationTimestamp="2026-03-09 13:10:07 +0000 UTC" firstStartedPulling="2026-03-09 13:10:26.431527977 +0000 UTC m=+700.445995517" lastFinishedPulling="2026-03-09 13:10:32.639809119 +0000 UTC m=+706.654276659" observedRunningTime="2026-03-09 13:10:33.21631844 +0000 UTC m=+707.230786000" watchObservedRunningTime="2026-03-09 13:10:33.244404916 +0000 UTC m=+707.258872456"
Mar 09 13:10:33 crc kubenswrapper[4723]: I0309 13:10:33.245106 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podStartSLOduration=16.93982634 podStartE2EDuration="25.245100647s" podCreationTimestamp="2026-03-09 13:10:08 +0000 UTC" firstStartedPulling="2026-03-09 13:10:24.341556689 +0000 UTC m=+698.356024239" lastFinishedPulling="2026-03-09 13:10:32.646831006 +0000 UTC m=+706.661298546" observedRunningTime="2026-03-09 13:10:33.242459446 +0000 UTC m=+707.256926986" watchObservedRunningTime="2026-03-09 13:10:33.245100647 +0000 UTC m=+707.259568187"
Mar 09 13:10:33 crc kubenswrapper[4723]: I0309 13:10:33.265050 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x" podStartSLOduration=26.265030912 podStartE2EDuration="26.265030912s" podCreationTimestamp="2026-03-09 13:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:10:33.2630262 +0000 UTC m=+707.277493750" watchObservedRunningTime="2026-03-09 13:10:33.265030912 +0000 UTC m=+707.279498452"
Mar 09 13:10:33 crc kubenswrapper[4723]: I0309 13:10:33.947219 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:10:33 crc kubenswrapper[4723]: I0309 13:10:33.947994 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:10:35 crc kubenswrapper[4723]: I0309 13:10:35.219074 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-25fp4" event={"ID":"5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6","Type":"ContainerStarted","Data":"f9a1ee259fa0936e5629167568615b89692286d5716c2b2a4687d6b76d1f69a1"}
Mar 09 13:10:35 crc kubenswrapper[4723]: I0309 13:10:35.219236 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:35 crc kubenswrapper[4723]: I0309 13:10:35.241697 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-25fp4" podStartSLOduration=24.313412053 podStartE2EDuration="27.241669394s" podCreationTimestamp="2026-03-09 13:10:08 +0000 UTC" firstStartedPulling="2026-03-09 13:10:31.937967013 +0000 UTC m=+705.952434553" lastFinishedPulling="2026-03-09 13:10:34.866224354 +0000 UTC m=+708.880691894" observedRunningTime="2026-03-09 13:10:35.236322569 +0000 UTC m=+709.250790119" watchObservedRunningTime="2026-03-09 13:10:35.241669394 +0000 UTC m=+709.256136954"
Mar 09 13:10:36 crc kubenswrapper[4723]: I0309 13:10:36.221556 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wvmv2"
Mar 09 13:10:36 crc kubenswrapper[4723]: I0309 13:10:36.229109 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl" event={"ID":"bcc48a4f-2d0e-4fb9-98d7-af5958403a01","Type":"ContainerStarted","Data":"729f9f288488c829e57dde7aed0b0c31976ac23c5f7fb3d7bbde439262189fd1"}
Mar 09 13:10:36 crc kubenswrapper[4723]: I0309 13:10:36.277525 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-26ngl" podStartSLOduration=21.744199233 podStartE2EDuration="29.277507951s" podCreationTimestamp="2026-03-09 13:10:07 +0000 UTC" firstStartedPulling="2026-03-09 13:10:27.326188859 +0000 UTC m=+701.340656399" lastFinishedPulling="2026-03-09 13:10:34.859497567 +0000 UTC m=+708.873965117" observedRunningTime="2026-03-09 13:10:36.275498629 +0000 UTC m=+710.289966179" watchObservedRunningTime="2026-03-09 13:10:36.277507951 +0000 UTC m=+710.291975491"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.145139 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-b9zhr"]
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.147552 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b9zhr"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.153229 4723 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-t2q2c"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.153375 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.154484 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.158386 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-s7s5g"]
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.159368 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-s7s5g"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.161364 4723 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-xj8zc"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.169771 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-b9zhr"]
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.179285 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-6wj4n"]
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.180117 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-6wj4n"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.182687 4723 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-c6bnr"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.192583 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-s7s5g"]
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.203136 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-6wj4n"]
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.310239 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph4t7\" (UniqueName: \"kubernetes.io/projected/95099130-ff5e-4bed-9f9d-b53820c77540-kube-api-access-ph4t7\") pod \"cert-manager-858654f9db-s7s5g\" (UID: \"95099130-ff5e-4bed-9f9d-b53820c77540\") " pod="cert-manager/cert-manager-858654f9db-s7s5g"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.310521 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grg47\" (UniqueName: \"kubernetes.io/projected/80ed3d36-6e36-4f64-8c40-62b445173079-kube-api-access-grg47\") pod \"cert-manager-cainjector-cf98fcc89-b9zhr\" (UID: \"80ed3d36-6e36-4f64-8c40-62b445173079\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-b9zhr"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.310658 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdbfg\" (UniqueName: \"kubernetes.io/projected/7b820c49-0780-4d8d-a069-6cecf6ee0f1e-kube-api-access-zdbfg\") pod \"cert-manager-webhook-687f57d79b-6wj4n\" (UID: \"7b820c49-0780-4d8d-a069-6cecf6ee0f1e\") " pod="cert-manager/cert-manager-webhook-687f57d79b-6wj4n"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.411873 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph4t7\" (UniqueName: \"kubernetes.io/projected/95099130-ff5e-4bed-9f9d-b53820c77540-kube-api-access-ph4t7\") pod \"cert-manager-858654f9db-s7s5g\" (UID: \"95099130-ff5e-4bed-9f9d-b53820c77540\") " pod="cert-manager/cert-manager-858654f9db-s7s5g"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.412242 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grg47\" (UniqueName: \"kubernetes.io/projected/80ed3d36-6e36-4f64-8c40-62b445173079-kube-api-access-grg47\") pod \"cert-manager-cainjector-cf98fcc89-b9zhr\" (UID: \"80ed3d36-6e36-4f64-8c40-62b445173079\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-b9zhr"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.412305 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdbfg\" (UniqueName: \"kubernetes.io/projected/7b820c49-0780-4d8d-a069-6cecf6ee0f1e-kube-api-access-zdbfg\") pod \"cert-manager-webhook-687f57d79b-6wj4n\" (UID: \"7b820c49-0780-4d8d-a069-6cecf6ee0f1e\") " pod="cert-manager/cert-manager-webhook-687f57d79b-6wj4n"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.430753 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdbfg\" (UniqueName: \"kubernetes.io/projected/7b820c49-0780-4d8d-a069-6cecf6ee0f1e-kube-api-access-zdbfg\") pod \"cert-manager-webhook-687f57d79b-6wj4n\" (UID: \"7b820c49-0780-4d8d-a069-6cecf6ee0f1e\") " pod="cert-manager/cert-manager-webhook-687f57d79b-6wj4n"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.430762 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph4t7\" (UniqueName: \"kubernetes.io/projected/95099130-ff5e-4bed-9f9d-b53820c77540-kube-api-access-ph4t7\") pod \"cert-manager-858654f9db-s7s5g\" (UID: \"95099130-ff5e-4bed-9f9d-b53820c77540\") " pod="cert-manager/cert-manager-858654f9db-s7s5g"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.435155 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grg47\" (UniqueName: \"kubernetes.io/projected/80ed3d36-6e36-4f64-8c40-62b445173079-kube-api-access-grg47\") pod \"cert-manager-cainjector-cf98fcc89-b9zhr\" (UID: \"80ed3d36-6e36-4f64-8c40-62b445173079\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-b9zhr"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.466713 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b9zhr"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.480046 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-s7s5g"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.494433 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-6wj4n"
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.921642 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-b9zhr"]
Mar 09 13:10:42 crc kubenswrapper[4723]: W0309 13:10:42.923379 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80ed3d36_6e36_4f64_8c40_62b445173079.slice/crio-f32a03291897d3a189f6b3e7aaf4733786030e29782601d7d41447ef4103ae02 WatchSource:0}: Error finding container f32a03291897d3a189f6b3e7aaf4733786030e29782601d7d41447ef4103ae02: Status 404 returned error can't find the container with id f32a03291897d3a189f6b3e7aaf4733786030e29782601d7d41447ef4103ae02
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.961912 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-6wj4n"]
Mar 09 13:10:42 crc kubenswrapper[4723]: W0309 13:10:42.963840 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b820c49_0780_4d8d_a069_6cecf6ee0f1e.slice/crio-18bd25d5ec440cf4eb625e4af00547c61be25a842d1ccfeffa093774b4845641 WatchSource:0}: Error finding container 18bd25d5ec440cf4eb625e4af00547c61be25a842d1ccfeffa093774b4845641: Status 404 returned error can't find the container with id 18bd25d5ec440cf4eb625e4af00547c61be25a842d1ccfeffa093774b4845641
Mar 09 13:10:42 crc kubenswrapper[4723]: I0309 13:10:42.983059 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-s7s5g"]
Mar 09 13:10:42 crc kubenswrapper[4723]: W0309 13:10:42.995039 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95099130_ff5e_4bed_9f9d_b53820c77540.slice/crio-e5063cd69276aa3ad6d962cdc0533d55c0be91f43e86bc6dcd9baa0829eefeee WatchSource:0}: Error finding container e5063cd69276aa3ad6d962cdc0533d55c0be91f43e86bc6dcd9baa0829eefeee: Status 404 returned error can't find the container with id e5063cd69276aa3ad6d962cdc0533d55c0be91f43e86bc6dcd9baa0829eefeee
Mar 09 13:10:43 crc kubenswrapper[4723]: I0309 13:10:43.290969 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b9zhr" event={"ID":"80ed3d36-6e36-4f64-8c40-62b445173079","Type":"ContainerStarted","Data":"f32a03291897d3a189f6b3e7aaf4733786030e29782601d7d41447ef4103ae02"}
Mar 09 13:10:43 crc kubenswrapper[4723]: I0309 13:10:43.298742 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-s7s5g" event={"ID":"95099130-ff5e-4bed-9f9d-b53820c77540","Type":"ContainerStarted","Data":"e5063cd69276aa3ad6d962cdc0533d55c0be91f43e86bc6dcd9baa0829eefeee"}
Mar 09 13:10:43 crc kubenswrapper[4723]: I0309 13:10:43.309697 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-6wj4n" event={"ID":"7b820c49-0780-4d8d-a069-6cecf6ee0f1e","Type":"ContainerStarted","Data":"18bd25d5ec440cf4eb625e4af00547c61be25a842d1ccfeffa093774b4845641"}
Mar 09 13:10:47 crc kubenswrapper[4723]: I0309 13:10:47.303753 4723 scope.go:117] "RemoveContainer" containerID="8a4b67848d5f5b3107170fd3a79e6d8f7dea820bfb2924ffc63d2832a8e2c6c1"
Mar 09 13:10:47 crc kubenswrapper[4723]: I0309 13:10:47.431340 4723 scope.go:117] "RemoveContainer" containerID="3b57386b307fe14f2c250d7291c8986af7055c39429cea230334f2ec4a055b79"
Mar 09 13:10:48 crc kubenswrapper[4723]: I0309 13:10:48.349213 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-6wj4n" event={"ID":"7b820c49-0780-4d8d-a069-6cecf6ee0f1e","Type":"ContainerStarted","Data":"0f101fc9fd795bbe3979bf715cf3e3d976e1f559438c297001a5d95700954455"}
Mar 09 13:10:48 crc kubenswrapper[4723]: I0309 13:10:48.349637 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-6wj4n"
Mar 09 13:10:48 crc kubenswrapper[4723]: I0309 13:10:48.352376 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b9zhr" event={"ID":"80ed3d36-6e36-4f64-8c40-62b445173079","Type":"ContainerStarted","Data":"a398e2cc47cd36cea7647fcf9930ded93ccbcfd67c8074adef64f787e1c41d27"}
Mar 09 13:10:48 crc kubenswrapper[4723]: I0309 13:10:48.353814 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-s7s5g" event={"ID":"95099130-ff5e-4bed-9f9d-b53820c77540","Type":"ContainerStarted","Data":"4a64cfc6fd38e7e867b8d3f3c698846a5b9acbdafe4cd7c5b0f4f7d6c8d0cd83"}
Mar 09 13:10:48 crc kubenswrapper[4723]: I0309 13:10:48.383361 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-6wj4n" podStartSLOduration=1.917684661 podStartE2EDuration="6.383341127s" podCreationTimestamp="2026-03-09 13:10:42 +0000 UTC" firstStartedPulling="2026-03-09 13:10:42.966440547 +0000 UTC m=+716.980908087" lastFinishedPulling="2026-03-09 13:10:47.432097013 +0000 UTC m=+721.446564553" observedRunningTime="2026-03-09 13:10:48.365967719 +0000 UTC m=+722.380435269" watchObservedRunningTime="2026-03-09 13:10:48.383341127 +0000 UTC m=+722.397808667"
Mar 09 13:10:48 crc kubenswrapper[4723]: I0309 13:10:48.385570 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b9zhr" podStartSLOduration=1.8755924290000001 podStartE2EDuration="6.385558097s" podCreationTimestamp="2026-03-09 13:10:42 +0000 UTC" firstStartedPulling="2026-03-09 13:10:42.926481722 +0000 UTC m=+716.940949262" lastFinishedPulling="2026-03-09 13:10:47.43644739 +0000 UTC m=+721.450914930" observedRunningTime="2026-03-09 13:10:48.381376194 +0000 UTC m=+722.395843754" watchObservedRunningTime="2026-03-09 13:10:48.385558097 +0000 UTC m=+722.400025637"
Mar 09 13:10:48 crc kubenswrapper[4723]: I0309 13:10:48.408560 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-s7s5g" podStartSLOduration=1.972528517 podStartE2EDuration="6.408541975s" podCreationTimestamp="2026-03-09 13:10:42 +0000 UTC" firstStartedPulling="2026-03-09 13:10:42.999183988 +0000 UTC m=+717.013651528" lastFinishedPulling="2026-03-09 13:10:47.435197446 +0000 UTC m=+721.449664986" observedRunningTime="2026-03-09 13:10:48.404077085 +0000 UTC m=+722.418544625" watchObservedRunningTime="2026-03-09 13:10:48.408541975 +0000 UTC m=+722.423009535"
Mar 09 13:10:48 crc kubenswrapper[4723]: I0309 13:10:48.588043 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-25fp4"
Mar 09 13:10:52 crc kubenswrapper[4723]: I0309 13:10:52.497058 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-6wj4n"
Mar 09 13:11:03 crc kubenswrapper[4723]: I0309 13:11:03.947772 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:11:03 crc kubenswrapper[4723]: I0309 13:11:03.948490 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:11:03 crc kubenswrapper[4723]: I0309 13:11:03.948579 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2"
Mar 09 13:11:03 crc kubenswrapper[4723]: I0309 13:11:03.949732 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0fa72ca2b7e100c53424b0c6c728520cb30db8e9432e97e83e4d09f170a81438"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 13:11:03 crc kubenswrapper[4723]: I0309 13:11:03.949836 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://0fa72ca2b7e100c53424b0c6c728520cb30db8e9432e97e83e4d09f170a81438" gracePeriod=600
Mar 09 13:11:04 crc kubenswrapper[4723]: I0309 13:11:04.472216 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="0fa72ca2b7e100c53424b0c6c728520cb30db8e9432e97e83e4d09f170a81438" exitCode=0
Mar 09 13:11:04 crc kubenswrapper[4723]: I0309 13:11:04.472411 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"0fa72ca2b7e100c53424b0c6c728520cb30db8e9432e97e83e4d09f170a81438"}
Mar 09 13:11:04 crc kubenswrapper[4723]: I0309 13:11:04.472586 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"4d25501b3a9b23fada0109d3a471f491cc22bbb00f111c3efbddd551e1408485"}
Mar 09 13:11:04 crc kubenswrapper[4723]: I0309 13:11:04.472617 4723 scope.go:117] "RemoveContainer" containerID="d054e197559b4879e59df42a68d1c798a7c319b81a2cd49030fdbc518b252634"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.551959 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"]
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.554065 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.556689 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.567270 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"]
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.621248 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/995955a2-1d3d-4705-826b-d61bf24a1f2d-bundle\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb\" (UID: \"995955a2-1d3d-4705-826b-d61bf24a1f2d\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.621345 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t282\" (UniqueName: \"kubernetes.io/projected/995955a2-1d3d-4705-826b-d61bf24a1f2d-kube-api-access-4t282\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb\" (UID: \"995955a2-1d3d-4705-826b-d61bf24a1f2d\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.621380 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/995955a2-1d3d-4705-826b-d61bf24a1f2d-util\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb\" (UID: \"995955a2-1d3d-4705-826b-d61bf24a1f2d\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.722617 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/995955a2-1d3d-4705-826b-d61bf24a1f2d-bundle\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb\" (UID: \"995955a2-1d3d-4705-826b-d61bf24a1f2d\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.722696 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t282\" (UniqueName: \"kubernetes.io/projected/995955a2-1d3d-4705-826b-d61bf24a1f2d-kube-api-access-4t282\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb\" (UID: \"995955a2-1d3d-4705-826b-d61bf24a1f2d\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.722725 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/995955a2-1d3d-4705-826b-d61bf24a1f2d-util\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb\" (UID: \"995955a2-1d3d-4705-826b-d61bf24a1f2d\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.723232 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/995955a2-1d3d-4705-826b-d61bf24a1f2d-util\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb\" (UID: \"995955a2-1d3d-4705-826b-d61bf24a1f2d\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.723253 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/995955a2-1d3d-4705-826b-d61bf24a1f2d-bundle\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb\" (UID: \"995955a2-1d3d-4705-826b-d61bf24a1f2d\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.736922 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492"]
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.738321 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.753164 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492"]
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.754713 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t282\" (UniqueName: \"kubernetes.io/projected/995955a2-1d3d-4705-826b-d61bf24a1f2d-kube-api-access-4t282\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb\" (UID: \"995955a2-1d3d-4705-826b-d61bf24a1f2d\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.824610 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73775981-b81a-47d9-b93e-0ecf9ba86890-bundle\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492\" (UID: \"73775981-b81a-47d9-b93e-0ecf9ba86890\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.824678 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp2dr\" (UniqueName: \"kubernetes.io/projected/73775981-b81a-47d9-b93e-0ecf9ba86890-kube-api-access-kp2dr\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492\" (UID: \"73775981-b81a-47d9-b93e-0ecf9ba86890\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.824995 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73775981-b81a-47d9-b93e-0ecf9ba86890-util\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492\" (UID: \"73775981-b81a-47d9-b93e-0ecf9ba86890\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.874026 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.926537 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73775981-b81a-47d9-b93e-0ecf9ba86890-bundle\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492\" (UID: \"73775981-b81a-47d9-b93e-0ecf9ba86890\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.926587 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp2dr\" (UniqueName: \"kubernetes.io/projected/73775981-b81a-47d9-b93e-0ecf9ba86890-kube-api-access-kp2dr\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492\" (UID: \"73775981-b81a-47d9-b93e-0ecf9ba86890\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.926728 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73775981-b81a-47d9-b93e-0ecf9ba86890-util\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492\" (UID: \"73775981-b81a-47d9-b93e-0ecf9ba86890\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.927140 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73775981-b81a-47d9-b93e-0ecf9ba86890-bundle\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492\" (UID: \"73775981-b81a-47d9-b93e-0ecf9ba86890\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.927404 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73775981-b81a-47d9-b93e-0ecf9ba86890-util\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492\" (UID: \"73775981-b81a-47d9-b93e-0ecf9ba86890\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492"
Mar 09 13:11:14 crc kubenswrapper[4723]: I0309 13:11:14.946153 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp2dr\" (UniqueName: \"kubernetes.io/projected/73775981-b81a-47d9-b93e-0ecf9ba86890-kube-api-access-kp2dr\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492\" (UID: \"73775981-b81a-47d9-b93e-0ecf9ba86890\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492"
Mar 09 13:11:15 crc kubenswrapper[4723]: I0309 13:11:15.085460 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492"
Mar 09 13:11:15 crc kubenswrapper[4723]: I0309 13:11:15.216406 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"]
Mar 09 13:11:15 crc kubenswrapper[4723]: I0309 13:11:15.565468 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492"]
Mar 09 13:11:15 crc kubenswrapper[4723]: W0309 13:11:15.571525 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73775981_b81a_47d9_b93e_0ecf9ba86890.slice/crio-00a878cc3a1f6f602a43e686b693818006afb7423689a3e350b1b7f189e54344 WatchSource:0}: Error finding container 00a878cc3a1f6f602a43e686b693818006afb7423689a3e350b1b7f189e54344: Status 404 returned error can't find the container with id 00a878cc3a1f6f602a43e686b693818006afb7423689a3e350b1b7f189e54344
Mar 09 13:11:15 crc kubenswrapper[4723]: I0309 13:11:15.599144 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492" event={"ID":"73775981-b81a-47d9-b93e-0ecf9ba86890","Type":"ContainerStarted","Data":"00a878cc3a1f6f602a43e686b693818006afb7423689a3e350b1b7f189e54344"}
Mar 09 13:11:15 crc kubenswrapper[4723]: I0309 13:11:15.601686 4723 generic.go:334] "Generic (PLEG): container finished" podID="995955a2-1d3d-4705-826b-d61bf24a1f2d" containerID="99ad64bc8a4020406e2f7105320ef0b0b36c435fda568c2e2e7b2ae60575f2cc" exitCode=0
Mar 09 13:11:15 crc kubenswrapper[4723]: I0309 13:11:15.601777 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb" event={"ID":"995955a2-1d3d-4705-826b-d61bf24a1f2d","Type":"ContainerDied","Data":"99ad64bc8a4020406e2f7105320ef0b0b36c435fda568c2e2e7b2ae60575f2cc"}
Mar 09 13:11:15 crc kubenswrapper[4723]: I0309 13:11:15.601811 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb" event={"ID":"995955a2-1d3d-4705-826b-d61bf24a1f2d","Type":"ContainerStarted","Data":"47f967be37f3038810a82a86b25dd616041eafb7672f19d11d47ff098fba1339"}
Mar 09 13:11:16 crc kubenswrapper[4723]: I0309 13:11:16.610297 4723 generic.go:334] "Generic (PLEG): container finished" podID="73775981-b81a-47d9-b93e-0ecf9ba86890" containerID="6389dda19f625c5d40950be75043d26fe6d07191fd98e1f5b6e229e69fa0b385" exitCode=0
Mar 09 13:11:16 crc kubenswrapper[4723]: I0309 13:11:16.610414 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492" event={"ID":"73775981-b81a-47d9-b93e-0ecf9ba86890","Type":"ContainerDied","Data":"6389dda19f625c5d40950be75043d26fe6d07191fd98e1f5b6e229e69fa0b385"}
Mar 09 13:11:17 crc kubenswrapper[4723]: I0309 13:11:17.621992 4723 generic.go:334] "Generic (PLEG): container finished" podID="995955a2-1d3d-4705-826b-d61bf24a1f2d" containerID="124e06a3a208871a3e09c14ed4c5e1b53c916193ce66677a2d82e343b690d5cc" exitCode=0
Mar 09 13:11:17 crc kubenswrapper[4723]: I0309 13:11:17.622128 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb"
event={"ID":"995955a2-1d3d-4705-826b-d61bf24a1f2d","Type":"ContainerDied","Data":"124e06a3a208871a3e09c14ed4c5e1b53c916193ce66677a2d82e343b690d5cc"} Mar 09 13:11:18 crc kubenswrapper[4723]: I0309 13:11:18.633088 4723 generic.go:334] "Generic (PLEG): container finished" podID="73775981-b81a-47d9-b93e-0ecf9ba86890" containerID="3ddb57df602a245829d509857da21f6bd0525accc963457268190b1da4f9174c" exitCode=0 Mar 09 13:11:18 crc kubenswrapper[4723]: I0309 13:11:18.633202 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492" event={"ID":"73775981-b81a-47d9-b93e-0ecf9ba86890","Type":"ContainerDied","Data":"3ddb57df602a245829d509857da21f6bd0525accc963457268190b1da4f9174c"} Mar 09 13:11:18 crc kubenswrapper[4723]: I0309 13:11:18.637681 4723 generic.go:334] "Generic (PLEG): container finished" podID="995955a2-1d3d-4705-826b-d61bf24a1f2d" containerID="cf93851c6603c65da510e46a97a73cd9382bb4c6bacd8b1a64ca364d87f34d1b" exitCode=0 Mar 09 13:11:18 crc kubenswrapper[4723]: I0309 13:11:18.637711 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb" event={"ID":"995955a2-1d3d-4705-826b-d61bf24a1f2d","Type":"ContainerDied","Data":"cf93851c6603c65da510e46a97a73cd9382bb4c6bacd8b1a64ca364d87f34d1b"} Mar 09 13:11:19 crc kubenswrapper[4723]: I0309 13:11:19.646899 4723 generic.go:334] "Generic (PLEG): container finished" podID="73775981-b81a-47d9-b93e-0ecf9ba86890" containerID="7d3e1050689bda5e88e591cd613ffda0b882d4af9d6e4e9b15c689d2f7c06d16" exitCode=0 Mar 09 13:11:19 crc kubenswrapper[4723]: I0309 13:11:19.646989 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492" event={"ID":"73775981-b81a-47d9-b93e-0ecf9ba86890","Type":"ContainerDied","Data":"7d3e1050689bda5e88e591cd613ffda0b882d4af9d6e4e9b15c689d2f7c06d16"} Mar 09 13:11:19 crc kubenswrapper[4723]: I0309 13:11:19.938364 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb" Mar 09 13:11:20 crc kubenswrapper[4723]: I0309 13:11:20.008939 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/995955a2-1d3d-4705-826b-d61bf24a1f2d-bundle\") pod \"995955a2-1d3d-4705-826b-d61bf24a1f2d\" (UID: \"995955a2-1d3d-4705-826b-d61bf24a1f2d\") " Mar 09 13:11:20 crc kubenswrapper[4723]: I0309 13:11:20.008990 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/995955a2-1d3d-4705-826b-d61bf24a1f2d-util\") pod \"995955a2-1d3d-4705-826b-d61bf24a1f2d\" (UID: \"995955a2-1d3d-4705-826b-d61bf24a1f2d\") " Mar 09 13:11:20 crc kubenswrapper[4723]: I0309 13:11:20.009082 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t282\" (UniqueName: \"kubernetes.io/projected/995955a2-1d3d-4705-826b-d61bf24a1f2d-kube-api-access-4t282\") pod \"995955a2-1d3d-4705-826b-d61bf24a1f2d\" (UID: \"995955a2-1d3d-4705-826b-d61bf24a1f2d\") " Mar 09 13:11:20 crc kubenswrapper[4723]: I0309 13:11:20.009909 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/995955a2-1d3d-4705-826b-d61bf24a1f2d-bundle" (OuterVolumeSpecName: "bundle") pod "995955a2-1d3d-4705-826b-d61bf24a1f2d" (UID: "995955a2-1d3d-4705-826b-d61bf24a1f2d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:11:20 crc kubenswrapper[4723]: I0309 13:11:20.014971 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/995955a2-1d3d-4705-826b-d61bf24a1f2d-kube-api-access-4t282" (OuterVolumeSpecName: "kube-api-access-4t282") pod "995955a2-1d3d-4705-826b-d61bf24a1f2d" (UID: "995955a2-1d3d-4705-826b-d61bf24a1f2d"). InnerVolumeSpecName "kube-api-access-4t282". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:11:20 crc kubenswrapper[4723]: I0309 13:11:20.027781 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/995955a2-1d3d-4705-826b-d61bf24a1f2d-util" (OuterVolumeSpecName: "util") pod "995955a2-1d3d-4705-826b-d61bf24a1f2d" (UID: "995955a2-1d3d-4705-826b-d61bf24a1f2d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:11:20 crc kubenswrapper[4723]: I0309 13:11:20.111092 4723 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/995955a2-1d3d-4705-826b-d61bf24a1f2d-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:11:20 crc kubenswrapper[4723]: I0309 13:11:20.111427 4723 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/995955a2-1d3d-4705-826b-d61bf24a1f2d-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:11:20 crc kubenswrapper[4723]: I0309 13:11:20.111530 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t282\" (UniqueName: \"kubernetes.io/projected/995955a2-1d3d-4705-826b-d61bf24a1f2d-kube-api-access-4t282\") on node \"crc\" DevicePath \"\"" Mar 09 13:11:20 crc kubenswrapper[4723]: I0309 13:11:20.657741 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb" event={"ID":"995955a2-1d3d-4705-826b-d61bf24a1f2d","Type":"ContainerDied","Data":"47f967be37f3038810a82a86b25dd616041eafb7672f19d11d47ff098fba1339"} Mar 09 13:11:20 crc kubenswrapper[4723]: I0309 13:11:20.657779 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb" Mar 09 13:11:20 crc kubenswrapper[4723]: I0309 13:11:20.657795 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47f967be37f3038810a82a86b25dd616041eafb7672f19d11d47ff098fba1339" Mar 09 13:11:20 crc kubenswrapper[4723]: I0309 13:11:20.964235 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492" Mar 09 13:11:21 crc kubenswrapper[4723]: I0309 13:11:21.024250 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73775981-b81a-47d9-b93e-0ecf9ba86890-bundle\") pod \"73775981-b81a-47d9-b93e-0ecf9ba86890\" (UID: \"73775981-b81a-47d9-b93e-0ecf9ba86890\") " Mar 09 13:11:21 crc kubenswrapper[4723]: I0309 13:11:21.024446 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp2dr\" (UniqueName: \"kubernetes.io/projected/73775981-b81a-47d9-b93e-0ecf9ba86890-kube-api-access-kp2dr\") pod \"73775981-b81a-47d9-b93e-0ecf9ba86890\" (UID: \"73775981-b81a-47d9-b93e-0ecf9ba86890\") " Mar 09 13:11:21 crc kubenswrapper[4723]: I0309 13:11:21.024497 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73775981-b81a-47d9-b93e-0ecf9ba86890-util\") pod \"73775981-b81a-47d9-b93e-0ecf9ba86890\" (UID: \"73775981-b81a-47d9-b93e-0ecf9ba86890\") " Mar 09 13:11:21 crc kubenswrapper[4723]: I0309 13:11:21.026604 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73775981-b81a-47d9-b93e-0ecf9ba86890-bundle" (OuterVolumeSpecName: "bundle") pod "73775981-b81a-47d9-b93e-0ecf9ba86890" (UID: "73775981-b81a-47d9-b93e-0ecf9ba86890"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:11:21 crc kubenswrapper[4723]: I0309 13:11:21.029364 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73775981-b81a-47d9-b93e-0ecf9ba86890-kube-api-access-kp2dr" (OuterVolumeSpecName: "kube-api-access-kp2dr") pod "73775981-b81a-47d9-b93e-0ecf9ba86890" (UID: "73775981-b81a-47d9-b93e-0ecf9ba86890"). InnerVolumeSpecName "kube-api-access-kp2dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:11:21 crc kubenswrapper[4723]: I0309 13:11:21.126451 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp2dr\" (UniqueName: \"kubernetes.io/projected/73775981-b81a-47d9-b93e-0ecf9ba86890-kube-api-access-kp2dr\") on node \"crc\" DevicePath \"\"" Mar 09 13:11:21 crc kubenswrapper[4723]: I0309 13:11:21.126481 4723 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73775981-b81a-47d9-b93e-0ecf9ba86890-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:11:21 crc kubenswrapper[4723]: I0309 13:11:21.342320 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73775981-b81a-47d9-b93e-0ecf9ba86890-util" (OuterVolumeSpecName: "util") pod "73775981-b81a-47d9-b93e-0ecf9ba86890" (UID: "73775981-b81a-47d9-b93e-0ecf9ba86890"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:11:21 crc kubenswrapper[4723]: I0309 13:11:21.431682 4723 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73775981-b81a-47d9-b93e-0ecf9ba86890-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:11:21 crc kubenswrapper[4723]: I0309 13:11:21.668115 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492" event={"ID":"73775981-b81a-47d9-b93e-0ecf9ba86890","Type":"ContainerDied","Data":"00a878cc3a1f6f602a43e686b693818006afb7423689a3e350b1b7f189e54344"} Mar 09 13:11:21 crc kubenswrapper[4723]: I0309 13:11:21.668167 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00a878cc3a1f6f602a43e686b693818006afb7423689a3e350b1b7f189e54344" Mar 09 13:11:21 crc kubenswrapper[4723]: I0309 13:11:21.668903 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.775061 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x"] Mar 09 13:11:32 crc kubenswrapper[4723]: E0309 13:11:32.776017 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73775981-b81a-47d9-b93e-0ecf9ba86890" containerName="extract" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.776038 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="73775981-b81a-47d9-b93e-0ecf9ba86890" containerName="extract" Mar 09 13:11:32 crc kubenswrapper[4723]: E0309 13:11:32.776073 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="995955a2-1d3d-4705-826b-d61bf24a1f2d" containerName="pull" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.776083 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="995955a2-1d3d-4705-826b-d61bf24a1f2d" containerName="pull" Mar 09 13:11:32 crc kubenswrapper[4723]: E0309 13:11:32.776100 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="995955a2-1d3d-4705-826b-d61bf24a1f2d" containerName="util" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.776111 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="995955a2-1d3d-4705-826b-d61bf24a1f2d" containerName="util" Mar 09 13:11:32 crc kubenswrapper[4723]: E0309 13:11:32.776126 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73775981-b81a-47d9-b93e-0ecf9ba86890" containerName="pull" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.776137 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="73775981-b81a-47d9-b93e-0ecf9ba86890" containerName="pull" Mar 09 13:11:32 crc kubenswrapper[4723]: E0309 13:11:32.776148 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73775981-b81a-47d9-b93e-0ecf9ba86890" containerName="util" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.776157 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="73775981-b81a-47d9-b93e-0ecf9ba86890" containerName="util" Mar 09 13:11:32 crc kubenswrapper[4723]: E0309 13:11:32.776178 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="995955a2-1d3d-4705-826b-d61bf24a1f2d" containerName="extract" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.776188 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="995955a2-1d3d-4705-826b-d61bf24a1f2d" containerName="extract" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.776368 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="995955a2-1d3d-4705-826b-d61bf24a1f2d" containerName="extract" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.776402 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="73775981-b81a-47d9-b93e-0ecf9ba86890" containerName="extract" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.777500 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.781032 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.781524 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.781778 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.782062 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-nzlrj" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.782279 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.782498 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.792211 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x"] Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.905684 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-webhook-cert\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.905752 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6snd\" (UniqueName: \"kubernetes.io/projected/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-kube-api-access-l6snd\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.905852 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.906057 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-apiservice-cert\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:32 crc kubenswrapper[4723]: I0309 13:11:32.906135 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" 
(UniqueName: \"kubernetes.io/configmap/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-manager-config\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:33 crc kubenswrapper[4723]: I0309 13:11:33.006746 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-webhook-cert\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:33 crc kubenswrapper[4723]: I0309 13:11:33.006795 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6snd\" (UniqueName: \"kubernetes.io/projected/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-kube-api-access-l6snd\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:33 crc kubenswrapper[4723]: I0309 13:11:33.006851 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:33 crc kubenswrapper[4723]: I0309 13:11:33.006908 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-apiservice-cert\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:33 crc kubenswrapper[4723]: I0309 13:11:33.006931 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-manager-config\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:33 crc kubenswrapper[4723]: I0309 13:11:33.007729 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-manager-config\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:33 crc kubenswrapper[4723]: I0309 13:11:33.024602 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-apiservice-cert\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:33 crc kubenswrapper[4723]: I0309 13:11:33.024644 4723 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:33 crc kubenswrapper[4723]: I0309 13:11:33.027316 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6snd\" (UniqueName: \"kubernetes.io/projected/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-kube-api-access-l6snd\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:33 crc kubenswrapper[4723]: I0309 13:11:33.029172 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a4427e9d-2cc9-4cec-acf7-7bbcc1c91582-webhook-cert\") pod \"loki-operator-controller-manager-856bf85654-nsk4x\" (UID: \"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582\") " pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:33 crc kubenswrapper[4723]: I0309 13:11:33.094752 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:33 crc kubenswrapper[4723]: I0309 13:11:33.501693 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x"] Mar 09 13:11:33 crc kubenswrapper[4723]: I0309 13:11:33.759097 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" event={"ID":"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582","Type":"ContainerStarted","Data":"f8068ec1a8eefa21973f30712a2a4d27324b1f1b606495241e17890a658c166e"} Mar 09 13:11:34 crc kubenswrapper[4723]: I0309 13:11:34.526687 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-c769fd969-hr28t"] Mar 09 13:11:34 crc kubenswrapper[4723]: I0309 13:11:34.528633 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-c769fd969-hr28t" Mar 09 13:11:34 crc kubenswrapper[4723]: I0309 13:11:34.531498 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-mgjh9" Mar 09 13:11:34 crc kubenswrapper[4723]: I0309 13:11:34.532793 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Mar 09 13:11:34 crc kubenswrapper[4723]: I0309 13:11:34.534219 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Mar 09 13:11:34 crc kubenswrapper[4723]: I0309 13:11:34.555083 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-c769fd969-hr28t"] Mar 09 13:11:34 crc kubenswrapper[4723]: I0309 13:11:34.642802 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvspx\" (UniqueName: \"kubernetes.io/projected/f671edd2-126e-4037-b17c-0d707e2a01e3-kube-api-access-nvspx\") pod \"cluster-logging-operator-c769fd969-hr28t\" (UID: \"f671edd2-126e-4037-b17c-0d707e2a01e3\") " pod="openshift-logging/cluster-logging-operator-c769fd969-hr28t" Mar 09 13:11:34 crc kubenswrapper[4723]: I0309 13:11:34.744295 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvspx\" (UniqueName: \"kubernetes.io/projected/f671edd2-126e-4037-b17c-0d707e2a01e3-kube-api-access-nvspx\") pod \"cluster-logging-operator-c769fd969-hr28t\" (UID: \"f671edd2-126e-4037-b17c-0d707e2a01e3\") " pod="openshift-logging/cluster-logging-operator-c769fd969-hr28t" Mar 09 13:11:34 crc kubenswrapper[4723]: I0309 13:11:34.773812 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvspx\" (UniqueName: \"kubernetes.io/projected/f671edd2-126e-4037-b17c-0d707e2a01e3-kube-api-access-nvspx\") pod \"cluster-logging-operator-c769fd969-hr28t\" (UID: \"f671edd2-126e-4037-b17c-0d707e2a01e3\") " pod="openshift-logging/cluster-logging-operator-c769fd969-hr28t" Mar 09 13:11:34 crc kubenswrapper[4723]: I0309 13:11:34.847892 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-c769fd969-hr28t" Mar 09 13:11:35 crc kubenswrapper[4723]: I0309 13:11:35.149701 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-c769fd969-hr28t"] Mar 09 13:11:35 crc kubenswrapper[4723]: I0309 13:11:35.774582 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-c769fd969-hr28t" event={"ID":"f671edd2-126e-4037-b17c-0d707e2a01e3","Type":"ContainerStarted","Data":"f9a07be9b4b1d37016543cd8f7acde3edbcf3c045accec6ed18b73d3f23efbc5"} Mar 09 13:11:38 crc kubenswrapper[4723]: I0309 13:11:38.801446 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" event={"ID":"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582","Type":"ContainerStarted","Data":"101d5c71822250260d61b91d6a1651aa5acf07be0719dba943951e29b0d42cfe"} Mar 09 13:11:45 crc kubenswrapper[4723]: I0309 13:11:45.865272 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" event={"ID":"a4427e9d-2cc9-4cec-acf7-7bbcc1c91582","Type":"ContainerStarted","Data":"961a3a00dbcac5638bc208209ecf84cec4b7b7f2d4eaa6ef57fb7f09ede5746c"} Mar 09 13:11:45 crc kubenswrapper[4723]: I0309 13:11:45.865567 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:45 crc kubenswrapper[4723]: I0309 13:11:45.869610 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" Mar 09 13:11:45 crc kubenswrapper[4723]: I0309 13:11:45.870993 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-c769fd969-hr28t" event={"ID":"f671edd2-126e-4037-b17c-0d707e2a01e3","Type":"ContainerStarted","Data":"d207a33cf645734c7a43d7cdde12e070ee444210fd6243ab4e749258c1830429"} Mar 09 13:11:45 crc kubenswrapper[4723]: I0309 13:11:45.893524 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" podStartSLOduration=1.9780285549999999 podStartE2EDuration="13.893498978s" podCreationTimestamp="2026-03-09 13:11:32 +0000 UTC" firstStartedPulling="2026-03-09 13:11:33.512031109 +0000 UTC m=+767.526498649" lastFinishedPulling="2026-03-09 13:11:45.427501532 +0000 UTC m=+779.441969072" observedRunningTime="2026-03-09 13:11:45.886654475 +0000 UTC m=+779.901122015" watchObservedRunningTime="2026-03-09 13:11:45.893498978 +0000 UTC m=+779.907966518" Mar 09 13:11:45 crc kubenswrapper[4723]: I0309 13:11:45.913373 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-c769fd969-hr28t" podStartSLOduration=1.7988788009999999 podStartE2EDuration="11.913350708s" podCreationTimestamp="2026-03-09 13:11:34 +0000 UTC" firstStartedPulling="2026-03-09 13:11:35.214960388 +0000 UTC m=+769.229427928" lastFinishedPulling="2026-03-09 13:11:45.329432295 +0000 UTC m=+779.343899835" observedRunningTime="2026-03-09 13:11:45.907216694 +0000 UTC m=+779.921684254" watchObservedRunningTime="2026-03-09 13:11:45.913350708 +0000 UTC m=+779.927818268" Mar 09 13:11:50 crc kubenswrapper[4723]: I0309 13:11:50.957499 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Mar 09 
13:11:50 crc kubenswrapper[4723]: I0309 13:11:50.958801 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Mar 09 13:11:50 crc kubenswrapper[4723]: I0309 13:11:50.961061 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Mar 09 13:11:50 crc kubenswrapper[4723]: I0309 13:11:50.961558 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Mar 09 13:11:50 crc kubenswrapper[4723]: I0309 13:11:50.972303 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 09 13:11:51 crc kubenswrapper[4723]: I0309 13:11:51.099248 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6l5d\" (UniqueName: \"kubernetes.io/projected/72fd0648-94d1-4c11-b3a2-b4dafb36cc21-kube-api-access-w6l5d\") pod \"minio\" (UID: \"72fd0648-94d1-4c11-b3a2-b4dafb36cc21\") " pod="minio-dev/minio" Mar 09 13:11:51 crc kubenswrapper[4723]: I0309 13:11:51.099336 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fa38b80a-0b55-474b-b8ac-32538177bdf7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa38b80a-0b55-474b-b8ac-32538177bdf7\") pod \"minio\" (UID: \"72fd0648-94d1-4c11-b3a2-b4dafb36cc21\") " pod="minio-dev/minio" Mar 09 13:11:51 crc kubenswrapper[4723]: I0309 13:11:51.200493 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fa38b80a-0b55-474b-b8ac-32538177bdf7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa38b80a-0b55-474b-b8ac-32538177bdf7\") pod \"minio\" (UID: \"72fd0648-94d1-4c11-b3a2-b4dafb36cc21\") " pod="minio-dev/minio" Mar 09 13:11:51 crc kubenswrapper[4723]: I0309 13:11:51.200649 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6l5d\" (UniqueName: \"kubernetes.io/projected/72fd0648-94d1-4c11-b3a2-b4dafb36cc21-kube-api-access-w6l5d\") pod \"minio\" (UID: \"72fd0648-94d1-4c11-b3a2-b4dafb36cc21\") " pod="minio-dev/minio" Mar 09 13:11:51 crc kubenswrapper[4723]: I0309 13:11:51.203026 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:11:51 crc kubenswrapper[4723]: I0309 13:11:51.203059 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fa38b80a-0b55-474b-b8ac-32538177bdf7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa38b80a-0b55-474b-b8ac-32538177bdf7\") pod \"minio\" (UID: \"72fd0648-94d1-4c11-b3a2-b4dafb36cc21\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7ea3e0f75672c4b44519ae74882214d7ee01111faa2cb3e78f55a6279809a8d8/globalmount\"" pod="minio-dev/minio" Mar 09 13:11:51 crc kubenswrapper[4723]: I0309 13:11:51.218309 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6l5d\" (UniqueName: \"kubernetes.io/projected/72fd0648-94d1-4c11-b3a2-b4dafb36cc21-kube-api-access-w6l5d\") pod \"minio\" (UID: \"72fd0648-94d1-4c11-b3a2-b4dafb36cc21\") " pod="minio-dev/minio" Mar 09 13:11:51 crc kubenswrapper[4723]: I0309 13:11:51.232310 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fa38b80a-0b55-474b-b8ac-32538177bdf7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa38b80a-0b55-474b-b8ac-32538177bdf7\") pod \"minio\" (UID: \"72fd0648-94d1-4c11-b3a2-b4dafb36cc21\") " pod="minio-dev/minio" Mar 09 13:11:51 crc kubenswrapper[4723]: I0309 13:11:51.276108 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Mar 09 13:11:51 crc kubenswrapper[4723]: I0309 13:11:51.700133 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 09 13:11:51 crc kubenswrapper[4723]: I0309 13:11:51.906711 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"72fd0648-94d1-4c11-b3a2-b4dafb36cc21","Type":"ContainerStarted","Data":"3c49f40607836b94b84ce090ee63a1e2b702d1f85db6879625e170a6739598db"} Mar 09 13:11:54 crc kubenswrapper[4723]: I0309 13:11:54.929065 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"72fd0648-94d1-4c11-b3a2-b4dafb36cc21","Type":"ContainerStarted","Data":"c5333c114ab7873a25964094a42b40c667ac0af13f1d629a15d5be0a53c1e7ff"} Mar 09 13:11:54 crc kubenswrapper[4723]: I0309 13:11:54.945026 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=3.896057104 podStartE2EDuration="6.945004408s" podCreationTimestamp="2026-03-09 13:11:48 +0000 UTC" firstStartedPulling="2026-03-09 13:11:51.706521756 +0000 UTC m=+785.720989296" lastFinishedPulling="2026-03-09 13:11:54.75546907 +0000 UTC m=+788.769936600" observedRunningTime="2026-03-09 13:11:54.944476884 +0000 UTC m=+788.958944464" watchObservedRunningTime="2026-03-09 13:11:54.945004408 +0000 UTC m=+788.959472078" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.195455 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx"] Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.196905 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.201676 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-p8pzl" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.203301 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.203474 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.203704 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.203896 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.214154 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx"] Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.350279 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04edbd9e-fd1b-4346-97ce-adfb011720a4-config\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.350343 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/04edbd9e-fd1b-4346-97ce-adfb011720a4-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.350419 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/04edbd9e-fd1b-4346-97ce-adfb011720a4-logging-loki-distributor-http\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.350438 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04edbd9e-fd1b-4346-97ce-adfb011720a4-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.350463 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgpqz\" (UniqueName: \"kubernetes.io/projected/04edbd9e-fd1b-4346-97ce-adfb011720a4-kube-api-access-pgpqz\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.354046 4723 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-logging/logging-loki-querier-76bf7b6d45-2d54b"] Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.355408 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.360344 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.360517 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.361015 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.363000 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76bf7b6d45-2d54b"] Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.441115 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv"] Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.442452 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.447237 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.449041 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.466769 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04edbd9e-fd1b-4346-97ce-adfb011720a4-config\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.472592 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/04edbd9e-fd1b-4346-97ce-adfb011720a4-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.472665 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8b9cdd14-6347-4701-9825-1ced6362cd8c-logging-loki-s3\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.472753 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b9cdd14-6347-4701-9825-1ced6362cd8c-logging-loki-ca-bundle\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.472793 4723 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/8b9cdd14-6347-4701-9825-1ced6362cd8c-logging-loki-querier-http\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.472887 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/04edbd9e-fd1b-4346-97ce-adfb011720a4-logging-loki-distributor-http\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.472918 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04edbd9e-fd1b-4346-97ce-adfb011720a4-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.472941 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgpqz\" (UniqueName: \"kubernetes.io/projected/04edbd9e-fd1b-4346-97ce-adfb011720a4-kube-api-access-pgpqz\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.472979 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9cdd14-6347-4701-9825-1ced6362cd8c-config\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.473031 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/8b9cdd14-6347-4701-9825-1ced6362cd8c-logging-loki-querier-grpc\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.473084 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxq4x\" (UniqueName: \"kubernetes.io/projected/8b9cdd14-6347-4701-9825-1ced6362cd8c-kube-api-access-bxq4x\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.488698 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04edbd9e-fd1b-4346-97ce-adfb011720a4-config\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.488793 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv"] Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.490451 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04edbd9e-fd1b-4346-97ce-adfb011720a4-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.507393 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/04edbd9e-fd1b-4346-97ce-adfb011720a4-logging-loki-distributor-http\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.507453 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgpqz\" (UniqueName: \"kubernetes.io/projected/04edbd9e-fd1b-4346-97ce-adfb011720a4-kube-api-access-pgpqz\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.508840 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/04edbd9e-fd1b-4346-97ce-adfb011720a4-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5d5548c9f5-zg9mx\" (UID: \"04edbd9e-fd1b-4346-97ce-adfb011720a4\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.521478 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.548491 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-867fb59d66-2pwh2"] Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.549658 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.552132 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.552155 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.552172 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.552143 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.553081 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.555575 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-867fb59d66-2pwh2"] Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.580071 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cd1997b-cced-41c1-8a27-77321ffc48ae-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.580148 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cd1997b-cced-41c1-8a27-77321ffc48ae-config\") pod \"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.580182 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8b9cdd14-6347-4701-9825-1ced6362cd8c-logging-loki-s3\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.580203 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psdbk\" (UniqueName: \"kubernetes.io/projected/9cd1997b-cced-41c1-8a27-77321ffc48ae-kube-api-access-psdbk\") pod \"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.580239 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b9cdd14-6347-4701-9825-1ced6362cd8c-logging-loki-ca-bundle\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.580260 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/9cd1997b-cced-41c1-8a27-77321ffc48ae-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.580283 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/8b9cdd14-6347-4701-9825-1ced6362cd8c-logging-loki-querier-http\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.580304 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/9cd1997b-cced-41c1-8a27-77321ffc48ae-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.580342 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9cdd14-6347-4701-9825-1ced6362cd8c-config\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.580367 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/8b9cdd14-6347-4701-9825-1ced6362cd8c-logging-loki-querier-grpc\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.580389 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxq4x\" (UniqueName: \"kubernetes.io/projected/8b9cdd14-6347-4701-9825-1ced6362cd8c-kube-api-access-bxq4x\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.586084 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8b9cdd14-6347-4701-9825-1ced6362cd8c-logging-loki-s3\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.586808 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b9cdd14-6347-4701-9825-1ced6362cd8c-logging-loki-ca-bundle\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.591021 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8b9cdd14-6347-4701-9825-1ced6362cd8c-config\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.592266 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/8b9cdd14-6347-4701-9825-1ced6362cd8c-logging-loki-querier-http\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.599742 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/8b9cdd14-6347-4701-9825-1ced6362cd8c-logging-loki-querier-grpc\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.609613 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxq4x\" (UniqueName: \"kubernetes.io/projected/8b9cdd14-6347-4701-9825-1ced6362cd8c-kube-api-access-bxq4x\") pod \"logging-loki-querier-76bf7b6d45-2d54b\" (UID: \"8b9cdd14-6347-4701-9825-1ced6362cd8c\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.619743 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-867fb59d66-pxpr6"] Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.621211 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.625511 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-gpczz" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.687483 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd030fd-cf38-4403-971f-4170fdc71bb0-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.687549 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/0bd030fd-cf38-4403-971f-4170fdc71bb0-lokistack-gateway\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.687603 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbd4q\" (UniqueName: \"kubernetes.io/projected/0bd030fd-cf38-4403-971f-4170fdc71bb0-kube-api-access-dbd4q\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.687654 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cd1997b-cced-41c1-8a27-77321ffc48ae-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.687699 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd030fd-cf38-4403-971f-4170fdc71bb0-logging-loki-ca-bundle\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.687724 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/0bd030fd-cf38-4403-971f-4170fdc71bb0-rbac\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.687750 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cd1997b-cced-41c1-8a27-77321ffc48ae-config\") pod \"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.687776 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psdbk\" (UniqueName: 
\"kubernetes.io/projected/9cd1997b-cced-41c1-8a27-77321ffc48ae-kube-api-access-psdbk\") pod \"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.687804 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/0bd030fd-cf38-4403-971f-4170fdc71bb0-tls-secret\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.687840 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/0bd030fd-cf38-4403-971f-4170fdc71bb0-tenants\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.687835 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-867fb59d66-pxpr6"] Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.688298 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.689213 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cd1997b-cced-41c1-8a27-77321ffc48ae-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.689755 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cd1997b-cced-41c1-8a27-77321ffc48ae-config\") pod \"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.690090 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/9cd1997b-cced-41c1-8a27-77321ffc48ae-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.690592 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/9cd1997b-cced-41c1-8a27-77321ffc48ae-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.694413 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/9cd1997b-cced-41c1-8a27-77321ffc48ae-logging-loki-query-frontend-grpc\") pod 
\"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.694503 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/0bd030fd-cf38-4403-971f-4170fdc71bb0-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.699428 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/9cd1997b-cced-41c1-8a27-77321ffc48ae-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.710352 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psdbk\" (UniqueName: \"kubernetes.io/projected/9cd1997b-cced-41c1-8a27-77321ffc48ae-kube-api-access-psdbk\") pod \"logging-loki-query-frontend-6d6859c548-4mdnv\" (UID: \"9cd1997b-cced-41c1-8a27-77321ffc48ae\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.767982 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796153 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd030fd-cf38-4403-971f-4170fdc71bb0-logging-loki-ca-bundle\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796196 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/0bd030fd-cf38-4403-971f-4170fdc71bb0-rbac\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796219 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-tls-secret\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796245 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796264 4723 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/0bd030fd-cf38-4403-971f-4170fdc71bb0-tls-secret\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796292 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/0bd030fd-cf38-4403-971f-4170fdc71bb0-tenants\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796316 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/0bd030fd-cf38-4403-971f-4170fdc71bb0-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796419 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd030fd-cf38-4403-971f-4170fdc71bb0-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796477 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-logging-loki-ca-bundle\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796505 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/0bd030fd-cf38-4403-971f-4170fdc71bb0-lokistack-gateway\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796530 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7x7m\" (UniqueName: \"kubernetes.io/projected/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-kube-api-access-d7x7m\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796565 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796655 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dbd4q\" (UniqueName: \"kubernetes.io/projected/0bd030fd-cf38-4403-971f-4170fdc71bb0-kube-api-access-dbd4q\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796707 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-lokistack-gateway\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796745 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-tenants\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.796766 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-rbac\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: E0309 13:11:59.796979 4723 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.797036 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd030fd-cf38-4403-971f-4170fdc71bb0-logging-loki-ca-bundle\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: E0309 13:11:59.797042 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bd030fd-cf38-4403-971f-4170fdc71bb0-tls-secret podName:0bd030fd-cf38-4403-971f-4170fdc71bb0 nodeName:}" failed. No retries permitted until 2026-03-09 13:12:00.297021278 +0000 UTC m=+794.311488898 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/0bd030fd-cf38-4403-971f-4170fdc71bb0-tls-secret") pod "logging-loki-gateway-867fb59d66-2pwh2" (UID: "0bd030fd-cf38-4403-971f-4170fdc71bb0") : secret "logging-loki-gateway-http" not found Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.798028 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/0bd030fd-cf38-4403-971f-4170fdc71bb0-rbac\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.798056 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd030fd-cf38-4403-971f-4170fdc71bb0-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.798092 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/0bd030fd-cf38-4403-971f-4170fdc71bb0-lokistack-gateway\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.800066 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/0bd030fd-cf38-4403-971f-4170fdc71bb0-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.802608 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/0bd030fd-cf38-4403-971f-4170fdc71bb0-tenants\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.816269 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbd4q\" (UniqueName: \"kubernetes.io/projected/0bd030fd-cf38-4403-971f-4170fdc71bb0-kube-api-access-dbd4q\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.898016 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-logging-loki-ca-bundle\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.898061 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7x7m\" (UniqueName: \"kubernetes.io/projected/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-kube-api-access-d7x7m\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: 
\"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.898081 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.898117 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-lokistack-gateway\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.898138 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-tenants\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.898157 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-rbac\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.898202 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-tls-secret\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.898221 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.899168 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.899181 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-logging-loki-ca-bundle\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.899354 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-rbac\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.899366 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-lokistack-gateway\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.901071 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.901837 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-tenants\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.902192 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-tls-secret\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.913658 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7x7m\" (UniqueName: \"kubernetes.io/projected/3dcae42d-f05a-41f1-9d6a-11ccb28eb379-kube-api-access-d7x7m\") pod \"logging-loki-gateway-867fb59d66-pxpr6\" (UID: \"3dcae42d-f05a-41f1-9d6a-11ccb28eb379\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:11:59 crc kubenswrapper[4723]: I0309 13:11:59.989229 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.035344 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx"] Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.124533 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551032-l8bn7"] Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.125700 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551032-l8bn7" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.128176 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.128459 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.128592 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.150923 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551032-l8bn7"] Mar 09 13:12:00 crc kubenswrapper[4723]: W0309 13:12:00.155141 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b9cdd14_6347_4701_9825_1ced6362cd8c.slice/crio-b8ef6427b2887c599d639ef6231285cfbdcc89f2cdddbc7d9ac79f3b98d77f0d WatchSource:0}: Error finding container b8ef6427b2887c599d639ef6231285cfbdcc89f2cdddbc7d9ac79f3b98d77f0d: Status 404 returned error can't find the container with id b8ef6427b2887c599d639ef6231285cfbdcc89f2cdddbc7d9ac79f3b98d77f0d Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.155369 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76bf7b6d45-2d54b"] Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.262422 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv"] Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.304566 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fptr9\" (UniqueName: \"kubernetes.io/projected/bfc2f049-83e8-4cbe-a04e-d02db1274094-kube-api-access-fptr9\") pod \"auto-csr-approver-29551032-l8bn7\" (UID: \"bfc2f049-83e8-4cbe-a04e-d02db1274094\") " pod="openshift-infra/auto-csr-approver-29551032-l8bn7" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.304637 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/0bd030fd-cf38-4403-971f-4170fdc71bb0-tls-secret\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.307720 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/0bd030fd-cf38-4403-971f-4170fdc71bb0-tls-secret\") pod \"logging-loki-gateway-867fb59d66-2pwh2\" (UID: \"0bd030fd-cf38-4403-971f-4170fdc71bb0\") " pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.344968 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.345747 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.347527 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.348551 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.353159 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.406072 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fptr9\" (UniqueName: \"kubernetes.io/projected/bfc2f049-83e8-4cbe-a04e-d02db1274094-kube-api-access-fptr9\") pod \"auto-csr-approver-29551032-l8bn7\" (UID: \"bfc2f049-83e8-4cbe-a04e-d02db1274094\") " pod="openshift-infra/auto-csr-approver-29551032-l8bn7" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.421508 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.422678 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.425133 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.426609 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.427374 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fptr9\" (UniqueName: \"kubernetes.io/projected/bfc2f049-83e8-4cbe-a04e-d02db1274094-kube-api-access-fptr9\") pod \"auto-csr-approver-29551032-l8bn7\" (UID: \"bfc2f049-83e8-4cbe-a04e-d02db1274094\") " pod="openshift-infra/auto-csr-approver-29551032-l8bn7" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.430068 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.447606 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551032-l8bn7" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.486585 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-867fb59d66-pxpr6"] Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.511255 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512216 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48261310-f664-41dd-9fbd-dd5a7bfc11e9-config\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512271 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512293 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfc98\" (UniqueName: \"kubernetes.io/projected/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-kube-api-access-mfc98\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512307 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512325 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-config\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512342 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v62j5\" (UniqueName: \"kubernetes.io/projected/48261310-f664-41dd-9fbd-dd5a7bfc11e9-kube-api-access-v62j5\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512360 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512376 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c8a40342-3064-4673-96c3-a922870a35fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c8a40342-3064-4673-96c3-a922870a35fb\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: 
I0309 13:12:00.512393 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512445 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5d5cac50-7101-445b-a017-2985848d6f65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d5cac50-7101-445b-a017-2985848d6f65\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512463 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48261310-f664-41dd-9fbd-dd5a7bfc11e9-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512485 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0813d08b-e069-424a-b9be-21dc42c1a3ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0813d08b-e069-424a-b9be-21dc42c1a3ad\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512499 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/48261310-f664-41dd-9fbd-dd5a7bfc11e9-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512516 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/48261310-f664-41dd-9fbd-dd5a7bfc11e9-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512533 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512555 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.512570 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/48261310-f664-41dd-9fbd-dd5a7bfc11e9-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " 
pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.516830 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.517011 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.558849 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.614576 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0813d08b-e069-424a-b9be-21dc42c1a3ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0813d08b-e069-424a-b9be-21dc42c1a3ad\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.614661 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/48261310-f664-41dd-9fbd-dd5a7bfc11e9-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.614703 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.615024 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48261310-f664-41dd-9fbd-dd5a7bfc11e9-config\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.615064 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.615224 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfc98\" (UniqueName: \"kubernetes.io/projected/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-kube-api-access-mfc98\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.615386 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-config\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.615414 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v62j5\" (UniqueName: \"kubernetes.io/projected/48261310-f664-41dd-9fbd-dd5a7bfc11e9-kube-api-access-v62j5\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.615551 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/e2b63e12-eaaf-47df-93c5-cbd7effb4124-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.615585 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2b63e12-eaaf-47df-93c5-cbd7effb4124-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.617223 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5d5cac50-7101-445b-a017-2985848d6f65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d5cac50-7101-445b-a017-2985848d6f65\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.617382 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48261310-f664-41dd-9fbd-dd5a7bfc11e9-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.617512 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-config\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.617531 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48261310-f664-41dd-9fbd-dd5a7bfc11e9-config\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.617526 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/48261310-f664-41dd-9fbd-dd5a7bfc11e9-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.617607 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ktfp\" (UniqueName: \"kubernetes.io/projected/e2b63e12-eaaf-47df-93c5-cbd7effb4124-kube-api-access-7ktfp\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.617650 4723 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.617676 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/48261310-f664-41dd-9fbd-dd5a7bfc11e9-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.617729 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/e2b63e12-eaaf-47df-93c5-cbd7effb4124-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.617799 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.617822 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c8a40342-3064-4673-96c3-a922870a35fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c8a40342-3064-4673-96c3-a922870a35fb\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.617848 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5a324af2-9a7c-401a-841a-891d2bf98681\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a324af2-9a7c-401a-841a-891d2bf98681\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.617892 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/e2b63e12-eaaf-47df-93c5-cbd7effb4124-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.617979 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b63e12-eaaf-47df-93c5-cbd7effb4124-config\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.618951 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.619077 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48261310-f664-41dd-9fbd-dd5a7bfc11e9-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.622528 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.622925 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c8a40342-3064-4673-96c3-a922870a35fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c8a40342-3064-4673-96c3-a922870a35fb\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3aff517427f83168633ee0ade43794c62e3b16b6b5b359681fde5bd04a1ea541/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.622665 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.623058 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5d5cac50-7101-445b-a017-2985848d6f65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d5cac50-7101-445b-a017-2985848d6f65\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/31b886a96d8768355206c100cb3ff14a92ceadf149a635a006537c5543cc5b4f/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.627325 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.627396 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0813d08b-e069-424a-b9be-21dc42c1a3ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0813d08b-e069-424a-b9be-21dc42c1a3ad\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d6783dcc96bcd37660c118daef25e9352a5d45a13b727f8c1822375abc690b35/globalmount\"" pod="openshift-logging/logging-loki-compactor-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.627532 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.627702 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/48261310-f664-41dd-9fbd-dd5a7bfc11e9-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.629541 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/48261310-f664-41dd-9fbd-dd5a7bfc11e9-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.629766 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.630939 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/48261310-f664-41dd-9fbd-dd5a7bfc11e9-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.632640 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.638350 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v62j5\" (UniqueName: \"kubernetes.io/projected/48261310-f664-41dd-9fbd-dd5a7bfc11e9-kube-api-access-v62j5\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.641879 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfc98\" (UniqueName: \"kubernetes.io/projected/a3a44cb8-3d3a-4462-b7fb-ae571ad70a70-kube-api-access-mfc98\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.660157 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5d5cac50-7101-445b-a017-2985848d6f65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d5cac50-7101-445b-a017-2985848d6f65\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.673387 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c8a40342-3064-4673-96c3-a922870a35fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c8a40342-3064-4673-96c3-a922870a35fb\") pod \"logging-loki-ingester-0\" (UID: \"48261310-f664-41dd-9fbd-dd5a7bfc11e9\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.706623 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0813d08b-e069-424a-b9be-21dc42c1a3ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0813d08b-e069-424a-b9be-21dc42c1a3ad\") pod \"logging-loki-compactor-0\" (UID: \"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.720027 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/e2b63e12-eaaf-47df-93c5-cbd7effb4124-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.720076 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2b63e12-eaaf-47df-93c5-cbd7effb4124-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.720147 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ktfp\" (UniqueName: \"kubernetes.io/projected/e2b63e12-eaaf-47df-93c5-cbd7effb4124-kube-api-access-7ktfp\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.720178 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/e2b63e12-eaaf-47df-93c5-cbd7effb4124-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.720219 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5a324af2-9a7c-401a-841a-891d2bf98681\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a324af2-9a7c-401a-841a-891d2bf98681\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.720244 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/e2b63e12-eaaf-47df-93c5-cbd7effb4124-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.720274 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b63e12-eaaf-47df-93c5-cbd7effb4124-config\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.721379 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b63e12-eaaf-47df-93c5-cbd7effb4124-config\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.724773 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/e2b63e12-eaaf-47df-93c5-cbd7effb4124-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.725453 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2b63e12-eaaf-47df-93c5-cbd7effb4124-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.729129 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/e2b63e12-eaaf-47df-93c5-cbd7effb4124-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.732201 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/e2b63e12-eaaf-47df-93c5-cbd7effb4124-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.733073 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.733153 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5a324af2-9a7c-401a-841a-891d2bf98681\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a324af2-9a7c-401a-841a-891d2bf98681\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/df23a8a76cfe0f237f35b6065232f301d932ddc5ed98aec7597c09d0435230ee/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.737286 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.743947 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ktfp\" (UniqueName: \"kubernetes.io/projected/e2b63e12-eaaf-47df-93c5-cbd7effb4124-kube-api-access-7ktfp\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.755324 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5a324af2-9a7c-401a-841a-891d2bf98681\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a324af2-9a7c-401a-841a-891d2bf98681\") pod \"logging-loki-index-gateway-0\" (UID: \"e2b63e12-eaaf-47df-93c5-cbd7effb4124\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.938894 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551032-l8bn7"]
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.961380 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.991324 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" event={"ID":"04edbd9e-fd1b-4346-97ce-adfb011720a4","Type":"ContainerStarted","Data":"c03989ca1c496f536c738acbc813bb442f31cc965e28645a031ac42eaeee6bfe"}
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.995126 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" event={"ID":"9cd1997b-cced-41c1-8a27-77321ffc48ae","Type":"ContainerStarted","Data":"4211f48c0be4b4d93130012e1e08e09c1978da2bb4db3b219f86389078f87723"}
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.996105 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" event={"ID":"8b9cdd14-6347-4701-9825-1ced6362cd8c","Type":"ContainerStarted","Data":"b8ef6427b2887c599d639ef6231285cfbdcc89f2cdddbc7d9ac79f3b98d77f0d"}
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.996832 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551032-l8bn7" event={"ID":"bfc2f049-83e8-4cbe-a04e-d02db1274094","Type":"ContainerStarted","Data":"964ac489cdc67f26cae0d5e95dc4decbb181c751ec51e553f5f7313f642613dc"}
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.997178 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:00 crc kubenswrapper[4723]: I0309 13:12:00.998489 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" event={"ID":"3dcae42d-f05a-41f1-9d6a-11ccb28eb379","Type":"ContainerStarted","Data":"b4eb019be7685d535d08571062ae8f02f372764d7d49a2d9ae9a5909ea5eb4c4"}
Mar 09 13:12:01 crc kubenswrapper[4723]: I0309 13:12:01.015171 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-867fb59d66-2pwh2"]
Mar 09 13:12:01 crc kubenswrapper[4723]: I0309 13:12:01.287222 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Mar 09 13:12:01 crc kubenswrapper[4723]: W0309 13:12:01.295168 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3a44cb8_3d3a_4462_b7fb_ae571ad70a70.slice/crio-2255364e5030ee7a078241b058934bb1d90aeb73759f56c17a845137976ca07d WatchSource:0}: Error finding container 2255364e5030ee7a078241b058934bb1d90aeb73759f56c17a845137976ca07d: Status 404 returned error can't find the container with id 2255364e5030ee7a078241b058934bb1d90aeb73759f56c17a845137976ca07d
Mar 09 13:12:01 crc kubenswrapper[4723]: I0309 13:12:01.402099 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"]
Mar 09 13:12:01 crc kubenswrapper[4723]: W0309 13:12:01.406637 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2b63e12_eaaf_47df_93c5_cbd7effb4124.slice/crio-3dbbac67f277a316fd84e64d97ba473a40bc29e1e6e0b38ecf91e84bbaf1fa6d WatchSource:0}: Error finding container 3dbbac67f277a316fd84e64d97ba473a40bc29e1e6e0b38ecf91e84bbaf1fa6d: Status 404 returned error can't find the container with id 3dbbac67f277a316fd84e64d97ba473a40bc29e1e6e0b38ecf91e84bbaf1fa6d
Mar 09 13:12:01 crc kubenswrapper[4723]: I0309 13:12:01.453349 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Mar 09 13:12:01 crc kubenswrapper[4723]: W0309 13:12:01.457968 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48261310_f664_41dd_9fbd_dd5a7bfc11e9.slice/crio-ce7355405a72275b9fd59cbac7a4081fbcb07941f4529e5f750517391eb4a0bf WatchSource:0}: Error finding container ce7355405a72275b9fd59cbac7a4081fbcb07941f4529e5f750517391eb4a0bf: Status 404 returned error can't find the container with id ce7355405a72275b9fd59cbac7a4081fbcb07941f4529e5f750517391eb4a0bf
Mar 09 13:12:02 crc kubenswrapper[4723]: I0309 13:12:02.007305 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" event={"ID":"0bd030fd-cf38-4403-971f-4170fdc71bb0","Type":"ContainerStarted","Data":"1afb0c0a72acd967da80ab55a289297521ee457ac1ccaf8186d7d4fad87a6c8f"}
Mar 09 13:12:02 crc kubenswrapper[4723]: I0309 13:12:02.008554 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70","Type":"ContainerStarted","Data":"2255364e5030ee7a078241b058934bb1d90aeb73759f56c17a845137976ca07d"}
Mar 09 13:12:02 crc kubenswrapper[4723]: I0309 13:12:02.009770 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"e2b63e12-eaaf-47df-93c5-cbd7effb4124","Type":"ContainerStarted","Data":"3dbbac67f277a316fd84e64d97ba473a40bc29e1e6e0b38ecf91e84bbaf1fa6d"}
Mar 09 13:12:02 crc kubenswrapper[4723]: I0309 13:12:02.010753 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"48261310-f664-41dd-9fbd-dd5a7bfc11e9","Type":"ContainerStarted","Data":"ce7355405a72275b9fd59cbac7a4081fbcb07941f4529e5f750517391eb4a0bf"}
Mar 09 13:12:03 crc kubenswrapper[4723]: I0309 13:12:03.054532 4723 generic.go:334] "Generic (PLEG): container finished" podID="bfc2f049-83e8-4cbe-a04e-d02db1274094" containerID="50e29fcc16f3b2ad7526b05e9045d0c5bf3140d8f910d9d261087cf836a20464" exitCode=0
Mar 09 13:12:03 crc kubenswrapper[4723]: I0309 13:12:03.054591 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551032-l8bn7" event={"ID":"bfc2f049-83e8-4cbe-a04e-d02db1274094","Type":"ContainerDied","Data":"50e29fcc16f3b2ad7526b05e9045d0c5bf3140d8f910d9d261087cf836a20464"}
Mar 09 13:12:03 crc kubenswrapper[4723]: I0309 13:12:03.376393 4723 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 09 13:12:05 crc kubenswrapper[4723]: I0309 13:12:05.162369 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551032-l8bn7"
Mar 09 13:12:05 crc kubenswrapper[4723]: I0309 13:12:05.215899 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fptr9\" (UniqueName: \"kubernetes.io/projected/bfc2f049-83e8-4cbe-a04e-d02db1274094-kube-api-access-fptr9\") pod \"bfc2f049-83e8-4cbe-a04e-d02db1274094\" (UID: \"bfc2f049-83e8-4cbe-a04e-d02db1274094\") "
Mar 09 13:12:05 crc kubenswrapper[4723]: I0309 13:12:05.241099 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfc2f049-83e8-4cbe-a04e-d02db1274094-kube-api-access-fptr9" (OuterVolumeSpecName: "kube-api-access-fptr9") pod "bfc2f049-83e8-4cbe-a04e-d02db1274094" (UID: "bfc2f049-83e8-4cbe-a04e-d02db1274094"). InnerVolumeSpecName "kube-api-access-fptr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:12:05 crc kubenswrapper[4723]: I0309 13:12:05.317942 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fptr9\" (UniqueName: \"kubernetes.io/projected/bfc2f049-83e8-4cbe-a04e-d02db1274094-kube-api-access-fptr9\") on node \"crc\" DevicePath \"\""
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.092913 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" event={"ID":"8b9cdd14-6347-4701-9825-1ced6362cd8c","Type":"ContainerStarted","Data":"246ba3553e9416c1b5cca3a0246a280be55994933066d11b9e63141dbaee0655"}
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.093251 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b"
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.094949 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" event={"ID":"3dcae42d-f05a-41f1-9d6a-11ccb28eb379","Type":"ContainerStarted","Data":"a3eb7aef32694d66213b6186b503ff0618cedd757cbb302cefe3e2555e6e64aa"}
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.096541 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"a3a44cb8-3d3a-4462-b7fb-ae571ad70a70","Type":"ContainerStarted","Data":"0a24698b04ea85e0ee42f325466363e4bbf82174ddfa82dde7d535f8758b3b78"}
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.098048 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0"
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.099649 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"e2b63e12-eaaf-47df-93c5-cbd7effb4124","Type":"ContainerStarted","Data":"52ef28987437af2c4ee31499b9ffdad8d993b2771a401d9bd1c0d4eb0883c6fb"}
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.099827 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.101077 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" event={"ID":"9cd1997b-cced-41c1-8a27-77321ffc48ae","Type":"ContainerStarted","Data":"2e5876b672a25df145ee5783bed646c0e41c3fe27ad7ebf293316c55e97bf100"}
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.101201 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv"
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.106355 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551032-l8bn7" event={"ID":"bfc2f049-83e8-4cbe-a04e-d02db1274094","Type":"ContainerDied","Data":"964ac489cdc67f26cae0d5e95dc4decbb181c751ec51e553f5f7313f642613dc"}
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.106396 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551032-l8bn7"
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.106406 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="964ac489cdc67f26cae0d5e95dc4decbb181c751ec51e553f5f7313f642613dc"
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.108566 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" event={"ID":"0bd030fd-cf38-4403-971f-4170fdc71bb0","Type":"ContainerStarted","Data":"b694b68df7616dcbf812812c352d5d9adfc584385583fcda2ab25e72b349d852"}
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.110627 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" event={"ID":"04edbd9e-fd1b-4346-97ce-adfb011720a4","Type":"ContainerStarted","Data":"8d827142ac2608ff80ef980620fc11cc7592e835068041de25e65446e3b60e46"}
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.110981 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx"
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.113292 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"48261310-f664-41dd-9fbd-dd5a7bfc11e9","Type":"ContainerStarted","Data":"1fa4ea5b922ffefa020e256ec75a4e6517ab9a08921595fb84ae36176b65301a"}
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.114265 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0"
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.132701 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" podStartSLOduration=1.971076868 podStartE2EDuration="7.132669892s" podCreationTimestamp="2026-03-09 13:11:59 +0000 UTC" firstStartedPulling="2026-03-09 13:12:00.157149518 +0000 UTC m=+794.171617058" lastFinishedPulling="2026-03-09 13:12:05.318742542 +0000 UTC m=+799.333210082" observedRunningTime="2026-03-09 13:12:06.12208603 +0000 UTC m=+800.136553580" watchObservedRunningTime="2026-03-09 13:12:06.132669892 +0000 UTC m=+800.147137522"
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.157958 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.140143117 podStartE2EDuration="7.157933427s" podCreationTimestamp="2026-03-09 13:11:59 +0000 UTC" firstStartedPulling="2026-03-09 13:12:01.297248963 +0000 UTC m=+795.311716503" lastFinishedPulling="2026-03-09 13:12:05.315039273 +0000 UTC m=+799.329506813" observedRunningTime="2026-03-09 13:12:06.157409463 +0000 UTC m=+800.171877043" watchObservedRunningTime="2026-03-09 13:12:06.157933427 +0000 UTC m=+800.172400997"
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.189640 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.302360966 podStartE2EDuration="7.189621862s" podCreationTimestamp="2026-03-09 13:11:59 +0000 UTC" firstStartedPulling="2026-03-09 13:12:01.460151291 +0000 UTC m=+795.474618831" lastFinishedPulling="2026-03-09 13:12:05.347412187 +0000 UTC m=+799.361879727" observedRunningTime="2026-03-09 13:12:06.187033073 +0000 UTC m=+800.201500613" watchObservedRunningTime="2026-03-09 13:12:06.189621862 +0000 UTC m=+800.204089402"
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.226492 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.367458804 podStartE2EDuration="7.226473516s" podCreationTimestamp="2026-03-09 13:11:59 +0000 UTC" firstStartedPulling="2026-03-09 13:12:01.408965805 +0000 UTC m=+795.423433335" lastFinishedPulling="2026-03-09 13:12:05.267980507 +0000 UTC m=+799.282448047" observedRunningTime="2026-03-09 13:12:06.212073571 +0000 UTC m=+800.226541121" watchObservedRunningTime="2026-03-09 13:12:06.226473516 +0000 UTC m=+800.240941066"
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.299029 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551026-vm6hr"]
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.319247 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551026-vm6hr"]
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.350736 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" podStartSLOduration=2.094566215 podStartE2EDuration="7.350711881s" podCreationTimestamp="2026-03-09 13:11:59 +0000 UTC" firstStartedPulling="2026-03-09 13:12:00.048761506 +0000 UTC m=+794.063229046" lastFinishedPulling="2026-03-09 13:12:05.304907172 +0000 UTC m=+799.319374712" observedRunningTime="2026-03-09 13:12:06.281003191 +0000 UTC m=+800.295470741" watchObservedRunningTime="2026-03-09 13:12:06.350711881 +0000 UTC m=+800.365179421"
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.367273 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" podStartSLOduration=2.330242234 podStartE2EDuration="7.367255623s" podCreationTimestamp="2026-03-09 13:11:59 +0000 UTC" firstStartedPulling="2026-03-09 13:12:00.26776557 +0000 UTC m=+794.282233110" lastFinishedPulling="2026-03-09 13:12:05.304778959 +0000 UTC m=+799.319246499" observedRunningTime="2026-03-09 13:12:06.301271292 +0000 UTC m=+800.315738842" watchObservedRunningTime="2026-03-09 13:12:06.367255623 +0000 UTC m=+800.381723163"
Mar 09 13:12:06 crc kubenswrapper[4723]: I0309 13:12:06.896203 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77132bc2-3cff-42ca-b132-bca14aa41733" path="/var/lib/kubelet/pods/77132bc2-3cff-42ca-b132-bca14aa41733/volumes"
Mar 09 13:12:09 crc kubenswrapper[4723]: I0309 13:12:09.144024 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" event={"ID":"0bd030fd-cf38-4403-971f-4170fdc71bb0","Type":"ContainerStarted","Data":"f2b673dc9ce8be84975971ae8b770daa3fb5f356c9e229330e8b6e29dd490213"}
Mar 09 13:12:09 crc kubenswrapper[4723]: I0309 13:12:09.144422 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2"
Mar 09 13:12:09 crc kubenswrapper[4723]: I0309 13:12:09.144446 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2"
Mar 09 13:12:09 crc kubenswrapper[4723]: I0309 13:12:09.146805 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" event={"ID":"3dcae42d-f05a-41f1-9d6a-11ccb28eb379","Type":"ContainerStarted","Data":"74ad269f26298abb80545b5f97eea6efc1a8038541ba687a1ed4df57e743e978"}
Mar 09 13:12:09 crc kubenswrapper[4723]: I0309 13:12:09.156053 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2"
Mar 09 13:12:09 crc kubenswrapper[4723]: I0309 13:12:09.160214 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2"
Mar 09 13:12:09 crc kubenswrapper[4723]: I0309 13:12:09.204359 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" podStartSLOduration=3.226106442 podStartE2EDuration="10.204291842s" podCreationTimestamp="2026-03-09 13:11:59 +0000 UTC" firstStartedPulling="2026-03-09 13:12:01.029442617 +0000 UTC m=+795.043910157" lastFinishedPulling="2026-03-09 13:12:08.007628017 +0000 UTC m=+802.022095557" observedRunningTime="2026-03-09 13:12:09.170392497 +0000 UTC m=+803.184860057" watchObservedRunningTime="2026-03-09 13:12:09.204291842 +0000 UTC m=+803.218759412"
Mar 09 13:12:09 crc kubenswrapper[4723]: I0309 13:12:09.288510 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" podStartSLOduration=2.765862399 podStartE2EDuration="10.288494219s" podCreationTimestamp="2026-03-09 13:11:59 +0000 UTC" firstStartedPulling="2026-03-09 13:12:00.491160102 +0000 UTC m=+794.505627652" lastFinishedPulling="2026-03-09 13:12:08.013791932 +0000 UTC m=+802.028259472" observedRunningTime="2026-03-09 13:12:09.273267892 +0000 UTC m=+803.287735432" watchObservedRunningTime="2026-03-09 13:12:09.288494219 +0000 UTC m=+803.302961759"
Mar 09 13:12:09 crc kubenswrapper[4723]: I0309 13:12:09.989899 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6"
Mar 09 13:12:09 crc kubenswrapper[4723]: I0309 13:12:09.990242 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6"
Mar 09 13:12:10 crc kubenswrapper[4723]: I0309 13:12:10.004901 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6"
Mar 09 13:12:10 crc kubenswrapper[4723]: I0309 13:12:10.006289 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6"
Mar 09 13:12:20 crc kubenswrapper[4723]: I0309 13:12:20.742343 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0"
Mar 09 13:12:20 crc kubenswrapper[4723]: I0309 13:12:20.967592 4723 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens
Mar 09 13:12:20 crc kubenswrapper[4723]: I0309 13:12:20.967644 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="48261310-f664-41dd-9fbd-dd5a7bfc11e9" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 09 13:12:21 crc kubenswrapper[4723]: I0309 13:12:21.004943 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0"
Mar 09 13:12:29 crc kubenswrapper[4723]: I0309 13:12:29.527578 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx"
Mar 09 13:12:29 crc kubenswrapper[4723]: I0309 13:12:29.697760 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b"
Mar 09 13:12:29 crc kubenswrapper[4723]: I0309 13:12:29.777573 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv"
Mar 09 13:12:30 crc kubenswrapper[4723]: I0309 13:12:30.969053 4723 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens
Mar 09 13:12:30 crc kubenswrapper[4723]: I0309 13:12:30.969575 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="48261310-f664-41dd-9fbd-dd5a7bfc11e9" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 09 13:12:40 crc kubenswrapper[4723]: I0309 13:12:40.969930 4723 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready
Mar 09 13:12:40 crc kubenswrapper[4723]: I0309 13:12:40.970582 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="48261310-f664-41dd-9fbd-dd5a7bfc11e9" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 09 13:12:47 crc kubenswrapper[4723]: I0309 13:12:47.519113 4723 scope.go:117] "RemoveContainer" containerID="4e21a6980b85e71f90d18fd1058d71f241f1bb4828c79c7ce64e9be6f21a0add"
Mar 09 13:12:50 crc kubenswrapper[4723]: I0309 13:12:50.968176 4723 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready
Mar 09 13:12:50 crc kubenswrapper[4723]: I0309 13:12:50.968764 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="48261310-f664-41dd-9fbd-dd5a7bfc11e9" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 09 13:13:00 crc kubenswrapper[4723]: I0309 13:13:00.972414 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.258450 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-vl7dr"]
Mar 09 13:13:19 crc kubenswrapper[4723]: E0309 13:13:19.260433 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc2f049-83e8-4cbe-a04e-d02db1274094" containerName="oc"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.260462 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc2f049-83e8-4cbe-a04e-d02db1274094" containerName="oc"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.260637 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfc2f049-83e8-4cbe-a04e-d02db1274094" containerName="oc"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.261266 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.267658 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.268073 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-n7kgj"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.271389 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.271620 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.271754 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.273041 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.282904 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-vl7dr"]
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.316308 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-vl7dr"]
Mar 09 13:13:19 crc kubenswrapper[4723]: E0309 13:13:19.330024 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-cb4pp metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-cb4pp metrics sa-token tmp trusted-ca]: context canceled" pod="openshift-logging/collector-vl7dr" podUID="9f8fde5e-03c8-4002-995a-9a990d5e1dbf"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.335927 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-sa-token\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.335962 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-config-openshift-service-cacrt\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.336023 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-syslog-receiver\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.336076 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-entrypoint\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.336093 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-trusted-ca\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.336127 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-datadir\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.336278 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-metrics\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.336358 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb4pp\" (UniqueName: \"kubernetes.io/projected/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-kube-api-access-cb4pp\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.336388 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-tmp\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.336462 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-token\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.336517 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-config\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.437127 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb4pp\" (UniqueName: \"kubernetes.io/projected/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-kube-api-access-cb4pp\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.437181 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-tmp\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.437228 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-token\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.437261 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-config\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.437303 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-sa-token\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.437328 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-config-openshift-service-cacrt\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.437370 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-syslog-receiver\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.437400 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-entrypoint\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.437418 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-datadir\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.437439 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-trusted-ca\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.437483 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-metrics\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.438473 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-config-openshift-service-cacrt\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.438755 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-entrypoint\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: E0309 13:13:19.438853 4723 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.438927 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-datadir\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.439585 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-config\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.440053 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-trusted-ca\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: E0309 13:13:19.440159 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-syslog-receiver podName:9f8fde5e-03c8-4002-995a-9a990d5e1dbf nodeName:}" failed. No retries permitted until 2026-03-09 13:13:19.940139374 +0000 UTC m=+873.954606914 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-syslog-receiver") pod "collector-vl7dr" (UID: "9f8fde5e-03c8-4002-995a-9a990d5e1dbf") : secret "collector-syslog-receiver" not found
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.444960 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-metrics\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.446456 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-token\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.447284 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-tmp\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.457472 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-sa-token\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.460767 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb4pp\" (UniqueName: \"kubernetes.io/projected/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-kube-api-access-cb4pp\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.758119 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.774248 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.943707 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb4pp\" (UniqueName: \"kubernetes.io/projected/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-kube-api-access-cb4pp\") pod \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") "
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.944847 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-config\") pod \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") "
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.945032 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-datadir\") pod \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") "
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.945130 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-trusted-ca\") pod \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") "
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.945261 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-sa-token\") pod \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") "
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.945277 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-datadir" (OuterVolumeSpecName: "datadir") pod "9f8fde5e-03c8-4002-995a-9a990d5e1dbf" (UID: "9f8fde5e-03c8-4002-995a-9a990d5e1dbf"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.945369 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-config" (OuterVolumeSpecName: "config") pod "9f8fde5e-03c8-4002-995a-9a990d5e1dbf" (UID: "9f8fde5e-03c8-4002-995a-9a990d5e1dbf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.945388 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-metrics\") pod \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") "
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.945497 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-token\") pod \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") "
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.945585 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-config-openshift-service-cacrt\") pod \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") "
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.945639 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-tmp\") pod \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") "
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.945674 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-entrypoint\") pod \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") "
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.946205 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9f8fde5e-03c8-4002-995a-9a990d5e1dbf" (UID: "9f8fde5e-03c8-4002-995a-9a990d5e1dbf"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.946656 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "9f8fde5e-03c8-4002-995a-9a990d5e1dbf" (UID: "9f8fde5e-03c8-4002-995a-9a990d5e1dbf"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.946802 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "9f8fde5e-03c8-4002-995a-9a990d5e1dbf" (UID: "9f8fde5e-03c8-4002-995a-9a990d5e1dbf"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.947284 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-syslog-receiver\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.947401 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.947420 4723 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-datadir\") on node \"crc\" DevicePath \"\""
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.947484 4723 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.947502 4723 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\""
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.947521 4723 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-entrypoint\") on node \"crc\" DevicePath \"\""
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.950833 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-kube-api-access-cb4pp" (OuterVolumeSpecName: "kube-api-access-cb4pp") pod "9f8fde5e-03c8-4002-995a-9a990d5e1dbf" (UID: "9f8fde5e-03c8-4002-995a-9a990d5e1dbf"). InnerVolumeSpecName "kube-api-access-cb4pp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.951086 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-tmp" (OuterVolumeSpecName: "tmp") pod "9f8fde5e-03c8-4002-995a-9a990d5e1dbf" (UID: "9f8fde5e-03c8-4002-995a-9a990d5e1dbf"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.951669 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-syslog-receiver\") pod \"collector-vl7dr\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") " pod="openshift-logging/collector-vl7dr"
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.952190 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-sa-token" (OuterVolumeSpecName: "sa-token") pod "9f8fde5e-03c8-4002-995a-9a990d5e1dbf" (UID: "9f8fde5e-03c8-4002-995a-9a990d5e1dbf"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.954717 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-metrics" (OuterVolumeSpecName: "metrics") pod "9f8fde5e-03c8-4002-995a-9a990d5e1dbf" (UID: "9f8fde5e-03c8-4002-995a-9a990d5e1dbf"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:13:19 crc kubenswrapper[4723]: I0309 13:13:19.956118 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-token" (OuterVolumeSpecName: "collector-token") pod "9f8fde5e-03c8-4002-995a-9a990d5e1dbf" (UID: "9f8fde5e-03c8-4002-995a-9a990d5e1dbf"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.048295 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-syslog-receiver\") pod \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\" (UID: \"9f8fde5e-03c8-4002-995a-9a990d5e1dbf\") "
Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.048776 4723 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-sa-token\") on node \"crc\" DevicePath \"\""
Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.048805 4723 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-metrics\") on node \"crc\" DevicePath \"\""
Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.048822 4723 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-token\") on node \"crc\" DevicePath \"\""
Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.048839 4723 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-tmp\") on node \"crc\" DevicePath \"\""
Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.048856 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb4pp\" (UniqueName: \"kubernetes.io/projected/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-kube-api-access-cb4pp\") on node \"crc\" DevicePath \"\""
Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.053332 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "9f8fde5e-03c8-4002-995a-9a990d5e1dbf" (UID: "9f8fde5e-03c8-4002-995a-9a990d5e1dbf"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.151041 4723 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/9f8fde5e-03c8-4002-995a-9a990d5e1dbf-collector-syslog-receiver\") on node \"crc\" DevicePath \"\""
Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.765352 4723 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-logging/collector-vl7dr" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.834752 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-vl7dr"] Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.848331 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-vl7dr"] Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.858844 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-xkssk"] Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.859956 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.861962 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/fa952b22-aa73-49cf-b851-59e7c93de305-entrypoint\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.861996 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa952b22-aa73-49cf-b851-59e7c93de305-trusted-ca\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.862030 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/fa952b22-aa73-49cf-b851-59e7c93de305-sa-token\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.862146 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/fa952b22-aa73-49cf-b851-59e7c93de305-config-openshift-service-cacrt\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.862754 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa952b22-aa73-49cf-b851-59e7c93de305-config\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.862808 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/fa952b22-aa73-49cf-b851-59e7c93de305-datadir\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.862930 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/fa952b22-aa73-49cf-b851-59e7c93de305-collector-syslog-receiver\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.862972 4723 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/fa952b22-aa73-49cf-b851-59e7c93de305-collector-token\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.863002 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j85l4\" (UniqueName: \"kubernetes.io/projected/fa952b22-aa73-49cf-b851-59e7c93de305-kube-api-access-j85l4\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.863059 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/fa952b22-aa73-49cf-b851-59e7c93de305-metrics\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.863078 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa952b22-aa73-49cf-b851-59e7c93de305-tmp\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.864112 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.864303 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-n7kgj" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.864430 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.864577 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.864973 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.876221 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-xkssk"] Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.879709 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.891216 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f8fde5e-03c8-4002-995a-9a990d5e1dbf" path="/var/lib/kubelet/pods/9f8fde5e-03c8-4002-995a-9a990d5e1dbf/volumes" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.963536 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/fa952b22-aa73-49cf-b851-59e7c93de305-metrics\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.963584 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa952b22-aa73-49cf-b851-59e7c93de305-tmp\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " 
pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.963633 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/fa952b22-aa73-49cf-b851-59e7c93de305-entrypoint\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.963673 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa952b22-aa73-49cf-b851-59e7c93de305-trusted-ca\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.963711 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/fa952b22-aa73-49cf-b851-59e7c93de305-sa-token\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.963739 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/fa952b22-aa73-49cf-b851-59e7c93de305-config-openshift-service-cacrt\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.963774 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa952b22-aa73-49cf-b851-59e7c93de305-config\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.963828 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/fa952b22-aa73-49cf-b851-59e7c93de305-datadir\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.963901 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/fa952b22-aa73-49cf-b851-59e7c93de305-collector-syslog-receiver\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.963926 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/fa952b22-aa73-49cf-b851-59e7c93de305-collector-token\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.963952 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j85l4\" (UniqueName: \"kubernetes.io/projected/fa952b22-aa73-49cf-b851-59e7c93de305-kube-api-access-j85l4\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.964897 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/fa952b22-aa73-49cf-b851-59e7c93de305-config-openshift-service-cacrt\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.965158 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa952b22-aa73-49cf-b851-59e7c93de305-config\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.965543 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/fa952b22-aa73-49cf-b851-59e7c93de305-datadir\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.966209 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa952b22-aa73-49cf-b851-59e7c93de305-trusted-ca\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.967471 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/fa952b22-aa73-49cf-b851-59e7c93de305-entrypoint\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.967809 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa952b22-aa73-49cf-b851-59e7c93de305-tmp\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.968355 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/fa952b22-aa73-49cf-b851-59e7c93de305-metrics\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.970447 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/fa952b22-aa73-49cf-b851-59e7c93de305-collector-syslog-receiver\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.981713 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/fa952b22-aa73-49cf-b851-59e7c93de305-collector-token\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.984335 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j85l4\" (UniqueName: \"kubernetes.io/projected/fa952b22-aa73-49cf-b851-59e7c93de305-kube-api-access-j85l4\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:20 crc kubenswrapper[4723]: I0309 13:13:20.986735 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/fa952b22-aa73-49cf-b851-59e7c93de305-sa-token\") pod \"collector-xkssk\" (UID: \"fa952b22-aa73-49cf-b851-59e7c93de305\") " pod="openshift-logging/collector-xkssk" Mar 09 13:13:21 crc kubenswrapper[4723]: I0309 13:13:21.174933 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-xkssk" Mar 09 13:13:21 crc kubenswrapper[4723]: I0309 13:13:21.643296 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-xkssk"] Mar 09 13:13:21 crc kubenswrapper[4723]: I0309 13:13:21.667750 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:13:21 crc kubenswrapper[4723]: I0309 13:13:21.775524 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-xkssk" event={"ID":"fa952b22-aa73-49cf-b851-59e7c93de305","Type":"ContainerStarted","Data":"71d8d6d84429b53ee4cdfb70c84f4c56f0b0a1fe71e4931279798b25643d08a4"} Mar 09 13:13:28 crc kubenswrapper[4723]: I0309 13:13:28.845607 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-xkssk" event={"ID":"fa952b22-aa73-49cf-b851-59e7c93de305","Type":"ContainerStarted","Data":"6c30ab84f070cbd8b35924972d7484bf9b1fb16fff3d4221e5514f6a525806e8"} Mar 09 13:13:28 crc kubenswrapper[4723]: I0309 13:13:28.893122 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-xkssk" podStartSLOduration=2.344666473 podStartE2EDuration="8.893094093s" podCreationTimestamp="2026-03-09 13:13:20 +0000 UTC" firstStartedPulling="2026-03-09 13:13:21.667381687 +0000 UTC m=+875.681849227" lastFinishedPulling="2026-03-09 13:13:28.215809297 +0000 UTC m=+882.230276847" observedRunningTime="2026-03-09 13:13:28.881081132 +0000 UTC m=+882.895548732" watchObservedRunningTime="2026-03-09 13:13:28.893094093 +0000 UTC m=+882.907561673" Mar 09 13:13:33 crc kubenswrapper[4723]: I0309 13:13:33.947184 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:13:33 crc kubenswrapper[4723]: I0309 13:13:33.948980 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:13:58 crc kubenswrapper[4723]: I0309 13:13:58.928611 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p"] Mar 09 13:13:58 crc kubenswrapper[4723]: I0309 13:13:58.930550 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" Mar 09 13:13:58 crc kubenswrapper[4723]: I0309 13:13:58.933676 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 13:13:58 crc kubenswrapper[4723]: I0309 13:13:58.938575 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p"] Mar 09 13:13:59 crc kubenswrapper[4723]: I0309 13:13:59.108007 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p\" (UID: \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" Mar 09 13:13:59 crc kubenswrapper[4723]: I0309 13:13:59.108193 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p\" (UID: \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" Mar 09 13:13:59 crc kubenswrapper[4723]: I0309 13:13:59.108226 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt227\" (UniqueName: \"kubernetes.io/projected/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-kube-api-access-zt227\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p\" (UID: \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" Mar 09 13:13:59 crc kubenswrapper[4723]: I0309 13:13:59.209562 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p\" (UID: \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" Mar 09 13:13:59 crc kubenswrapper[4723]: I0309 13:13:59.209614 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt227\" (UniqueName: \"kubernetes.io/projected/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-kube-api-access-zt227\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p\" (UID: \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" Mar 09 13:13:59 crc kubenswrapper[4723]: I0309 13:13:59.209677 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p\" (UID: \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" Mar 09 13:13:59 crc kubenswrapper[4723]: I0309 13:13:59.210380 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p\" (UID: \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" Mar 09 13:13:59 crc kubenswrapper[4723]: I0309 13:13:59.210685 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p\" (UID: \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" Mar 09 13:13:59 crc kubenswrapper[4723]: I0309 13:13:59.235339 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt227\" (UniqueName: \"kubernetes.io/projected/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-kube-api-access-zt227\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p\" (UID: \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" Mar 09 13:13:59 crc kubenswrapper[4723]: I0309 13:13:59.258090 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" Mar 09 13:13:59 crc kubenswrapper[4723]: I0309 13:13:59.821306 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p"] Mar 09 13:14:00 crc kubenswrapper[4723]: I0309 13:14:00.120755 4723 generic.go:334] "Generic (PLEG): container finished" podID="f5db2d82-fe45-4c4a-a3b2-8addddad74fe" containerID="c0c9a0cf496027580184f44a68729837bb2132a71984934b66c60cc335b16317" exitCode=0 Mar 09 13:14:00 crc kubenswrapper[4723]: I0309 13:14:00.120823 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" event={"ID":"f5db2d82-fe45-4c4a-a3b2-8addddad74fe","Type":"ContainerDied","Data":"c0c9a0cf496027580184f44a68729837bb2132a71984934b66c60cc335b16317"} Mar 09 13:14:00 crc kubenswrapper[4723]: I0309 13:14:00.121246 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" event={"ID":"f5db2d82-fe45-4c4a-a3b2-8addddad74fe","Type":"ContainerStarted","Data":"c87053ad8af96b89724079d8e86f16b8f9a564f6573891d5c5223c71fd50c6a6"} Mar 09 13:14:00 crc kubenswrapper[4723]: I0309 13:14:00.140314 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551034-7c7r4"] Mar 09 13:14:00 crc kubenswrapper[4723]: I0309 13:14:00.141621 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551034-7c7r4" Mar 09 13:14:00 crc kubenswrapper[4723]: I0309 13:14:00.146920 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:14:00 crc kubenswrapper[4723]: I0309 13:14:00.146921 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:14:00 crc kubenswrapper[4723]: I0309 13:14:00.147008 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:14:00 crc kubenswrapper[4723]: I0309 13:14:00.158247 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551034-7c7r4"] Mar 09 13:14:00 crc kubenswrapper[4723]: I0309 13:14:00.326434 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4wqj\" (UniqueName: \"kubernetes.io/projected/6e328a06-9569-40c1-aef2-48e3659f74bf-kube-api-access-m4wqj\") pod \"auto-csr-approver-29551034-7c7r4\" (UID: \"6e328a06-9569-40c1-aef2-48e3659f74bf\") " pod="openshift-infra/auto-csr-approver-29551034-7c7r4" Mar 09 13:14:00 crc kubenswrapper[4723]: I0309 13:14:00.428482 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4wqj\" (UniqueName: \"kubernetes.io/projected/6e328a06-9569-40c1-aef2-48e3659f74bf-kube-api-access-m4wqj\") pod \"auto-csr-approver-29551034-7c7r4\" (UID: \"6e328a06-9569-40c1-aef2-48e3659f74bf\") " pod="openshift-infra/auto-csr-approver-29551034-7c7r4" Mar 09 13:14:00 crc kubenswrapper[4723]: I0309 13:14:00.455564 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4wqj\" (UniqueName: \"kubernetes.io/projected/6e328a06-9569-40c1-aef2-48e3659f74bf-kube-api-access-m4wqj\") pod \"auto-csr-approver-29551034-7c7r4\" (UID: \"6e328a06-9569-40c1-aef2-48e3659f74bf\") " pod="openshift-infra/auto-csr-approver-29551034-7c7r4" Mar 09 13:14:00 crc kubenswrapper[4723]: I0309 13:14:00.457744 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551034-7c7r4" Mar 09 13:14:00 crc kubenswrapper[4723]: I0309 13:14:00.924940 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551034-7c7r4"] Mar 09 13:14:01 crc kubenswrapper[4723]: I0309 13:14:01.127845 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551034-7c7r4" event={"ID":"6e328a06-9569-40c1-aef2-48e3659f74bf","Type":"ContainerStarted","Data":"513fef067f39d4c07dfb0e17469c48674a26eac58d4049d62e1ee8a3d1f26d4f"} Mar 09 13:14:01 crc kubenswrapper[4723]: I0309 13:14:01.283111 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8rqx6"] Mar 09 13:14:01 crc kubenswrapper[4723]: I0309 13:14:01.284939 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8rqx6" Mar 09 13:14:01 crc kubenswrapper[4723]: I0309 13:14:01.297312 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8rqx6"] Mar 09 13:14:01 crc kubenswrapper[4723]: I0309 13:14:01.446107 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13796d01-813c-401b-8b33-16cde7937e92-utilities\") pod \"redhat-operators-8rqx6\" (UID: \"13796d01-813c-401b-8b33-16cde7937e92\") " pod="openshift-marketplace/redhat-operators-8rqx6" Mar 09 13:14:01 crc kubenswrapper[4723]: I0309 13:14:01.446358 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bxzh\" (UniqueName: \"kubernetes.io/projected/13796d01-813c-401b-8b33-16cde7937e92-kube-api-access-5bxzh\") pod \"redhat-operators-8rqx6\" (UID: \"13796d01-813c-401b-8b33-16cde7937e92\") " pod="openshift-marketplace/redhat-operators-8rqx6" Mar 09 13:14:01 crc kubenswrapper[4723]: I0309 13:14:01.446447 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13796d01-813c-401b-8b33-16cde7937e92-catalog-content\") pod \"redhat-operators-8rqx6\" (UID: \"13796d01-813c-401b-8b33-16cde7937e92\") " pod="openshift-marketplace/redhat-operators-8rqx6" Mar 09 13:14:01 crc kubenswrapper[4723]: I0309 13:14:01.547630 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13796d01-813c-401b-8b33-16cde7937e92-catalog-content\") pod \"redhat-operators-8rqx6\" (UID: \"13796d01-813c-401b-8b33-16cde7937e92\") " pod="openshift-marketplace/redhat-operators-8rqx6" Mar 09 13:14:01 crc kubenswrapper[4723]: I0309 13:14:01.548003 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13796d01-813c-401b-8b33-16cde7937e92-utilities\") pod \"redhat-operators-8rqx6\" (UID: \"13796d01-813c-401b-8b33-16cde7937e92\") " pod="openshift-marketplace/redhat-operators-8rqx6" Mar 09 13:14:01 crc kubenswrapper[4723]: I0309 13:14:01.548123 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bxzh\" (UniqueName: \"kubernetes.io/projected/13796d01-813c-401b-8b33-16cde7937e92-kube-api-access-5bxzh\") pod \"redhat-operators-8rqx6\" (UID: \"13796d01-813c-401b-8b33-16cde7937e92\") " pod="openshift-marketplace/redhat-operators-8rqx6" Mar 09 13:14:01 crc kubenswrapper[4723]: I0309 13:14:01.548302 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13796d01-813c-401b-8b33-16cde7937e92-catalog-content\") pod \"redhat-operators-8rqx6\" (UID: \"13796d01-813c-401b-8b33-16cde7937e92\") " pod="openshift-marketplace/redhat-operators-8rqx6" Mar 09 13:14:01 crc kubenswrapper[4723]: I0309 13:14:01.548318 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13796d01-813c-401b-8b33-16cde7937e92-utilities\") pod \"redhat-operators-8rqx6\" (UID: \"13796d01-813c-401b-8b33-16cde7937e92\") " pod="openshift-marketplace/redhat-operators-8rqx6" Mar 09 13:14:01 crc kubenswrapper[4723]: I0309 13:14:01.565581 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5bxzh\" (UniqueName: \"kubernetes.io/projected/13796d01-813c-401b-8b33-16cde7937e92-kube-api-access-5bxzh\") pod \"redhat-operators-8rqx6\" (UID: \"13796d01-813c-401b-8b33-16cde7937e92\") " pod="openshift-marketplace/redhat-operators-8rqx6" Mar 09 13:14:01 crc kubenswrapper[4723]: I0309 13:14:01.602271 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8rqx6" Mar 09 13:14:02 crc kubenswrapper[4723]: I0309 13:14:02.017793 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8rqx6"] Mar 09 13:14:02 crc kubenswrapper[4723]: I0309 13:14:02.137189 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rqx6" event={"ID":"13796d01-813c-401b-8b33-16cde7937e92","Type":"ContainerStarted","Data":"7016a4296f02614ba1f382d3d4f52194730759c4b0fe6be6932f3334db4bdb81"} Mar 09 13:14:03 crc kubenswrapper[4723]: I0309 13:14:03.145198 4723 generic.go:334] "Generic (PLEG): container finished" podID="f5db2d82-fe45-4c4a-a3b2-8addddad74fe" containerID="731e75030cf6b053da563d7bb3a90977fe10d4c07bd908a15ee8e2268c5bb070" exitCode=0 Mar 09 13:14:03 crc kubenswrapper[4723]: I0309 13:14:03.145279 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" event={"ID":"f5db2d82-fe45-4c4a-a3b2-8addddad74fe","Type":"ContainerDied","Data":"731e75030cf6b053da563d7bb3a90977fe10d4c07bd908a15ee8e2268c5bb070"} Mar 09 13:14:03 crc kubenswrapper[4723]: I0309 13:14:03.147829 4723 generic.go:334] "Generic (PLEG): container finished" podID="6e328a06-9569-40c1-aef2-48e3659f74bf" containerID="9f0f6210f994342f289cc606c7a07d1c1fceb3bad25c49f5b874238e0e576a0b" exitCode=0 Mar 09 13:14:03 crc kubenswrapper[4723]: I0309 13:14:03.147904 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551034-7c7r4" event={"ID":"6e328a06-9569-40c1-aef2-48e3659f74bf","Type":"ContainerDied","Data":"9f0f6210f994342f289cc606c7a07d1c1fceb3bad25c49f5b874238e0e576a0b"} Mar 09 13:14:03 crc kubenswrapper[4723]: I0309 13:14:03.149300 4723 generic.go:334] "Generic (PLEG): container finished" podID="13796d01-813c-401b-8b33-16cde7937e92" containerID="d4e8f8d5cd12d561df39e6731e7e2a32bc17763384f1c67592aafbf34745cd7f" exitCode=0 Mar 09 13:14:03 crc kubenswrapper[4723]: I0309 13:14:03.149340 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rqx6" event={"ID":"13796d01-813c-401b-8b33-16cde7937e92","Type":"ContainerDied","Data":"d4e8f8d5cd12d561df39e6731e7e2a32bc17763384f1c67592aafbf34745cd7f"} Mar 09 13:14:03 crc kubenswrapper[4723]: I0309 13:14:03.947304 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:14:03 crc kubenswrapper[4723]: I0309 13:14:03.947374 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:14:04 crc kubenswrapper[4723]: I0309 13:14:04.157756 4723 generic.go:334] "Generic (PLEG): 
container finished" podID="f5db2d82-fe45-4c4a-a3b2-8addddad74fe" containerID="4c8b492ae742b8229dbad3d36c042d38f582da2ab67c3e502d7463b6e2d7216f" exitCode=0 Mar 09 13:14:04 crc kubenswrapper[4723]: I0309 13:14:04.157892 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" event={"ID":"f5db2d82-fe45-4c4a-a3b2-8addddad74fe","Type":"ContainerDied","Data":"4c8b492ae742b8229dbad3d36c042d38f582da2ab67c3e502d7463b6e2d7216f"} Mar 09 13:14:04 crc kubenswrapper[4723]: I0309 13:14:04.482430 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551034-7c7r4" Mar 09 13:14:04 crc kubenswrapper[4723]: I0309 13:14:04.595536 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4wqj\" (UniqueName: \"kubernetes.io/projected/6e328a06-9569-40c1-aef2-48e3659f74bf-kube-api-access-m4wqj\") pod \"6e328a06-9569-40c1-aef2-48e3659f74bf\" (UID: \"6e328a06-9569-40c1-aef2-48e3659f74bf\") " Mar 09 13:14:04 crc kubenswrapper[4723]: I0309 13:14:04.601054 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e328a06-9569-40c1-aef2-48e3659f74bf-kube-api-access-m4wqj" (OuterVolumeSpecName: "kube-api-access-m4wqj") pod "6e328a06-9569-40c1-aef2-48e3659f74bf" (UID: "6e328a06-9569-40c1-aef2-48e3659f74bf"). InnerVolumeSpecName "kube-api-access-m4wqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:14:04 crc kubenswrapper[4723]: I0309 13:14:04.697187 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4wqj\" (UniqueName: \"kubernetes.io/projected/6e328a06-9569-40c1-aef2-48e3659f74bf-kube-api-access-m4wqj\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.174422 4723 generic.go:334] "Generic (PLEG): container finished" podID="13796d01-813c-401b-8b33-16cde7937e92" containerID="f4a529e2d4b7d90579f66c9240d05cc56369330f66637577c120572b02eec504" exitCode=0 Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.174479 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rqx6" event={"ID":"13796d01-813c-401b-8b33-16cde7937e92","Type":"ContainerDied","Data":"f4a529e2d4b7d90579f66c9240d05cc56369330f66637577c120572b02eec504"} Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.179124 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551034-7c7r4" Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.179372 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551034-7c7r4" event={"ID":"6e328a06-9569-40c1-aef2-48e3659f74bf","Type":"ContainerDied","Data":"513fef067f39d4c07dfb0e17469c48674a26eac58d4049d62e1ee8a3d1f26d4f"} Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.179408 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="513fef067f39d4c07dfb0e17469c48674a26eac58d4049d62e1ee8a3d1f26d4f" Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.426059 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.508165 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt227\" (UniqueName: \"kubernetes.io/projected/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-kube-api-access-zt227\") pod \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\" (UID: \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\") " Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.508247 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-util\") pod \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\" (UID: \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\") " Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.508275 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-bundle\") pod \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\" (UID: \"f5db2d82-fe45-4c4a-a3b2-8addddad74fe\") " Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.511319 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-bundle" (OuterVolumeSpecName: "bundle") pod "f5db2d82-fe45-4c4a-a3b2-8addddad74fe" (UID: "f5db2d82-fe45-4c4a-a3b2-8addddad74fe"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.524230 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-kube-api-access-zt227" (OuterVolumeSpecName: "kube-api-access-zt227") pod "f5db2d82-fe45-4c4a-a3b2-8addddad74fe" (UID: "f5db2d82-fe45-4c4a-a3b2-8addddad74fe"). InnerVolumeSpecName "kube-api-access-zt227". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.553077 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551028-dg5sf"] Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.559247 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551028-dg5sf"] Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.610626 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt227\" (UniqueName: \"kubernetes.io/projected/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-kube-api-access-zt227\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.610660 4723 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.686596 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-util" (OuterVolumeSpecName: "util") pod "f5db2d82-fe45-4c4a-a3b2-8addddad74fe" (UID: "f5db2d82-fe45-4c4a-a3b2-8addddad74fe"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:14:05 crc kubenswrapper[4723]: I0309 13:14:05.712612 4723 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5db2d82-fe45-4c4a-a3b2-8addddad74fe-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:06 crc kubenswrapper[4723]: I0309 13:14:06.192298 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" Mar 09 13:14:06 crc kubenswrapper[4723]: I0309 13:14:06.192307 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p" event={"ID":"f5db2d82-fe45-4c4a-a3b2-8addddad74fe","Type":"ContainerDied","Data":"c87053ad8af96b89724079d8e86f16b8f9a564f6573891d5c5223c71fd50c6a6"} Mar 09 13:14:06 crc kubenswrapper[4723]: I0309 13:14:06.192727 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c87053ad8af96b89724079d8e86f16b8f9a564f6573891d5c5223c71fd50c6a6" Mar 09 13:14:06 crc kubenswrapper[4723]: I0309 13:14:06.196321 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rqx6" event={"ID":"13796d01-813c-401b-8b33-16cde7937e92","Type":"ContainerStarted","Data":"b8c54ac450dd71be4247e6ac1a8342ab5d901522814b7dd90e63fe9fa921ce3c"} Mar 09 13:14:06 crc kubenswrapper[4723]: I0309 13:14:06.223850 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8rqx6" podStartSLOduration=2.666387444 podStartE2EDuration="5.223832794s" podCreationTimestamp="2026-03-09 13:14:01 +0000 UTC" firstStartedPulling="2026-03-09 13:14:03.158246598 +0000 UTC m=+917.172714138" lastFinishedPulling="2026-03-09 13:14:05.715691948 +0000 UTC m=+919.730159488" observedRunningTime="2026-03-09 13:14:06.22291621 +0000 UTC m=+920.237383760" watchObservedRunningTime="2026-03-09 13:14:06.223832794 +0000 UTC m=+920.238300354" Mar 09 13:14:06 crc kubenswrapper[4723]: I0309 13:14:06.891064 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e526ef20-e343-48ac-8600-e647ac6996a4" path="/var/lib/kubelet/pods/e526ef20-e343-48ac-8600-e647ac6996a4/volumes" Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.404494 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-plh7h"] Mar 09 13:14:08 crc kubenswrapper[4723]: E0309 13:14:08.404833 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e328a06-9569-40c1-aef2-48e3659f74bf" containerName="oc" Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.404849 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e328a06-9569-40c1-aef2-48e3659f74bf" containerName="oc" Mar 09 13:14:08 crc kubenswrapper[4723]: E0309 13:14:08.404884 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5db2d82-fe45-4c4a-a3b2-8addddad74fe" containerName="extract" Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.404892 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5db2d82-fe45-4c4a-a3b2-8addddad74fe" containerName="extract" Mar 09 13:14:08 crc kubenswrapper[4723]: E0309 13:14:08.404903 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5db2d82-fe45-4c4a-a3b2-8addddad74fe" containerName="util" Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.404910 4723 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f5db2d82-fe45-4c4a-a3b2-8addddad74fe" containerName="util" Mar 09 13:14:08 crc kubenswrapper[4723]: E0309 13:14:08.404924 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5db2d82-fe45-4c4a-a3b2-8addddad74fe" containerName="pull" Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.404932 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5db2d82-fe45-4c4a-a3b2-8addddad74fe" containerName="pull" Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.405104 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e328a06-9569-40c1-aef2-48e3659f74bf" containerName="oc" Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.405129 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5db2d82-fe45-4c4a-a3b2-8addddad74fe" containerName="extract" Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.405696 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-plh7h" Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.407778 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.408217 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.408616 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-d9xpw" Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.421227 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-plh7h"] Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.574131 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w25tl\" (UniqueName: \"kubernetes.io/projected/3c8f1518-efa2-4f99-ad51-e3c754e2b244-kube-api-access-w25tl\") pod \"nmstate-operator-75c5dccd6c-plh7h\" (UID: \"3c8f1518-efa2-4f99-ad51-e3c754e2b244\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-plh7h" Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.675947 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w25tl\" (UniqueName: \"kubernetes.io/projected/3c8f1518-efa2-4f99-ad51-e3c754e2b244-kube-api-access-w25tl\") pod \"nmstate-operator-75c5dccd6c-plh7h\" (UID: \"3c8f1518-efa2-4f99-ad51-e3c754e2b244\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-plh7h" Mar 09 13:14:08 crc kubenswrapper[4723]: I0309 13:14:08.750237 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w25tl\" (UniqueName: \"kubernetes.io/projected/3c8f1518-efa2-4f99-ad51-e3c754e2b244-kube-api-access-w25tl\") pod \"nmstate-operator-75c5dccd6c-plh7h\" (UID: \"3c8f1518-efa2-4f99-ad51-e3c754e2b244\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-plh7h" Mar 09 13:14:09 crc kubenswrapper[4723]: I0309 13:14:09.024698 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-plh7h" Mar 09 13:14:09 crc kubenswrapper[4723]: I0309 13:14:09.492621 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-plh7h"] Mar 09 13:14:10 crc kubenswrapper[4723]: I0309 13:14:10.236494 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-plh7h" event={"ID":"3c8f1518-efa2-4f99-ad51-e3c754e2b244","Type":"ContainerStarted","Data":"3979b4cc89dd6a28796ea99936b06b2831daf4c09112f6888cec00cde9b7b395"} Mar 09 13:14:11 crc kubenswrapper[4723]: I0309 13:14:11.604069 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8rqx6" Mar 09 13:14:11 crc kubenswrapper[4723]: I0309 13:14:11.604120 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8rqx6" Mar 09 13:14:12 crc kubenswrapper[4723]: I0309 13:14:12.665086 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8rqx6" podUID="13796d01-813c-401b-8b33-16cde7937e92" containerName="registry-server" probeResult="failure" output=< Mar 09 13:14:12 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 13:14:12 crc kubenswrapper[4723]: > Mar 09 13:14:13 crc kubenswrapper[4723]: I0309 13:14:13.277655 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-plh7h" event={"ID":"3c8f1518-efa2-4f99-ad51-e3c754e2b244","Type":"ContainerStarted","Data":"94f280bff2cedb5c0fb8637c3b0fbf110be491f987b8b796a667061fd7d066af"} Mar 09 13:14:13 crc kubenswrapper[4723]: I0309 13:14:13.303573 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-plh7h" podStartSLOduration=2.266471692 podStartE2EDuration="5.303554695s" podCreationTimestamp="2026-03-09 13:14:08 +0000 UTC" firstStartedPulling="2026-03-09 13:14:09.499049668 +0000 UTC m=+923.513517208" lastFinishedPulling="2026-03-09 13:14:12.536132661 +0000 UTC m=+926.550600211" observedRunningTime="2026-03-09 13:14:13.299037094 +0000 UTC m=+927.313504634" watchObservedRunningTime="2026-03-09 13:14:13.303554695 +0000 UTC m=+927.318022235" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.266713 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-cr6kv"] Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.268194 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-cr6kv" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.270939 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-9jpf6" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.279507 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-cr6kv"] Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.295354 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-msbbv"] Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.296804 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.303389 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.311409 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-msbbv"] Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.328729 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-cfc82"] Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.329621 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-cfc82" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.367738 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lkz7\" (UniqueName: \"kubernetes.io/projected/d467b1e5-db4c-4066-8686-9626d2fd19af-kube-api-access-4lkz7\") pod \"nmstate-metrics-69594cc75-cr6kv\" (UID: \"d467b1e5-db4c-4066-8686-9626d2fd19af\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-cr6kv" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.435177 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b"] Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.436363 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.437985 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-twwtl" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.438385 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.439727 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.453797 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b"] Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.468834 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f40e96ae-e190-4d90-bb91-ce0a50b528a0-ovs-socket\") pod \"nmstate-handler-cfc82\" (UID: \"f40e96ae-e190-4d90-bb91-ce0a50b528a0\") " pod="openshift-nmstate/nmstate-handler-cfc82" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.468919 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f40e96ae-e190-4d90-bb91-ce0a50b528a0-dbus-socket\") pod \"nmstate-handler-cfc82\" (UID: \"f40e96ae-e190-4d90-bb91-ce0a50b528a0\") " pod="openshift-nmstate/nmstate-handler-cfc82" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.468979 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3b47483e-69de-403b-ab71-5c6665c0a36d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-msbbv\" (UID: \"3b47483e-69de-403b-ab71-5c6665c0a36d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 
13:14:14.469040 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f40e96ae-e190-4d90-bb91-ce0a50b528a0-nmstate-lock\") pod \"nmstate-handler-cfc82\" (UID: \"f40e96ae-e190-4d90-bb91-ce0a50b528a0\") " pod="openshift-nmstate/nmstate-handler-cfc82" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.469100 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbrxf\" (UniqueName: \"kubernetes.io/projected/f40e96ae-e190-4d90-bb91-ce0a50b528a0-kube-api-access-dbrxf\") pod \"nmstate-handler-cfc82\" (UID: \"f40e96ae-e190-4d90-bb91-ce0a50b528a0\") " pod="openshift-nmstate/nmstate-handler-cfc82" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.469167 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lkz7\" (UniqueName: \"kubernetes.io/projected/d467b1e5-db4c-4066-8686-9626d2fd19af-kube-api-access-4lkz7\") pod \"nmstate-metrics-69594cc75-cr6kv\" (UID: \"d467b1e5-db4c-4066-8686-9626d2fd19af\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-cr6kv" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.469208 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-954bs\" (UniqueName: \"kubernetes.io/projected/3b47483e-69de-403b-ab71-5c6665c0a36d-kube-api-access-954bs\") pod \"nmstate-webhook-786f45cff4-msbbv\" (UID: \"3b47483e-69de-403b-ab71-5c6665c0a36d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.502884 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lkz7\" (UniqueName: \"kubernetes.io/projected/d467b1e5-db4c-4066-8686-9626d2fd19af-kube-api-access-4lkz7\") pod \"nmstate-metrics-69594cc75-cr6kv\" (UID: \"d467b1e5-db4c-4066-8686-9626d2fd19af\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-cr6kv" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.570192 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3b47483e-69de-403b-ab71-5c6665c0a36d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-msbbv\" (UID: \"3b47483e-69de-403b-ab71-5c6665c0a36d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv" Mar 09 13:14:14 crc kubenswrapper[4723]: E0309 13:14:14.570375 4723 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 09 13:14:14 crc kubenswrapper[4723]: E0309 13:14:14.570702 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b47483e-69de-403b-ab71-5c6665c0a36d-tls-key-pair podName:3b47483e-69de-403b-ab71-5c6665c0a36d nodeName:}" failed. No retries permitted until 2026-03-09 13:14:15.07068282 +0000 UTC m=+929.085150360 (durationBeforeRetry 500ms). 
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.570616 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jt7m\" (UniqueName: \"kubernetes.io/projected/dbd14494-7dd0-4807-9980-024d38f263f8-kube-api-access-2jt7m\") pod \"nmstate-console-plugin-5dcbbd79cf-8nj2b\" (UID: \"dbd14494-7dd0-4807-9980-024d38f263f8\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b"
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.570892 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f40e96ae-e190-4d90-bb91-ce0a50b528a0-nmstate-lock\") pod \"nmstate-handler-cfc82\" (UID: \"f40e96ae-e190-4d90-bb91-ce0a50b528a0\") " pod="openshift-nmstate/nmstate-handler-cfc82"
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.570985 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dbd14494-7dd0-4807-9980-024d38f263f8-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-8nj2b\" (UID: \"dbd14494-7dd0-4807-9980-024d38f263f8\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b"
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.571018 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbrxf\" (UniqueName: \"kubernetes.io/projected/f40e96ae-e190-4d90-bb91-ce0a50b528a0-kube-api-access-dbrxf\") pod \"nmstate-handler-cfc82\" (UID: \"f40e96ae-e190-4d90-bb91-ce0a50b528a0\") " pod="openshift-nmstate/nmstate-handler-cfc82"
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.571034 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f40e96ae-e190-4d90-bb91-ce0a50b528a0-nmstate-lock\") pod \"nmstate-handler-cfc82\" (UID: \"f40e96ae-e190-4d90-bb91-ce0a50b528a0\") " pod="openshift-nmstate/nmstate-handler-cfc82"
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.571110 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd14494-7dd0-4807-9980-024d38f263f8-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-8nj2b\" (UID: \"dbd14494-7dd0-4807-9980-024d38f263f8\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b"
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.571134 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-954bs\" (UniqueName: \"kubernetes.io/projected/3b47483e-69de-403b-ab71-5c6665c0a36d-kube-api-access-954bs\") pod \"nmstate-webhook-786f45cff4-msbbv\" (UID: \"3b47483e-69de-403b-ab71-5c6665c0a36d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv"
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.571223 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f40e96ae-e190-4d90-bb91-ce0a50b528a0-ovs-socket\") pod \"nmstate-handler-cfc82\" (UID: \"f40e96ae-e190-4d90-bb91-ce0a50b528a0\") " pod="openshift-nmstate/nmstate-handler-cfc82"
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.571275 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f40e96ae-e190-4d90-bb91-ce0a50b528a0-dbus-socket\") pod \"nmstate-handler-cfc82\" (UID: \"f40e96ae-e190-4d90-bb91-ce0a50b528a0\") " pod="openshift-nmstate/nmstate-handler-cfc82"
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.571301 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f40e96ae-e190-4d90-bb91-ce0a50b528a0-ovs-socket\") pod \"nmstate-handler-cfc82\" (UID: \"f40e96ae-e190-4d90-bb91-ce0a50b528a0\") " pod="openshift-nmstate/nmstate-handler-cfc82"
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.571679 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f40e96ae-e190-4d90-bb91-ce0a50b528a0-dbus-socket\") pod \"nmstate-handler-cfc82\" (UID: \"f40e96ae-e190-4d90-bb91-ce0a50b528a0\") " pod="openshift-nmstate/nmstate-handler-cfc82"
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.587697 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-cr6kv"
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.588235 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbrxf\" (UniqueName: \"kubernetes.io/projected/f40e96ae-e190-4d90-bb91-ce0a50b528a0-kube-api-access-dbrxf\") pod \"nmstate-handler-cfc82\" (UID: \"f40e96ae-e190-4d90-bb91-ce0a50b528a0\") " pod="openshift-nmstate/nmstate-handler-cfc82"
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.592886 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-954bs\" (UniqueName: \"kubernetes.io/projected/3b47483e-69de-403b-ab71-5c6665c0a36d-kube-api-access-954bs\") pod \"nmstate-webhook-786f45cff4-msbbv\" (UID: \"3b47483e-69de-403b-ab71-5c6665c0a36d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv"
Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.647206 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-cfc82"
Need to start a new one" pod="openshift-nmstate/nmstate-handler-cfc82" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.672720 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jt7m\" (UniqueName: \"kubernetes.io/projected/dbd14494-7dd0-4807-9980-024d38f263f8-kube-api-access-2jt7m\") pod \"nmstate-console-plugin-5dcbbd79cf-8nj2b\" (UID: \"dbd14494-7dd0-4807-9980-024d38f263f8\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.672831 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dbd14494-7dd0-4807-9980-024d38f263f8-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-8nj2b\" (UID: \"dbd14494-7dd0-4807-9980-024d38f263f8\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.672898 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd14494-7dd0-4807-9980-024d38f263f8-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-8nj2b\" (UID: \"dbd14494-7dd0-4807-9980-024d38f263f8\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.674208 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dbd14494-7dd0-4807-9980-024d38f263f8-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-8nj2b\" (UID: \"dbd14494-7dd0-4807-9980-024d38f263f8\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.681840 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd14494-7dd0-4807-9980-024d38f263f8-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-8nj2b\" (UID: \"dbd14494-7dd0-4807-9980-024d38f263f8\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.717642 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jt7m\" (UniqueName: \"kubernetes.io/projected/dbd14494-7dd0-4807-9980-024d38f263f8-kube-api-access-2jt7m\") pod \"nmstate-console-plugin-5dcbbd79cf-8nj2b\" (UID: \"dbd14494-7dd0-4807-9980-024d38f263f8\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.761582 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.838446 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d748498-cmr7m"] Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.842187 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.855495 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d748498-cmr7m"] Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.982110 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2994750c-7f55-4708-856b-c9547e3f054c-console-serving-cert\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.982172 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-oauth-serving-cert\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.982200 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-console-config\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.982399 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-trusted-ca-bundle\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.982477 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2994750c-7f55-4708-856b-c9547e3f054c-console-oauth-config\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.982509 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-service-ca\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:14:14 crc kubenswrapper[4723]: I0309 13:14:14.982540 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdpnb\" (UniqueName: \"kubernetes.io/projected/2994750c-7f55-4708-856b-c9547e3f054c-kube-api-access-vdpnb\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.083703 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-oauth-serving-cert\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.083804 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-trusted-ca-bundle\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m"
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.083835 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3b47483e-69de-403b-ab71-5c6665c0a36d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-msbbv\" (UID: \"3b47483e-69de-403b-ab71-5c6665c0a36d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv"
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.083851 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2994750c-7f55-4708-856b-c9547e3f054c-console-oauth-config\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m"
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.083884 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-service-ca\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m"
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.083905 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdpnb\" (UniqueName: \"kubernetes.io/projected/2994750c-7f55-4708-856b-c9547e3f054c-kube-api-access-vdpnb\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m"
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.083965 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2994750c-7f55-4708-856b-c9547e3f054c-console-serving-cert\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m"
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.084728 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-oauth-serving-cert\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m"
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.084767 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-console-config\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m"
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.085659 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-trusted-ca-bundle\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m"
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.086254 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-service-ca\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m"
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.088654 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3b47483e-69de-403b-ab71-5c6665c0a36d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-msbbv\" (UID: \"3b47483e-69de-403b-ab71-5c6665c0a36d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv"
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.088798 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2994750c-7f55-4708-856b-c9547e3f054c-console-serving-cert\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m"
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.091927 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2994750c-7f55-4708-856b-c9547e3f054c-console-oauth-config\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m"
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.100947 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdpnb\" (UniqueName: \"kubernetes.io/projected/2994750c-7f55-4708-856b-c9547e3f054c-kube-api-access-vdpnb\") pod \"console-6d748498-cmr7m\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " pod="openshift-console/console-6d748498-cmr7m"
Mar 09 13:14:15 crc kubenswrapper[4723]: W0309 13:14:15.161009 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd467b1e5_db4c_4066_8686_9626d2fd19af.slice/crio-e394606bc56b0cec61cb06b2a0c614aa5adfa459ffcb7054b3befd22802c4202 WatchSource:0}: Error finding container e394606bc56b0cec61cb06b2a0c614aa5adfa459ffcb7054b3befd22802c4202: Status 404 returned error can't find the container with id e394606bc56b0cec61cb06b2a0c614aa5adfa459ffcb7054b3befd22802c4202
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.161914 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-cr6kv"]
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.169078 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d748498-cmr7m"
Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.221227 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv"
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv" Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.271632 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b"] Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.314517 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-cr6kv" event={"ID":"d467b1e5-db4c-4066-8686-9626d2fd19af","Type":"ContainerStarted","Data":"e394606bc56b0cec61cb06b2a0c614aa5adfa459ffcb7054b3befd22802c4202"} Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.320670 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-cfc82" event={"ID":"f40e96ae-e190-4d90-bb91-ce0a50b528a0","Type":"ContainerStarted","Data":"d0248c811da509f4ea80a1c509205c9ca55d1c4c86ee316521098935b112b119"} Mar 09 13:14:15 crc kubenswrapper[4723]: W0309 13:14:15.579717 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2994750c_7f55_4708_856b_c9547e3f054c.slice/crio-640e041ceb96605808c9acad14b9abfb33243803ee73577767597291fa592caa WatchSource:0}: Error finding container 640e041ceb96605808c9acad14b9abfb33243803ee73577767597291fa592caa: Status 404 returned error can't find the container with id 640e041ceb96605808c9acad14b9abfb33243803ee73577767597291fa592caa Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.586774 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d748498-cmr7m"] Mar 09 13:14:15 crc kubenswrapper[4723]: I0309 13:14:15.689125 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-msbbv"] Mar 09 13:14:16 crc kubenswrapper[4723]: I0309 13:14:16.328297 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b" event={"ID":"dbd14494-7dd0-4807-9980-024d38f263f8","Type":"ContainerStarted","Data":"9fb749e104b0b696d733d7993b6e8f8da959cffa437cc8d510b480b3cec1753d"} Mar 09 13:14:16 crc kubenswrapper[4723]: I0309 13:14:16.329891 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d748498-cmr7m" event={"ID":"2994750c-7f55-4708-856b-c9547e3f054c","Type":"ContainerStarted","Data":"f56b0d79b1e915f7a130679fa2da30ddd4a330859fc19d4a51c5569a1d76ecec"} Mar 09 13:14:16 crc kubenswrapper[4723]: I0309 13:14:16.329930 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d748498-cmr7m" event={"ID":"2994750c-7f55-4708-856b-c9547e3f054c","Type":"ContainerStarted","Data":"640e041ceb96605808c9acad14b9abfb33243803ee73577767597291fa592caa"} Mar 09 13:14:16 crc kubenswrapper[4723]: I0309 13:14:16.331402 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv" event={"ID":"3b47483e-69de-403b-ab71-5c6665c0a36d","Type":"ContainerStarted","Data":"aa54027f437ddda6f0d91b2a5df4305c6f6ba34fc44da3c3e9773dc3528b30ac"} Mar 09 13:14:16 crc kubenswrapper[4723]: I0309 13:14:16.354243 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d748498-cmr7m" podStartSLOduration=2.354212522 podStartE2EDuration="2.354212522s" podCreationTimestamp="2026-03-09 13:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:14:16.348146679 +0000 UTC 
m=+930.362614249" watchObservedRunningTime="2026-03-09 13:14:16.354212522 +0000 UTC m=+930.368680062" Mar 09 13:14:19 crc kubenswrapper[4723]: I0309 13:14:19.354256 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv" event={"ID":"3b47483e-69de-403b-ab71-5c6665c0a36d","Type":"ContainerStarted","Data":"0f3765987a6552d88b29a61642fc5e67872d7f994885f85f4f088fd0089e7347"} Mar 09 13:14:19 crc kubenswrapper[4723]: I0309 13:14:19.354591 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv" Mar 09 13:14:19 crc kubenswrapper[4723]: I0309 13:14:19.355761 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-cfc82" event={"ID":"f40e96ae-e190-4d90-bb91-ce0a50b528a0","Type":"ContainerStarted","Data":"9c37a459aa4afeb40dde05131265e02775d4169efcaa4943bbd4ec76eb5cbd9b"} Mar 09 13:14:19 crc kubenswrapper[4723]: I0309 13:14:19.355811 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-cfc82" Mar 09 13:14:19 crc kubenswrapper[4723]: I0309 13:14:19.356878 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b" event={"ID":"dbd14494-7dd0-4807-9980-024d38f263f8","Type":"ContainerStarted","Data":"72010aad4844871cd0ab0ff1f7a1d65021a285c9c6efdf29ce2984a8bc44a1f0"} Mar 09 13:14:19 crc kubenswrapper[4723]: I0309 13:14:19.360559 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-cr6kv" event={"ID":"d467b1e5-db4c-4066-8686-9626d2fd19af","Type":"ContainerStarted","Data":"d768d37151c345c3e5b4303e5d46225baff1c24ef3d686fc6c50cef318d4af23"} Mar 09 13:14:19 crc kubenswrapper[4723]: I0309 13:14:19.385857 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv" podStartSLOduration=2.54081877 podStartE2EDuration="5.385814817s" podCreationTimestamp="2026-03-09 13:14:14 +0000 UTC" firstStartedPulling="2026-03-09 13:14:15.688741919 +0000 UTC m=+929.703209459" lastFinishedPulling="2026-03-09 13:14:18.533737946 +0000 UTC m=+932.548205506" observedRunningTime="2026-03-09 13:14:19.37547731 +0000 UTC m=+933.389944850" watchObservedRunningTime="2026-03-09 13:14:19.385814817 +0000 UTC m=+933.400282357" Mar 09 13:14:19 crc kubenswrapper[4723]: I0309 13:14:19.399358 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-8nj2b" podStartSLOduration=2.207379186 podStartE2EDuration="5.399321259s" podCreationTimestamp="2026-03-09 13:14:14 +0000 UTC" firstStartedPulling="2026-03-09 13:14:15.305300645 +0000 UTC m=+929.319768185" lastFinishedPulling="2026-03-09 13:14:18.497242718 +0000 UTC m=+932.511710258" observedRunningTime="2026-03-09 13:14:19.399176705 +0000 UTC m=+933.413644245" watchObservedRunningTime="2026-03-09 13:14:19.399321259 +0000 UTC m=+933.413788799" Mar 09 13:14:19 crc kubenswrapper[4723]: I0309 13:14:19.423660 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-cfc82" podStartSLOduration=1.662711722 podStartE2EDuration="5.423640421s" podCreationTimestamp="2026-03-09 13:14:14 +0000 UTC" firstStartedPulling="2026-03-09 13:14:14.747222301 +0000 UTC m=+928.761689841" lastFinishedPulling="2026-03-09 13:14:18.50815098 +0000 UTC m=+932.522618540" observedRunningTime="2026-03-09 13:14:19.419214772 
Mar 09 13:14:21 crc kubenswrapper[4723]: I0309 13:14:21.643916 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8rqx6"
Mar 09 13:14:21 crc kubenswrapper[4723]: I0309 13:14:21.697591 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8rqx6"
Mar 09 13:14:21 crc kubenswrapper[4723]: I0309 13:14:21.873057 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8rqx6"]
Mar 09 13:14:22 crc kubenswrapper[4723]: I0309 13:14:22.388407 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-cr6kv" event={"ID":"d467b1e5-db4c-4066-8686-9626d2fd19af","Type":"ContainerStarted","Data":"481529b6f8e2dcbb8e23ba6d74a055780c50fadfcad6922a82e70b4b84f7bf23"}
Mar 09 13:14:22 crc kubenswrapper[4723]: I0309 13:14:22.409702 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-cr6kv" podStartSLOduration=1.783795107 podStartE2EDuration="8.409687807s" podCreationTimestamp="2026-03-09 13:14:14 +0000 UTC" firstStartedPulling="2026-03-09 13:14:15.164571144 +0000 UTC m=+929.179038684" lastFinishedPulling="2026-03-09 13:14:21.790463844 +0000 UTC m=+935.804931384" observedRunningTime="2026-03-09 13:14:22.407703044 +0000 UTC m=+936.422170594" watchObservedRunningTime="2026-03-09 13:14:22.409687807 +0000 UTC m=+936.424155337"
Mar 09 13:14:23 crc kubenswrapper[4723]: I0309 13:14:23.395503 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8rqx6" podUID="13796d01-813c-401b-8b33-16cde7937e92" containerName="registry-server" containerID="cri-o://b8c54ac450dd71be4247e6ac1a8342ab5d901522814b7dd90e63fe9fa921ce3c" gracePeriod=2
Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.297929 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8rqx6"
Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.403771 4723 generic.go:334] "Generic (PLEG): container finished" podID="13796d01-813c-401b-8b33-16cde7937e92" containerID="b8c54ac450dd71be4247e6ac1a8342ab5d901522814b7dd90e63fe9fa921ce3c" exitCode=0
Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.403816 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rqx6" event={"ID":"13796d01-813c-401b-8b33-16cde7937e92","Type":"ContainerDied","Data":"b8c54ac450dd71be4247e6ac1a8342ab5d901522814b7dd90e63fe9fa921ce3c"}
Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.403844 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rqx6" event={"ID":"13796d01-813c-401b-8b33-16cde7937e92","Type":"ContainerDied","Data":"7016a4296f02614ba1f382d3d4f52194730759c4b0fe6be6932f3334db4bdb81"}
Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.403891 4723 scope.go:117] "RemoveContainer" containerID="b8c54ac450dd71be4247e6ac1a8342ab5d901522814b7dd90e63fe9fa921ce3c"
Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.403892 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8rqx6"
Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.425296 4723 scope.go:117] "RemoveContainer" containerID="f4a529e2d4b7d90579f66c9240d05cc56369330f66637577c120572b02eec504"
Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.443406 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bxzh\" (UniqueName: \"kubernetes.io/projected/13796d01-813c-401b-8b33-16cde7937e92-kube-api-access-5bxzh\") pod \"13796d01-813c-401b-8b33-16cde7937e92\" (UID: \"13796d01-813c-401b-8b33-16cde7937e92\") "
Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.443572 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13796d01-813c-401b-8b33-16cde7937e92-utilities\") pod \"13796d01-813c-401b-8b33-16cde7937e92\" (UID: \"13796d01-813c-401b-8b33-16cde7937e92\") "
Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.443658 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13796d01-813c-401b-8b33-16cde7937e92-catalog-content\") pod \"13796d01-813c-401b-8b33-16cde7937e92\" (UID: \"13796d01-813c-401b-8b33-16cde7937e92\") "
Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.444702 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13796d01-813c-401b-8b33-16cde7937e92-utilities" (OuterVolumeSpecName: "utilities") pod "13796d01-813c-401b-8b33-16cde7937e92" (UID: "13796d01-813c-401b-8b33-16cde7937e92"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.451414 4723 scope.go:117] "RemoveContainer" containerID="d4e8f8d5cd12d561df39e6731e7e2a32bc17763384f1c67592aafbf34745cd7f"
Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.451650 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13796d01-813c-401b-8b33-16cde7937e92-kube-api-access-5bxzh" (OuterVolumeSpecName: "kube-api-access-5bxzh") pod "13796d01-813c-401b-8b33-16cde7937e92" (UID: "13796d01-813c-401b-8b33-16cde7937e92"). InnerVolumeSpecName "kube-api-access-5bxzh". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.493423 4723 scope.go:117] "RemoveContainer" containerID="b8c54ac450dd71be4247e6ac1a8342ab5d901522814b7dd90e63fe9fa921ce3c" Mar 09 13:14:24 crc kubenswrapper[4723]: E0309 13:14:24.493835 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8c54ac450dd71be4247e6ac1a8342ab5d901522814b7dd90e63fe9fa921ce3c\": container with ID starting with b8c54ac450dd71be4247e6ac1a8342ab5d901522814b7dd90e63fe9fa921ce3c not found: ID does not exist" containerID="b8c54ac450dd71be4247e6ac1a8342ab5d901522814b7dd90e63fe9fa921ce3c" Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.493972 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8c54ac450dd71be4247e6ac1a8342ab5d901522814b7dd90e63fe9fa921ce3c"} err="failed to get container status \"b8c54ac450dd71be4247e6ac1a8342ab5d901522814b7dd90e63fe9fa921ce3c\": rpc error: code = NotFound desc = could not find container \"b8c54ac450dd71be4247e6ac1a8342ab5d901522814b7dd90e63fe9fa921ce3c\": container with ID starting with b8c54ac450dd71be4247e6ac1a8342ab5d901522814b7dd90e63fe9fa921ce3c not found: ID does not exist" Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.494054 4723 scope.go:117] "RemoveContainer" containerID="f4a529e2d4b7d90579f66c9240d05cc56369330f66637577c120572b02eec504" Mar 09 13:14:24 crc kubenswrapper[4723]: E0309 13:14:24.494367 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4a529e2d4b7d90579f66c9240d05cc56369330f66637577c120572b02eec504\": container with ID starting with f4a529e2d4b7d90579f66c9240d05cc56369330f66637577c120572b02eec504 not found: ID does not exist" containerID="f4a529e2d4b7d90579f66c9240d05cc56369330f66637577c120572b02eec504" Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.494445 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a529e2d4b7d90579f66c9240d05cc56369330f66637577c120572b02eec504"} err="failed to get container status \"f4a529e2d4b7d90579f66c9240d05cc56369330f66637577c120572b02eec504\": rpc error: code = NotFound desc = could not find container \"f4a529e2d4b7d90579f66c9240d05cc56369330f66637577c120572b02eec504\": container with ID starting with f4a529e2d4b7d90579f66c9240d05cc56369330f66637577c120572b02eec504 not found: ID does not exist" Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.494519 4723 scope.go:117] "RemoveContainer" containerID="d4e8f8d5cd12d561df39e6731e7e2a32bc17763384f1c67592aafbf34745cd7f" Mar 09 13:14:24 crc kubenswrapper[4723]: E0309 13:14:24.494750 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4e8f8d5cd12d561df39e6731e7e2a32bc17763384f1c67592aafbf34745cd7f\": container with ID starting with d4e8f8d5cd12d561df39e6731e7e2a32bc17763384f1c67592aafbf34745cd7f not found: ID does not exist" containerID="d4e8f8d5cd12d561df39e6731e7e2a32bc17763384f1c67592aafbf34745cd7f" Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.494844 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4e8f8d5cd12d561df39e6731e7e2a32bc17763384f1c67592aafbf34745cd7f"} err="failed to get container status \"d4e8f8d5cd12d561df39e6731e7e2a32bc17763384f1c67592aafbf34745cd7f\": rpc error: code = NotFound desc = could not 
find container \"d4e8f8d5cd12d561df39e6731e7e2a32bc17763384f1c67592aafbf34745cd7f\": container with ID starting with d4e8f8d5cd12d561df39e6731e7e2a32bc17763384f1c67592aafbf34745cd7f not found: ID does not exist" Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.545291 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13796d01-813c-401b-8b33-16cde7937e92-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.545519 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bxzh\" (UniqueName: \"kubernetes.io/projected/13796d01-813c-401b-8b33-16cde7937e92-kube-api-access-5bxzh\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.570647 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13796d01-813c-401b-8b33-16cde7937e92-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13796d01-813c-401b-8b33-16cde7937e92" (UID: "13796d01-813c-401b-8b33-16cde7937e92"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.646699 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13796d01-813c-401b-8b33-16cde7937e92-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.673509 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-cfc82" Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.738527 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8rqx6"] Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.743541 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8rqx6"] Mar 09 13:14:24 crc kubenswrapper[4723]: I0309 13:14:24.892627 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13796d01-813c-401b-8b33-16cde7937e92" path="/var/lib/kubelet/pods/13796d01-813c-401b-8b33-16cde7937e92/volumes" Mar 09 13:14:25 crc kubenswrapper[4723]: I0309 13:14:25.169842 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:14:25 crc kubenswrapper[4723]: I0309 13:14:25.169941 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:14:25 crc kubenswrapper[4723]: I0309 13:14:25.174936 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:14:25 crc kubenswrapper[4723]: I0309 13:14:25.417757 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:14:25 crc kubenswrapper[4723]: I0309 13:14:25.484758 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dc6c4d949-sfpmd"] Mar 09 13:14:33 crc kubenswrapper[4723]: I0309 13:14:33.947458 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:14:33 crc kubenswrapper[4723]: I0309 
Mar 09 13:14:33 crc kubenswrapper[4723]: I0309 13:14:33.948081 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2"
Mar 09 13:14:33 crc kubenswrapper[4723]: I0309 13:14:33.948693 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d25501b3a9b23fada0109d3a471f491cc22bbb00f111c3efbddd551e1408485"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 13:14:33 crc kubenswrapper[4723]: I0309 13:14:33.948742 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://4d25501b3a9b23fada0109d3a471f491cc22bbb00f111c3efbddd551e1408485" gracePeriod=600
Mar 09 13:14:34 crc kubenswrapper[4723]: I0309 13:14:34.476561 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="4d25501b3a9b23fada0109d3a471f491cc22bbb00f111c3efbddd551e1408485" exitCode=0
Mar 09 13:14:34 crc kubenswrapper[4723]: I0309 13:14:34.476638 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"4d25501b3a9b23fada0109d3a471f491cc22bbb00f111c3efbddd551e1408485"}
Mar 09 13:14:34 crc kubenswrapper[4723]: I0309 13:14:34.477165 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"a07909afd95b9a1ee1329ff07b5736e303acd66573a389c66b14a13e53a70f9f"}
Mar 09 13:14:34 crc kubenswrapper[4723]: I0309 13:14:34.477194 4723 scope.go:117] "RemoveContainer" containerID="0fa72ca2b7e100c53424b0c6c728520cb30db8e9432e97e83e4d09f170a81438"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.234015 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.247543 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nn5nz"]
Mar 09 13:14:35 crc kubenswrapper[4723]: E0309 13:14:35.247991 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13796d01-813c-401b-8b33-16cde7937e92" containerName="extract-content"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.248017 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="13796d01-813c-401b-8b33-16cde7937e92" containerName="extract-content"
Mar 09 13:14:35 crc kubenswrapper[4723]: E0309 13:14:35.248049 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13796d01-813c-401b-8b33-16cde7937e92" containerName="extract-utilities"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.248062 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="13796d01-813c-401b-8b33-16cde7937e92" containerName="extract-utilities"
Mar 09 13:14:35 crc kubenswrapper[4723]: E0309 13:14:35.248088 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13796d01-813c-401b-8b33-16cde7937e92" containerName="registry-server"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.248100 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="13796d01-813c-401b-8b33-16cde7937e92" containerName="registry-server"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.248285 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="13796d01-813c-401b-8b33-16cde7937e92" containerName="registry-server"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.249761 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.266452 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nn5nz"]
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.326343 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-catalog-content\") pod \"redhat-marketplace-nn5nz\" (UID: \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\") " pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.326421 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-utilities\") pod \"redhat-marketplace-nn5nz\" (UID: \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\") " pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.326880 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zt7j\" (UniqueName: \"kubernetes.io/projected/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-kube-api-access-2zt7j\") pod \"redhat-marketplace-nn5nz\" (UID: \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\") " pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.429104 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zt7j\" (UniqueName: \"kubernetes.io/projected/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-kube-api-access-2zt7j\") pod \"redhat-marketplace-nn5nz\" (UID: \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\") " pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.429209 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-catalog-content\") pod \"redhat-marketplace-nn5nz\" (UID: \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\") " pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.429275 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-utilities\") pod \"redhat-marketplace-nn5nz\" (UID: \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\") " pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.429951 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-catalog-content\") pod \"redhat-marketplace-nn5nz\" (UID: \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\") " pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.430011 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-utilities\") pod \"redhat-marketplace-nn5nz\" (UID: \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\") " pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.452919 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zt7j\" (UniqueName: \"kubernetes.io/projected/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-kube-api-access-2zt7j\") pod \"redhat-marketplace-nn5nz\" (UID: \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\") " pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:35 crc kubenswrapper[4723]: I0309 13:14:35.575281 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:36 crc kubenswrapper[4723]: I0309 13:14:36.054606 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nn5nz"]
Mar 09 13:14:36 crc kubenswrapper[4723]: I0309 13:14:36.496456 4723 generic.go:334] "Generic (PLEG): container finished" podID="e3c53e12-83cb-4c57-a875-1ebac75ee7fb" containerID="0aff7a5e53a17b666b71af021b9dc7c349cd78b94c47349b14984e77191d3085" exitCode=0
Mar 09 13:14:36 crc kubenswrapper[4723]: I0309 13:14:36.496518 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nn5nz" event={"ID":"e3c53e12-83cb-4c57-a875-1ebac75ee7fb","Type":"ContainerDied","Data":"0aff7a5e53a17b666b71af021b9dc7c349cd78b94c47349b14984e77191d3085"}
Mar 09 13:14:36 crc kubenswrapper[4723]: I0309 13:14:36.497292 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nn5nz" event={"ID":"e3c53e12-83cb-4c57-a875-1ebac75ee7fb","Type":"ContainerStarted","Data":"7dbc311fdc65fd614b1e9ee561f56a0e144478768916c11aaae6c5f075928d96"}
Mar 09 13:14:37 crc kubenswrapper[4723]: I0309 13:14:37.507976 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nn5nz" event={"ID":"e3c53e12-83cb-4c57-a875-1ebac75ee7fb","Type":"ContainerStarted","Data":"3eb13044e81af524fe202f29c91aed62c57621316d8467ebcd92df3516dc5280"}
Mar 09 13:14:38 crc kubenswrapper[4723]: I0309 13:14:38.518002 4723 generic.go:334] "Generic (PLEG): container finished" podID="e3c53e12-83cb-4c57-a875-1ebac75ee7fb" containerID="3eb13044e81af524fe202f29c91aed62c57621316d8467ebcd92df3516dc5280" exitCode=0
Mar 09 13:14:38 crc kubenswrapper[4723]: I0309 13:14:38.518269 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nn5nz" event={"ID":"e3c53e12-83cb-4c57-a875-1ebac75ee7fb","Type":"ContainerDied","Data":"3eb13044e81af524fe202f29c91aed62c57621316d8467ebcd92df3516dc5280"}
Mar 09 13:14:39 crc kubenswrapper[4723]: I0309 13:14:39.534875 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nn5nz" event={"ID":"e3c53e12-83cb-4c57-a875-1ebac75ee7fb","Type":"ContainerStarted","Data":"8a71f54cc0c906eca386042d3fdcbc8a12021e20fcdd3790c5b736db35f357dc"}
Mar 09 13:14:39 crc kubenswrapper[4723]: I0309 13:14:39.556917 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nn5nz" podStartSLOduration=2.036108402 podStartE2EDuration="4.55689964s" podCreationTimestamp="2026-03-09 13:14:35 +0000 UTC" firstStartedPulling="2026-03-09 13:14:36.497958841 +0000 UTC m=+950.512426381" lastFinishedPulling="2026-03-09 13:14:39.018750059 +0000 UTC m=+953.033217619" observedRunningTime="2026-03-09 13:14:39.553948621 +0000 UTC m=+953.568416181" watchObservedRunningTime="2026-03-09 13:14:39.55689964 +0000 UTC m=+953.571367200"
Mar 09 13:14:45 crc kubenswrapper[4723]: I0309 13:14:45.576385 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:45 crc kubenswrapper[4723]: I0309 13:14:45.577122 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:45 crc kubenswrapper[4723]: I0309 13:14:45.632510 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:46 crc kubenswrapper[4723]: I0309 13:14:46.627434 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nn5nz"
Mar 09 13:14:46 crc kubenswrapper[4723]: I0309 13:14:46.676027 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nn5nz"]
Mar 09 13:14:47 crc kubenswrapper[4723]: I0309 13:14:47.603451 4723 scope.go:117] "RemoveContainer" containerID="66e466e6f155f03120dfc4c8c99001e064213bfd19d0d5dbd92949dd103e501c"
Mar 09 13:14:48 crc kubenswrapper[4723]: I0309 13:14:48.602136 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nn5nz" podUID="e3c53e12-83cb-4c57-a875-1ebac75ee7fb" containerName="registry-server" containerID="cri-o://8a71f54cc0c906eca386042d3fdcbc8a12021e20fcdd3790c5b736db35f357dc" gracePeriod=2
Mar 09 13:14:48 crc kubenswrapper[4723]: I0309 13:14:48.859590 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vbhxs"]
Mar 09 13:14:48 crc kubenswrapper[4723]: I0309 13:14:48.861686 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbhxs"
Need to start a new one" pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:14:48 crc kubenswrapper[4723]: I0309 13:14:48.869654 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbhxs"] Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.062527 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3542ad02-5d52-4927-bd88-2b9968aa6f8a-utilities\") pod \"community-operators-vbhxs\" (UID: \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\") " pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.062588 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k29rs\" (UniqueName: \"kubernetes.io/projected/3542ad02-5d52-4927-bd88-2b9968aa6f8a-kube-api-access-k29rs\") pod \"community-operators-vbhxs\" (UID: \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\") " pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.062624 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3542ad02-5d52-4927-bd88-2b9968aa6f8a-catalog-content\") pod \"community-operators-vbhxs\" (UID: \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\") " pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.104681 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nn5nz" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.167737 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zt7j\" (UniqueName: \"kubernetes.io/projected/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-kube-api-access-2zt7j\") pod \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\" (UID: \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\") " Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.167912 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-catalog-content\") pod \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\" (UID: \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\") " Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.167941 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-utilities\") pod \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\" (UID: \"e3c53e12-83cb-4c57-a875-1ebac75ee7fb\") " Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.168146 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3542ad02-5d52-4927-bd88-2b9968aa6f8a-utilities\") pod \"community-operators-vbhxs\" (UID: \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\") " pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.168185 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k29rs\" (UniqueName: \"kubernetes.io/projected/3542ad02-5d52-4927-bd88-2b9968aa6f8a-kube-api-access-k29rs\") pod \"community-operators-vbhxs\" (UID: \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\") " 
pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.168208 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3542ad02-5d52-4927-bd88-2b9968aa6f8a-catalog-content\") pod \"community-operators-vbhxs\" (UID: \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\") " pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.168686 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3542ad02-5d52-4927-bd88-2b9968aa6f8a-catalog-content\") pod \"community-operators-vbhxs\" (UID: \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\") " pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.168910 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3542ad02-5d52-4927-bd88-2b9968aa6f8a-utilities\") pod \"community-operators-vbhxs\" (UID: \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\") " pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.169005 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-utilities" (OuterVolumeSpecName: "utilities") pod "e3c53e12-83cb-4c57-a875-1ebac75ee7fb" (UID: "e3c53e12-83cb-4c57-a875-1ebac75ee7fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.194143 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-kube-api-access-2zt7j" (OuterVolumeSpecName: "kube-api-access-2zt7j") pod "e3c53e12-83cb-4c57-a875-1ebac75ee7fb" (UID: "e3c53e12-83cb-4c57-a875-1ebac75ee7fb"). InnerVolumeSpecName "kube-api-access-2zt7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.197066 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3c53e12-83cb-4c57-a875-1ebac75ee7fb" (UID: "e3c53e12-83cb-4c57-a875-1ebac75ee7fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.197944 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k29rs\" (UniqueName: \"kubernetes.io/projected/3542ad02-5d52-4927-bd88-2b9968aa6f8a-kube-api-access-k29rs\") pod \"community-operators-vbhxs\" (UID: \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\") " pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.198372 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.269024 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.269061 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.269074 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zt7j\" (UniqueName: \"kubernetes.io/projected/e3c53e12-83cb-4c57-a875-1ebac75ee7fb-kube-api-access-2zt7j\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.474705 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbhxs"] Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.619637 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbhxs" event={"ID":"3542ad02-5d52-4927-bd88-2b9968aa6f8a","Type":"ContainerStarted","Data":"633b0c607ebb543e3d19bbf92b29cf507888981965f088202bfd03fea6f06714"} Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.623023 4723 generic.go:334] "Generic (PLEG): container finished" podID="e3c53e12-83cb-4c57-a875-1ebac75ee7fb" containerID="8a71f54cc0c906eca386042d3fdcbc8a12021e20fcdd3790c5b736db35f357dc" exitCode=0 Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.623053 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nn5nz" event={"ID":"e3c53e12-83cb-4c57-a875-1ebac75ee7fb","Type":"ContainerDied","Data":"8a71f54cc0c906eca386042d3fdcbc8a12021e20fcdd3790c5b736db35f357dc"} Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.623071 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nn5nz" event={"ID":"e3c53e12-83cb-4c57-a875-1ebac75ee7fb","Type":"ContainerDied","Data":"7dbc311fdc65fd614b1e9ee561f56a0e144478768916c11aaae6c5f075928d96"} Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.623088 4723 scope.go:117] "RemoveContainer" containerID="8a71f54cc0c906eca386042d3fdcbc8a12021e20fcdd3790c5b736db35f357dc" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.623088 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nn5nz" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.641839 4723 scope.go:117] "RemoveContainer" containerID="3eb13044e81af524fe202f29c91aed62c57621316d8467ebcd92df3516dc5280" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.659932 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nn5nz"] Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.666697 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nn5nz"] Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.674501 4723 scope.go:117] "RemoveContainer" containerID="0aff7a5e53a17b666b71af021b9dc7c349cd78b94c47349b14984e77191d3085" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.772293 4723 scope.go:117] "RemoveContainer" containerID="8a71f54cc0c906eca386042d3fdcbc8a12021e20fcdd3790c5b736db35f357dc" Mar 09 13:14:49 crc kubenswrapper[4723]: E0309 13:14:49.772951 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a71f54cc0c906eca386042d3fdcbc8a12021e20fcdd3790c5b736db35f357dc\": container with ID starting with 8a71f54cc0c906eca386042d3fdcbc8a12021e20fcdd3790c5b736db35f357dc not found: ID does not exist" containerID="8a71f54cc0c906eca386042d3fdcbc8a12021e20fcdd3790c5b736db35f357dc" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.772999 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a71f54cc0c906eca386042d3fdcbc8a12021e20fcdd3790c5b736db35f357dc"} err="failed to get container status \"8a71f54cc0c906eca386042d3fdcbc8a12021e20fcdd3790c5b736db35f357dc\": rpc error: code = NotFound desc = could not find container \"8a71f54cc0c906eca386042d3fdcbc8a12021e20fcdd3790c5b736db35f357dc\": container with ID starting with 8a71f54cc0c906eca386042d3fdcbc8a12021e20fcdd3790c5b736db35f357dc not found: ID does not exist" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.773025 4723 scope.go:117] "RemoveContainer" containerID="3eb13044e81af524fe202f29c91aed62c57621316d8467ebcd92df3516dc5280" Mar 09 13:14:49 crc kubenswrapper[4723]: E0309 13:14:49.773490 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eb13044e81af524fe202f29c91aed62c57621316d8467ebcd92df3516dc5280\": container with ID starting with 3eb13044e81af524fe202f29c91aed62c57621316d8467ebcd92df3516dc5280 not found: ID does not exist" containerID="3eb13044e81af524fe202f29c91aed62c57621316d8467ebcd92df3516dc5280" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.773519 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb13044e81af524fe202f29c91aed62c57621316d8467ebcd92df3516dc5280"} err="failed to get container status \"3eb13044e81af524fe202f29c91aed62c57621316d8467ebcd92df3516dc5280\": rpc error: code = NotFound desc = could not find container \"3eb13044e81af524fe202f29c91aed62c57621316d8467ebcd92df3516dc5280\": container with ID starting with 3eb13044e81af524fe202f29c91aed62c57621316d8467ebcd92df3516dc5280 not found: ID does not exist" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.773536 4723 scope.go:117] "RemoveContainer" containerID="0aff7a5e53a17b666b71af021b9dc7c349cd78b94c47349b14984e77191d3085" Mar 09 13:14:49 crc kubenswrapper[4723]: E0309 13:14:49.773798 4723 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0aff7a5e53a17b666b71af021b9dc7c349cd78b94c47349b14984e77191d3085\": container with ID starting with 0aff7a5e53a17b666b71af021b9dc7c349cd78b94c47349b14984e77191d3085 not found: ID does not exist" containerID="0aff7a5e53a17b666b71af021b9dc7c349cd78b94c47349b14984e77191d3085" Mar 09 13:14:49 crc kubenswrapper[4723]: I0309 13:14:49.773820 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aff7a5e53a17b666b71af021b9dc7c349cd78b94c47349b14984e77191d3085"} err="failed to get container status \"0aff7a5e53a17b666b71af021b9dc7c349cd78b94c47349b14984e77191d3085\": rpc error: code = NotFound desc = could not find container \"0aff7a5e53a17b666b71af021b9dc7c349cd78b94c47349b14984e77191d3085\": container with ID starting with 0aff7a5e53a17b666b71af021b9dc7c349cd78b94c47349b14984e77191d3085 not found: ID does not exist" Mar 09 13:14:50 crc kubenswrapper[4723]: I0309 13:14:50.570065 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6dc6c4d949-sfpmd" podUID="44c2f727-e5d0-4d0c-967c-6411c885db43" containerName="console" containerID="cri-o://a687edafcebadcfac28d729d6e28ce380e2f6ca4c95e86fed961696bb95fb67f" gracePeriod=15 Mar 09 13:14:50 crc kubenswrapper[4723]: I0309 13:14:50.649315 4723 generic.go:334] "Generic (PLEG): container finished" podID="3542ad02-5d52-4927-bd88-2b9968aa6f8a" containerID="9a50fb0c05994794fe88e3b3d9db2e339cd41aedc65381746fee0b0b16eabc6b" exitCode=0 Mar 09 13:14:50 crc kubenswrapper[4723]: I0309 13:14:50.649376 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbhxs" event={"ID":"3542ad02-5d52-4927-bd88-2b9968aa6f8a","Type":"ContainerDied","Data":"9a50fb0c05994794fe88e3b3d9db2e339cd41aedc65381746fee0b0b16eabc6b"} Mar 09 13:14:50 crc kubenswrapper[4723]: I0309 13:14:50.895645 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c53e12-83cb-4c57-a875-1ebac75ee7fb" path="/var/lib/kubelet/pods/e3c53e12-83cb-4c57-a875-1ebac75ee7fb/volumes" Mar 09 13:14:50 crc kubenswrapper[4723]: I0309 13:14:50.975947 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dc6c4d949-sfpmd_44c2f727-e5d0-4d0c-967c-6411c885db43/console/0.log" Mar 09 13:14:50 crc kubenswrapper[4723]: I0309 13:14:50.976010 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.100070 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wv76\" (UniqueName: \"kubernetes.io/projected/44c2f727-e5d0-4d0c-967c-6411c885db43-kube-api-access-9wv76\") pod \"44c2f727-e5d0-4d0c-967c-6411c885db43\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.100542 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-trusted-ca-bundle\") pod \"44c2f727-e5d0-4d0c-967c-6411c885db43\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.100569 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44c2f727-e5d0-4d0c-967c-6411c885db43-console-serving-cert\") pod \"44c2f727-e5d0-4d0c-967c-6411c885db43\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.101633 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "44c2f727-e5d0-4d0c-967c-6411c885db43" (UID: "44c2f727-e5d0-4d0c-967c-6411c885db43"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.101715 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-service-ca\") pod \"44c2f727-e5d0-4d0c-967c-6411c885db43\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.101745 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44c2f727-e5d0-4d0c-967c-6411c885db43-console-oauth-config\") pod \"44c2f727-e5d0-4d0c-967c-6411c885db43\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.102852 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-service-ca" (OuterVolumeSpecName: "service-ca") pod "44c2f727-e5d0-4d0c-967c-6411c885db43" (UID: "44c2f727-e5d0-4d0c-967c-6411c885db43"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.102962 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-oauth-serving-cert\") pod \"44c2f727-e5d0-4d0c-967c-6411c885db43\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.102994 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-console-config\") pod \"44c2f727-e5d0-4d0c-967c-6411c885db43\" (UID: \"44c2f727-e5d0-4d0c-967c-6411c885db43\") " Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.103590 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "44c2f727-e5d0-4d0c-967c-6411c885db43" (UID: "44c2f727-e5d0-4d0c-967c-6411c885db43"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.103718 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-console-config" (OuterVolumeSpecName: "console-config") pod "44c2f727-e5d0-4d0c-967c-6411c885db43" (UID: "44c2f727-e5d0-4d0c-967c-6411c885db43"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.104144 4723 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.104168 4723 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.104178 4723 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.104190 4723 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44c2f727-e5d0-4d0c-967c-6411c885db43-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.106053 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c2f727-e5d0-4d0c-967c-6411c885db43-kube-api-access-9wv76" (OuterVolumeSpecName: "kube-api-access-9wv76") pod "44c2f727-e5d0-4d0c-967c-6411c885db43" (UID: "44c2f727-e5d0-4d0c-967c-6411c885db43"). InnerVolumeSpecName "kube-api-access-9wv76". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.116076 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c2f727-e5d0-4d0c-967c-6411c885db43-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "44c2f727-e5d0-4d0c-967c-6411c885db43" (UID: "44c2f727-e5d0-4d0c-967c-6411c885db43"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.118049 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c2f727-e5d0-4d0c-967c-6411c885db43-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "44c2f727-e5d0-4d0c-967c-6411c885db43" (UID: "44c2f727-e5d0-4d0c-967c-6411c885db43"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.205734 4723 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44c2f727-e5d0-4d0c-967c-6411c885db43-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.205766 4723 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44c2f727-e5d0-4d0c-967c-6411c885db43-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.205776 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wv76\" (UniqueName: \"kubernetes.io/projected/44c2f727-e5d0-4d0c-967c-6411c885db43-kube-api-access-9wv76\") on node \"crc\" DevicePath \"\"" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.659876 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dc6c4d949-sfpmd_44c2f727-e5d0-4d0c-967c-6411c885db43/console/0.log" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.660125 4723 generic.go:334] "Generic (PLEG): container finished" podID="44c2f727-e5d0-4d0c-967c-6411c885db43" containerID="a687edafcebadcfac28d729d6e28ce380e2f6ca4c95e86fed961696bb95fb67f" exitCode=2 Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.660187 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dc6c4d949-sfpmd" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.660185 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc6c4d949-sfpmd" event={"ID":"44c2f727-e5d0-4d0c-967c-6411c885db43","Type":"ContainerDied","Data":"a687edafcebadcfac28d729d6e28ce380e2f6ca4c95e86fed961696bb95fb67f"} Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.660293 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc6c4d949-sfpmd" event={"ID":"44c2f727-e5d0-4d0c-967c-6411c885db43","Type":"ContainerDied","Data":"e52fc4bed4c03c48c4b4f23c841546827d815994ef080fceb59077ab535ca0b6"} Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.660310 4723 scope.go:117] "RemoveContainer" containerID="a687edafcebadcfac28d729d6e28ce380e2f6ca4c95e86fed961696bb95fb67f" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.664709 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbhxs" event={"ID":"3542ad02-5d52-4927-bd88-2b9968aa6f8a","Type":"ContainerStarted","Data":"3492ccc5677e68d30f1bd74a0263e8769d8c25a8572aa52296b326f15326b66c"} Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.681274 4723 scope.go:117] "RemoveContainer" containerID="a687edafcebadcfac28d729d6e28ce380e2f6ca4c95e86fed961696bb95fb67f" Mar 09 13:14:51 crc kubenswrapper[4723]: E0309 13:14:51.682121 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a687edafcebadcfac28d729d6e28ce380e2f6ca4c95e86fed961696bb95fb67f\": container with ID starting with a687edafcebadcfac28d729d6e28ce380e2f6ca4c95e86fed961696bb95fb67f not found: ID does not exist" containerID="a687edafcebadcfac28d729d6e28ce380e2f6ca4c95e86fed961696bb95fb67f" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.682163 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a687edafcebadcfac28d729d6e28ce380e2f6ca4c95e86fed961696bb95fb67f"} err="failed to get container status \"a687edafcebadcfac28d729d6e28ce380e2f6ca4c95e86fed961696bb95fb67f\": rpc error: code = NotFound desc = could not find container \"a687edafcebadcfac28d729d6e28ce380e2f6ca4c95e86fed961696bb95fb67f\": container with ID starting with a687edafcebadcfac28d729d6e28ce380e2f6ca4c95e86fed961696bb95fb67f not found: ID does not exist" Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.742640 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dc6c4d949-sfpmd"] Mar 09 13:14:51 crc kubenswrapper[4723]: I0309 13:14:51.750214 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6dc6c4d949-sfpmd"] Mar 09 13:14:52 crc kubenswrapper[4723]: I0309 13:14:52.674995 4723 generic.go:334] "Generic (PLEG): container finished" podID="3542ad02-5d52-4927-bd88-2b9968aa6f8a" containerID="3492ccc5677e68d30f1bd74a0263e8769d8c25a8572aa52296b326f15326b66c" exitCode=0 Mar 09 13:14:52 crc kubenswrapper[4723]: I0309 13:14:52.675205 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbhxs" event={"ID":"3542ad02-5d52-4927-bd88-2b9968aa6f8a","Type":"ContainerDied","Data":"3492ccc5677e68d30f1bd74a0263e8769d8c25a8572aa52296b326f15326b66c"} Mar 09 13:14:52 crc kubenswrapper[4723]: I0309 13:14:52.887524 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c2f727-e5d0-4d0c-967c-6411c885db43" 
path="/var/lib/kubelet/pods/44c2f727-e5d0-4d0c-967c-6411c885db43/volumes" Mar 09 13:14:53 crc kubenswrapper[4723]: I0309 13:14:53.683938 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbhxs" event={"ID":"3542ad02-5d52-4927-bd88-2b9968aa6f8a","Type":"ContainerStarted","Data":"4c78f77d79d4207376e22dcfae8edb10828a2bb00e18d81ea3b924401f6147f3"} Mar 09 13:14:53 crc kubenswrapper[4723]: I0309 13:14:53.705693 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vbhxs" podStartSLOduration=3.316925686 podStartE2EDuration="5.705651875s" podCreationTimestamp="2026-03-09 13:14:48 +0000 UTC" firstStartedPulling="2026-03-09 13:14:50.651119465 +0000 UTC m=+964.665587005" lastFinishedPulling="2026-03-09 13:14:53.039845654 +0000 UTC m=+967.054313194" observedRunningTime="2026-03-09 13:14:53.70099819 +0000 UTC m=+967.715465730" watchObservedRunningTime="2026-03-09 13:14:53.705651875 +0000 UTC m=+967.720119435" Mar 09 13:14:55 crc kubenswrapper[4723]: I0309 13:14:55.920828 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk"] Mar 09 13:14:55 crc kubenswrapper[4723]: E0309 13:14:55.921174 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c53e12-83cb-4c57-a875-1ebac75ee7fb" containerName="extract-content" Mar 09 13:14:55 crc kubenswrapper[4723]: I0309 13:14:55.921189 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c53e12-83cb-4c57-a875-1ebac75ee7fb" containerName="extract-content" Mar 09 13:14:55 crc kubenswrapper[4723]: E0309 13:14:55.921211 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c2f727-e5d0-4d0c-967c-6411c885db43" containerName="console" Mar 09 13:14:55 crc kubenswrapper[4723]: I0309 13:14:55.921218 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c2f727-e5d0-4d0c-967c-6411c885db43" containerName="console" Mar 09 13:14:55 crc kubenswrapper[4723]: E0309 13:14:55.921232 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c53e12-83cb-4c57-a875-1ebac75ee7fb" containerName="registry-server" Mar 09 13:14:55 crc kubenswrapper[4723]: I0309 13:14:55.921240 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c53e12-83cb-4c57-a875-1ebac75ee7fb" containerName="registry-server" Mar 09 13:14:55 crc kubenswrapper[4723]: E0309 13:14:55.921255 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c53e12-83cb-4c57-a875-1ebac75ee7fb" containerName="extract-utilities" Mar 09 13:14:55 crc kubenswrapper[4723]: I0309 13:14:55.921263 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c53e12-83cb-4c57-a875-1ebac75ee7fb" containerName="extract-utilities" Mar 09 13:14:55 crc kubenswrapper[4723]: I0309 13:14:55.921431 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c53e12-83cb-4c57-a875-1ebac75ee7fb" containerName="registry-server" Mar 09 13:14:55 crc kubenswrapper[4723]: I0309 13:14:55.921448 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c2f727-e5d0-4d0c-967c-6411c885db43" containerName="console" Mar 09 13:14:55 crc kubenswrapper[4723]: I0309 13:14:55.922692 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" Mar 09 13:14:55 crc kubenswrapper[4723]: I0309 13:14:55.925558 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 13:14:55 crc kubenswrapper[4723]: I0309 13:14:55.936495 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk"] Mar 09 13:14:55 crc kubenswrapper[4723]: I0309 13:14:55.975891 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9mr2\" (UniqueName: \"kubernetes.io/projected/dec42794-de2a-4e6d-9e9d-fe400f4052f3-kube-api-access-d9mr2\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk\" (UID: \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" Mar 09 13:14:55 crc kubenswrapper[4723]: I0309 13:14:55.975999 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dec42794-de2a-4e6d-9e9d-fe400f4052f3-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk\" (UID: \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" Mar 09 13:14:55 crc kubenswrapper[4723]: I0309 13:14:55.976267 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dec42794-de2a-4e6d-9e9d-fe400f4052f3-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk\" (UID: \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" Mar 09 13:14:56 crc kubenswrapper[4723]: I0309 13:14:56.077488 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dec42794-de2a-4e6d-9e9d-fe400f4052f3-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk\" (UID: \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" Mar 09 13:14:56 crc kubenswrapper[4723]: I0309 13:14:56.077623 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dec42794-de2a-4e6d-9e9d-fe400f4052f3-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk\" (UID: \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" Mar 09 13:14:56 crc kubenswrapper[4723]: I0309 13:14:56.077700 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9mr2\" (UniqueName: \"kubernetes.io/projected/dec42794-de2a-4e6d-9e9d-fe400f4052f3-kube-api-access-d9mr2\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk\" (UID: \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" Mar 09 13:14:56 crc kubenswrapper[4723]: I0309 13:14:56.078060 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/dec42794-de2a-4e6d-9e9d-fe400f4052f3-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk\" (UID: \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" Mar 09 13:14:56 crc kubenswrapper[4723]: I0309 13:14:56.078453 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dec42794-de2a-4e6d-9e9d-fe400f4052f3-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk\" (UID: \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" Mar 09 13:14:56 crc kubenswrapper[4723]: I0309 13:14:56.095465 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9mr2\" (UniqueName: \"kubernetes.io/projected/dec42794-de2a-4e6d-9e9d-fe400f4052f3-kube-api-access-d9mr2\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk\" (UID: \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" Mar 09 13:14:56 crc kubenswrapper[4723]: I0309 13:14:56.248456 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" Mar 09 13:14:56 crc kubenswrapper[4723]: I0309 13:14:56.674298 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk"] Mar 09 13:14:56 crc kubenswrapper[4723]: I0309 13:14:56.705709 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" event={"ID":"dec42794-de2a-4e6d-9e9d-fe400f4052f3","Type":"ContainerStarted","Data":"55fdae9bb9226ddcfbcd84231b80d6af346c630fea7c5719a35b0b29210bb165"} Mar 09 13:14:57 crc kubenswrapper[4723]: I0309 13:14:57.713147 4723 generic.go:334] "Generic (PLEG): container finished" podID="dec42794-de2a-4e6d-9e9d-fe400f4052f3" containerID="09d13b403d0131da3768bb623691197d5cec5948b5a74d8b99e56fc761a912dc" exitCode=0 Mar 09 13:14:57 crc kubenswrapper[4723]: I0309 13:14:57.713409 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" event={"ID":"dec42794-de2a-4e6d-9e9d-fe400f4052f3","Type":"ContainerDied","Data":"09d13b403d0131da3768bb623691197d5cec5948b5a74d8b99e56fc761a912dc"} Mar 09 13:14:59 crc kubenswrapper[4723]: I0309 13:14:59.199680 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:14:59 crc kubenswrapper[4723]: I0309 13:14:59.200008 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:14:59 crc kubenswrapper[4723]: I0309 13:14:59.274140 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:14:59 crc kubenswrapper[4723]: I0309 13:14:59.734751 4723 generic.go:334] "Generic (PLEG): container finished" podID="dec42794-de2a-4e6d-9e9d-fe400f4052f3" containerID="112ac69eab400e122512638fa3bd42c667af64d2e2db20e81c565ea4ead4ee7b" exitCode=0 Mar 09 13:14:59 crc kubenswrapper[4723]: I0309 13:14:59.734824 4723 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" event={"ID":"dec42794-de2a-4e6d-9e9d-fe400f4052f3","Type":"ContainerDied","Data":"112ac69eab400e122512638fa3bd42c667af64d2e2db20e81c565ea4ead4ee7b"} Mar 09 13:14:59 crc kubenswrapper[4723]: I0309 13:14:59.786262 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.147908 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj"] Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.148914 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.151976 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.152638 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.161491 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj"] Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.244351 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-secret-volume\") pod \"collect-profiles-29551035-7fwvj\" (UID: \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.244416 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-config-volume\") pod \"collect-profiles-29551035-7fwvj\" (UID: \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.244475 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x7wp\" (UniqueName: \"kubernetes.io/projected/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-kube-api-access-6x7wp\") pod \"collect-profiles-29551035-7fwvj\" (UID: \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.345667 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-secret-volume\") pod \"collect-profiles-29551035-7fwvj\" (UID: \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.345790 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-config-volume\") pod \"collect-profiles-29551035-7fwvj\" (UID: \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.345934 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x7wp\" (UniqueName: \"kubernetes.io/projected/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-kube-api-access-6x7wp\") pod \"collect-profiles-29551035-7fwvj\" (UID: \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.347710 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-config-volume\") pod \"collect-profiles-29551035-7fwvj\" (UID: \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.359637 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-secret-volume\") pod \"collect-profiles-29551035-7fwvj\" (UID: \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.363335 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x7wp\" (UniqueName: \"kubernetes.io/projected/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-kube-api-access-6x7wp\") pod \"collect-profiles-29551035-7fwvj\" (UID: \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.466542 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.725617 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj"] Mar 09 13:15:00 crc kubenswrapper[4723]: W0309 13:15:00.732301 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f5f9e3b_034c_47c8_810b_2bd21bd1c54d.slice/crio-17b5a5d179229331639d7bd60f2ca99b05fe66e34f02189fa29995a43a440691 WatchSource:0}: Error finding container 17b5a5d179229331639d7bd60f2ca99b05fe66e34f02189fa29995a43a440691: Status 404 returned error can't find the container with id 17b5a5d179229331639d7bd60f2ca99b05fe66e34f02189fa29995a43a440691 Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.745089 4723 generic.go:334] "Generic (PLEG): container finished" podID="dec42794-de2a-4e6d-9e9d-fe400f4052f3" containerID="750adbbd7bd58f4ac145fbf139b6dfbf7c1e263e6e43e1af1dd9df83498107ff" exitCode=0 Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.745158 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" event={"ID":"dec42794-de2a-4e6d-9e9d-fe400f4052f3","Type":"ContainerDied","Data":"750adbbd7bd58f4ac145fbf139b6dfbf7c1e263e6e43e1af1dd9df83498107ff"} Mar 09 13:15:00 crc kubenswrapper[4723]: I0309 13:15:00.747136 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" event={"ID":"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d","Type":"ContainerStarted","Data":"17b5a5d179229331639d7bd60f2ca99b05fe66e34f02189fa29995a43a440691"} Mar 09 13:15:01 crc kubenswrapper[4723]: I0309 13:15:01.756028 4723 generic.go:334] "Generic (PLEG): container finished" podID="1f5f9e3b-034c-47c8-810b-2bd21bd1c54d" containerID="d3ea6bd376bc3cacc69cb96a929ff5e610086f4f24a0a2264dc3ebdc2f2624ed" exitCode=0 Mar 09 13:15:01 crc kubenswrapper[4723]: I0309 13:15:01.756219 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" event={"ID":"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d","Type":"ContainerDied","Data":"d3ea6bd376bc3cacc69cb96a929ff5e610086f4f24a0a2264dc3ebdc2f2624ed"} Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.065739 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.176148 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9mr2\" (UniqueName: \"kubernetes.io/projected/dec42794-de2a-4e6d-9e9d-fe400f4052f3-kube-api-access-d9mr2\") pod \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\" (UID: \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\") " Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.176213 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dec42794-de2a-4e6d-9e9d-fe400f4052f3-util\") pod \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\" (UID: \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\") " Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.176247 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dec42794-de2a-4e6d-9e9d-fe400f4052f3-bundle\") pod \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\" (UID: \"dec42794-de2a-4e6d-9e9d-fe400f4052f3\") " Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.177357 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec42794-de2a-4e6d-9e9d-fe400f4052f3-bundle" (OuterVolumeSpecName: "bundle") pod "dec42794-de2a-4e6d-9e9d-fe400f4052f3" (UID: "dec42794-de2a-4e6d-9e9d-fe400f4052f3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.184045 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec42794-de2a-4e6d-9e9d-fe400f4052f3-kube-api-access-d9mr2" (OuterVolumeSpecName: "kube-api-access-d9mr2") pod "dec42794-de2a-4e6d-9e9d-fe400f4052f3" (UID: "dec42794-de2a-4e6d-9e9d-fe400f4052f3"). InnerVolumeSpecName "kube-api-access-d9mr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.209301 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec42794-de2a-4e6d-9e9d-fe400f4052f3-util" (OuterVolumeSpecName: "util") pod "dec42794-de2a-4e6d-9e9d-fe400f4052f3" (UID: "dec42794-de2a-4e6d-9e9d-fe400f4052f3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.278096 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9mr2\" (UniqueName: \"kubernetes.io/projected/dec42794-de2a-4e6d-9e9d-fe400f4052f3-kube-api-access-d9mr2\") on node \"crc\" DevicePath \"\"" Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.278142 4723 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dec42794-de2a-4e6d-9e9d-fe400f4052f3-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.278152 4723 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dec42794-de2a-4e6d-9e9d-fe400f4052f3-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.471547 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbhxs"] Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.472005 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vbhxs" podUID="3542ad02-5d52-4927-bd88-2b9968aa6f8a" containerName="registry-server" containerID="cri-o://4c78f77d79d4207376e22dcfae8edb10828a2bb00e18d81ea3b924401f6147f3" gracePeriod=2 Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.786098 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" event={"ID":"dec42794-de2a-4e6d-9e9d-fe400f4052f3","Type":"ContainerDied","Data":"55fdae9bb9226ddcfbcd84231b80d6af346c630fea7c5719a35b0b29210bb165"} Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.786391 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55fdae9bb9226ddcfbcd84231b80d6af346c630fea7c5719a35b0b29210bb165" Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.786572 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk" Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.798003 4723 generic.go:334] "Generic (PLEG): container finished" podID="3542ad02-5d52-4927-bd88-2b9968aa6f8a" containerID="4c78f77d79d4207376e22dcfae8edb10828a2bb00e18d81ea3b924401f6147f3" exitCode=0 Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.798149 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbhxs" event={"ID":"3542ad02-5d52-4927-bd88-2b9968aa6f8a","Type":"ContainerDied","Data":"4c78f77d79d4207376e22dcfae8edb10828a2bb00e18d81ea3b924401f6147f3"} Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.884299 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.999097 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k29rs\" (UniqueName: \"kubernetes.io/projected/3542ad02-5d52-4927-bd88-2b9968aa6f8a-kube-api-access-k29rs\") pod \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\" (UID: \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\") " Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.999322 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3542ad02-5d52-4927-bd88-2b9968aa6f8a-catalog-content\") pod \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\" (UID: \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\") " Mar 09 13:15:02 crc kubenswrapper[4723]: I0309 13:15:02.999410 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3542ad02-5d52-4927-bd88-2b9968aa6f8a-utilities\") pod \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\" (UID: \"3542ad02-5d52-4927-bd88-2b9968aa6f8a\") " Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.000757 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3542ad02-5d52-4927-bd88-2b9968aa6f8a-utilities" (OuterVolumeSpecName: "utilities") pod "3542ad02-5d52-4927-bd88-2b9968aa6f8a" (UID: "3542ad02-5d52-4927-bd88-2b9968aa6f8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.002954 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3542ad02-5d52-4927-bd88-2b9968aa6f8a-kube-api-access-k29rs" (OuterVolumeSpecName: "kube-api-access-k29rs") pod "3542ad02-5d52-4927-bd88-2b9968aa6f8a" (UID: "3542ad02-5d52-4927-bd88-2b9968aa6f8a"). InnerVolumeSpecName "kube-api-access-k29rs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.072023 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3542ad02-5d52-4927-bd88-2b9968aa6f8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3542ad02-5d52-4927-bd88-2b9968aa6f8a" (UID: "3542ad02-5d52-4927-bd88-2b9968aa6f8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.072195 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.101637 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3542ad02-5d52-4927-bd88-2b9968aa6f8a-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.101683 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k29rs\" (UniqueName: \"kubernetes.io/projected/3542ad02-5d52-4927-bd88-2b9968aa6f8a-kube-api-access-k29rs\") on node \"crc\" DevicePath \"\"" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.101708 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3542ad02-5d52-4927-bd88-2b9968aa6f8a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.203328 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x7wp\" (UniqueName: \"kubernetes.io/projected/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-kube-api-access-6x7wp\") pod \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\" (UID: \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\") " Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.203379 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-secret-volume\") pod \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\" (UID: \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\") " Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.203430 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-config-volume\") pod \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\" (UID: \"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d\") " Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.204335 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-config-volume" (OuterVolumeSpecName: "config-volume") pod "1f5f9e3b-034c-47c8-810b-2bd21bd1c54d" (UID: "1f5f9e3b-034c-47c8-810b-2bd21bd1c54d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.212927 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-kube-api-access-6x7wp" (OuterVolumeSpecName: "kube-api-access-6x7wp") pod "1f5f9e3b-034c-47c8-810b-2bd21bd1c54d" (UID: "1f5f9e3b-034c-47c8-810b-2bd21bd1c54d"). InnerVolumeSpecName "kube-api-access-6x7wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.216955 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1f5f9e3b-034c-47c8-810b-2bd21bd1c54d" (UID: "1f5f9e3b-034c-47c8-810b-2bd21bd1c54d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.305597 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x7wp\" (UniqueName: \"kubernetes.io/projected/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-kube-api-access-6x7wp\") on node \"crc\" DevicePath \"\"" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.305626 4723 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.305635 4723 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.806999 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" event={"ID":"1f5f9e3b-034c-47c8-810b-2bd21bd1c54d","Type":"ContainerDied","Data":"17b5a5d179229331639d7bd60f2ca99b05fe66e34f02189fa29995a43a440691"} Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.807033 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17b5a5d179229331639d7bd60f2ca99b05fe66e34f02189fa29995a43a440691" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.807053 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.809405 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbhxs" event={"ID":"3542ad02-5d52-4927-bd88-2b9968aa6f8a","Type":"ContainerDied","Data":"633b0c607ebb543e3d19bbf92b29cf507888981965f088202bfd03fea6f06714"} Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.809443 4723 scope.go:117] "RemoveContainer" containerID="4c78f77d79d4207376e22dcfae8edb10828a2bb00e18d81ea3b924401f6147f3" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.809445 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbhxs" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.827851 4723 scope.go:117] "RemoveContainer" containerID="3492ccc5677e68d30f1bd74a0263e8769d8c25a8572aa52296b326f15326b66c" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.856373 4723 scope.go:117] "RemoveContainer" containerID="9a50fb0c05994794fe88e3b3d9db2e339cd41aedc65381746fee0b0b16eabc6b" Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.857813 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbhxs"] Mar 09 13:15:03 crc kubenswrapper[4723]: I0309 13:15:03.867788 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vbhxs"] Mar 09 13:15:04 crc kubenswrapper[4723]: I0309 13:15:04.914222 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3542ad02-5d52-4927-bd88-2b9968aa6f8a" path="/var/lib/kubelet/pods/3542ad02-5d52-4927-bd88-2b9968aa6f8a/volumes" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.206511 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv"] Mar 09 13:15:13 crc kubenswrapper[4723]: E0309 13:15:13.207233 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3542ad02-5d52-4927-bd88-2b9968aa6f8a" containerName="extract-utilities" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.207246 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="3542ad02-5d52-4927-bd88-2b9968aa6f8a" containerName="extract-utilities" Mar 09 13:15:13 crc kubenswrapper[4723]: E0309 13:15:13.207257 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3542ad02-5d52-4927-bd88-2b9968aa6f8a" containerName="extract-content" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.207263 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="3542ad02-5d52-4927-bd88-2b9968aa6f8a" containerName="extract-content" Mar 09 13:15:13 crc kubenswrapper[4723]: E0309 13:15:13.207279 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f5f9e3b-034c-47c8-810b-2bd21bd1c54d" containerName="collect-profiles" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.207285 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f5f9e3b-034c-47c8-810b-2bd21bd1c54d" containerName="collect-profiles" Mar 09 13:15:13 crc kubenswrapper[4723]: E0309 13:15:13.207293 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec42794-de2a-4e6d-9e9d-fe400f4052f3" containerName="util" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.207299 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec42794-de2a-4e6d-9e9d-fe400f4052f3" containerName="util" Mar 09 13:15:13 crc kubenswrapper[4723]: E0309 13:15:13.207311 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec42794-de2a-4e6d-9e9d-fe400f4052f3" containerName="extract" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.207317 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec42794-de2a-4e6d-9e9d-fe400f4052f3" containerName="extract" Mar 09 13:15:13 crc kubenswrapper[4723]: E0309 13:15:13.207328 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec42794-de2a-4e6d-9e9d-fe400f4052f3" containerName="pull" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.207333 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec42794-de2a-4e6d-9e9d-fe400f4052f3" containerName="pull" Mar 
09 13:15:13 crc kubenswrapper[4723]: E0309 13:15:13.207342 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3542ad02-5d52-4927-bd88-2b9968aa6f8a" containerName="registry-server" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.207349 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="3542ad02-5d52-4927-bd88-2b9968aa6f8a" containerName="registry-server" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.207481 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f5f9e3b-034c-47c8-810b-2bd21bd1c54d" containerName="collect-profiles" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.207491 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="3542ad02-5d52-4927-bd88-2b9968aa6f8a" containerName="registry-server" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.207497 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec42794-de2a-4e6d-9e9d-fe400f4052f3" containerName="extract" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.208004 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.212286 4723 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.212490 4723 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.212521 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.212750 4723 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-rkw24" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.212937 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.224218 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv"] Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.368354 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4fc91e18-da85-44c6-96c7-2c15123b9980-apiservice-cert\") pod \"metallb-operator-controller-manager-6d8b459b8b-tm8sv\" (UID: \"4fc91e18-da85-44c6-96c7-2c15123b9980\") " pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.368399 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4fc91e18-da85-44c6-96c7-2c15123b9980-webhook-cert\") pod \"metallb-operator-controller-manager-6d8b459b8b-tm8sv\" (UID: \"4fc91e18-da85-44c6-96c7-2c15123b9980\") " pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.368460 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bcnv\" (UniqueName: \"kubernetes.io/projected/4fc91e18-da85-44c6-96c7-2c15123b9980-kube-api-access-8bcnv\") pod 
\"metallb-operator-controller-manager-6d8b459b8b-tm8sv\" (UID: \"4fc91e18-da85-44c6-96c7-2c15123b9980\") " pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.470216 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4fc91e18-da85-44c6-96c7-2c15123b9980-apiservice-cert\") pod \"metallb-operator-controller-manager-6d8b459b8b-tm8sv\" (UID: \"4fc91e18-da85-44c6-96c7-2c15123b9980\") " pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.470260 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4fc91e18-da85-44c6-96c7-2c15123b9980-webhook-cert\") pod \"metallb-operator-controller-manager-6d8b459b8b-tm8sv\" (UID: \"4fc91e18-da85-44c6-96c7-2c15123b9980\") " pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.470311 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bcnv\" (UniqueName: \"kubernetes.io/projected/4fc91e18-da85-44c6-96c7-2c15123b9980-kube-api-access-8bcnv\") pod \"metallb-operator-controller-manager-6d8b459b8b-tm8sv\" (UID: \"4fc91e18-da85-44c6-96c7-2c15123b9980\") " pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.479795 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4fc91e18-da85-44c6-96c7-2c15123b9980-apiservice-cert\") pod \"metallb-operator-controller-manager-6d8b459b8b-tm8sv\" (UID: \"4fc91e18-da85-44c6-96c7-2c15123b9980\") " pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.479842 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4fc91e18-da85-44c6-96c7-2c15123b9980-webhook-cert\") pod \"metallb-operator-controller-manager-6d8b459b8b-tm8sv\" (UID: \"4fc91e18-da85-44c6-96c7-2c15123b9980\") " pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.493770 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bcnv\" (UniqueName: \"kubernetes.io/projected/4fc91e18-da85-44c6-96c7-2c15123b9980-kube-api-access-8bcnv\") pod \"metallb-operator-controller-manager-6d8b459b8b-tm8sv\" (UID: \"4fc91e18-da85-44c6-96c7-2c15123b9980\") " pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.567767 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.703411 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-997cfb689-c8857"] Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.704455 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.707773 4723 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-wf4r7" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.708031 4723 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.714158 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-997cfb689-c8857"] Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.716649 4723 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.879133 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxhrw\" (UniqueName: \"kubernetes.io/projected/881230db-85c7-4159-b1dd-f537ed6baece-kube-api-access-fxhrw\") pod \"metallb-operator-webhook-server-997cfb689-c8857\" (UID: \"881230db-85c7-4159-b1dd-f537ed6baece\") " pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.879198 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/881230db-85c7-4159-b1dd-f537ed6baece-apiservice-cert\") pod \"metallb-operator-webhook-server-997cfb689-c8857\" (UID: \"881230db-85c7-4159-b1dd-f537ed6baece\") " pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.879510 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/881230db-85c7-4159-b1dd-f537ed6baece-webhook-cert\") pod \"metallb-operator-webhook-server-997cfb689-c8857\" (UID: \"881230db-85c7-4159-b1dd-f537ed6baece\") " pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.981576 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxhrw\" (UniqueName: \"kubernetes.io/projected/881230db-85c7-4159-b1dd-f537ed6baece-kube-api-access-fxhrw\") pod \"metallb-operator-webhook-server-997cfb689-c8857\" (UID: \"881230db-85c7-4159-b1dd-f537ed6baece\") " pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.982074 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/881230db-85c7-4159-b1dd-f537ed6baece-apiservice-cert\") pod \"metallb-operator-webhook-server-997cfb689-c8857\" (UID: \"881230db-85c7-4159-b1dd-f537ed6baece\") " pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.982223 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/881230db-85c7-4159-b1dd-f537ed6baece-webhook-cert\") pod \"metallb-operator-webhook-server-997cfb689-c8857\" (UID: \"881230db-85c7-4159-b1dd-f537ed6baece\") " pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.989610 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/881230db-85c7-4159-b1dd-f537ed6baece-apiservice-cert\") pod \"metallb-operator-webhook-server-997cfb689-c8857\" (UID: \"881230db-85c7-4159-b1dd-f537ed6baece\") " pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 13:15:13 crc kubenswrapper[4723]: I0309 13:15:13.989676 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/881230db-85c7-4159-b1dd-f537ed6baece-webhook-cert\") pod \"metallb-operator-webhook-server-997cfb689-c8857\" (UID: \"881230db-85c7-4159-b1dd-f537ed6baece\") " pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 13:15:14 crc kubenswrapper[4723]: I0309 13:15:14.005181 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxhrw\" (UniqueName: \"kubernetes.io/projected/881230db-85c7-4159-b1dd-f537ed6baece-kube-api-access-fxhrw\") pod \"metallb-operator-webhook-server-997cfb689-c8857\" (UID: \"881230db-85c7-4159-b1dd-f537ed6baece\") " pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 13:15:14 crc kubenswrapper[4723]: I0309 13:15:14.031475 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 13:15:14 crc kubenswrapper[4723]: I0309 13:15:14.065351 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv"] Mar 09 13:15:14 crc kubenswrapper[4723]: I0309 13:15:14.454379 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-997cfb689-c8857"] Mar 09 13:15:14 crc kubenswrapper[4723]: W0309 13:15:14.457988 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod881230db_85c7_4159_b1dd_f537ed6baece.slice/crio-c374d2dfbfd520fb949dd0d7a1aed134d72fb99dcb5f6cb174396b19582f82c4 WatchSource:0}: Error finding container c374d2dfbfd520fb949dd0d7a1aed134d72fb99dcb5f6cb174396b19582f82c4: Status 404 returned error can't find the container with id c374d2dfbfd520fb949dd0d7a1aed134d72fb99dcb5f6cb174396b19582f82c4 Mar 09 13:15:14 crc kubenswrapper[4723]: I0309 13:15:14.912895 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" event={"ID":"881230db-85c7-4159-b1dd-f537ed6baece","Type":"ContainerStarted","Data":"c374d2dfbfd520fb949dd0d7a1aed134d72fb99dcb5f6cb174396b19582f82c4"} Mar 09 13:15:14 crc kubenswrapper[4723]: I0309 13:15:14.912955 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" event={"ID":"4fc91e18-da85-44c6-96c7-2c15123b9980","Type":"ContainerStarted","Data":"32586a8beebc3446f61673a7d91ece1fd02c1429e3a88c7eb56535cd150f86a0"} Mar 09 13:15:16 crc kubenswrapper[4723]: I0309 13:15:16.283705 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tdl8s"] Mar 09 13:15:16 crc kubenswrapper[4723]: I0309 13:15:16.285396 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:16 crc kubenswrapper[4723]: I0309 13:15:16.297789 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tdl8s"] Mar 09 13:15:16 crc kubenswrapper[4723]: I0309 13:15:16.321100 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9933fffa-cfbd-48de-85ca-0864644a9f0f-utilities\") pod \"certified-operators-tdl8s\" (UID: \"9933fffa-cfbd-48de-85ca-0864644a9f0f\") " pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:16 crc kubenswrapper[4723]: I0309 13:15:16.321647 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9933fffa-cfbd-48de-85ca-0864644a9f0f-catalog-content\") pod \"certified-operators-tdl8s\" (UID: \"9933fffa-cfbd-48de-85ca-0864644a9f0f\") " pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:16 crc kubenswrapper[4723]: I0309 13:15:16.321724 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5d54\" (UniqueName: \"kubernetes.io/projected/9933fffa-cfbd-48de-85ca-0864644a9f0f-kube-api-access-h5d54\") pod \"certified-operators-tdl8s\" (UID: \"9933fffa-cfbd-48de-85ca-0864644a9f0f\") " pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:16 crc kubenswrapper[4723]: I0309 13:15:16.425390 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5d54\" (UniqueName: \"kubernetes.io/projected/9933fffa-cfbd-48de-85ca-0864644a9f0f-kube-api-access-h5d54\") pod \"certified-operators-tdl8s\" (UID: \"9933fffa-cfbd-48de-85ca-0864644a9f0f\") " pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:16 crc kubenswrapper[4723]: I0309 13:15:16.425511 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9933fffa-cfbd-48de-85ca-0864644a9f0f-utilities\") pod \"certified-operators-tdl8s\" (UID: \"9933fffa-cfbd-48de-85ca-0864644a9f0f\") " pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:16 crc kubenswrapper[4723]: I0309 13:15:16.425632 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9933fffa-cfbd-48de-85ca-0864644a9f0f-catalog-content\") pod \"certified-operators-tdl8s\" (UID: \"9933fffa-cfbd-48de-85ca-0864644a9f0f\") " pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:16 crc kubenswrapper[4723]: I0309 13:15:16.426470 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9933fffa-cfbd-48de-85ca-0864644a9f0f-catalog-content\") pod \"certified-operators-tdl8s\" (UID: \"9933fffa-cfbd-48de-85ca-0864644a9f0f\") " pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:16 crc kubenswrapper[4723]: I0309 13:15:16.426524 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9933fffa-cfbd-48de-85ca-0864644a9f0f-utilities\") pod \"certified-operators-tdl8s\" (UID: \"9933fffa-cfbd-48de-85ca-0864644a9f0f\") " pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:16 crc kubenswrapper[4723]: I0309 13:15:16.447735 4723 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h5d54\" (UniqueName: \"kubernetes.io/projected/9933fffa-cfbd-48de-85ca-0864644a9f0f-kube-api-access-h5d54\") pod \"certified-operators-tdl8s\" (UID: \"9933fffa-cfbd-48de-85ca-0864644a9f0f\") " pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:16 crc kubenswrapper[4723]: I0309 13:15:16.628400 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:17 crc kubenswrapper[4723]: I0309 13:15:17.081175 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tdl8s"] Mar 09 13:15:17 crc kubenswrapper[4723]: I0309 13:15:17.971369 4723 generic.go:334] "Generic (PLEG): container finished" podID="9933fffa-cfbd-48de-85ca-0864644a9f0f" containerID="d2872d0d2b7c386b1c868bd15466d79498e1d53e3d9cd8d76db3761afc7e7fb8" exitCode=0 Mar 09 13:15:17 crc kubenswrapper[4723]: I0309 13:15:17.971713 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdl8s" event={"ID":"9933fffa-cfbd-48de-85ca-0864644a9f0f","Type":"ContainerDied","Data":"d2872d0d2b7c386b1c868bd15466d79498e1d53e3d9cd8d76db3761afc7e7fb8"} Mar 09 13:15:17 crc kubenswrapper[4723]: I0309 13:15:17.971750 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdl8s" event={"ID":"9933fffa-cfbd-48de-85ca-0864644a9f0f","Type":"ContainerStarted","Data":"d1c3c043ce6dde22b0b65349b6a440c93d53ddcd9ed100c6436b089a900c9f12"} Mar 09 13:15:21 crc kubenswrapper[4723]: I0309 13:15:21.001479 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdl8s" event={"ID":"9933fffa-cfbd-48de-85ca-0864644a9f0f","Type":"ContainerStarted","Data":"2b8808a88a3406654d3f6e3b98c4e63a7e574274a3cafbc3cc6869d81e535459"} Mar 09 13:15:21 crc kubenswrapper[4723]: I0309 13:15:21.003699 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" event={"ID":"4fc91e18-da85-44c6-96c7-2c15123b9980","Type":"ContainerStarted","Data":"bc80182eca8002f5fd41d1508c70437685348de06417543bc6879d6b5b03dae3"} Mar 09 13:15:21 crc kubenswrapper[4723]: I0309 13:15:21.003798 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" Mar 09 13:15:21 crc kubenswrapper[4723]: I0309 13:15:21.005496 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" event={"ID":"881230db-85c7-4159-b1dd-f537ed6baece","Type":"ContainerStarted","Data":"01c3fe4dfe2fc5ad71c06c36fc4c7ee15f4c12f72f5750618c35b20c33f89878"} Mar 09 13:15:21 crc kubenswrapper[4723]: I0309 13:15:21.005624 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 13:15:21 crc kubenswrapper[4723]: I0309 13:15:21.035000 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" podStartSLOduration=1.9118293629999998 podStartE2EDuration="8.034983231s" podCreationTimestamp="2026-03-09 13:15:13 +0000 UTC" firstStartedPulling="2026-03-09 13:15:14.461945848 +0000 UTC m=+988.476413388" lastFinishedPulling="2026-03-09 13:15:20.585099716 +0000 UTC m=+994.599567256" observedRunningTime="2026-03-09 13:15:21.032246578 +0000 UTC m=+995.046714128" 
watchObservedRunningTime="2026-03-09 13:15:21.034983231 +0000 UTC m=+995.049450761" Mar 09 13:15:21 crc kubenswrapper[4723]: I0309 13:15:21.060850 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" podStartSLOduration=1.579998741 podStartE2EDuration="8.060833014s" podCreationTimestamp="2026-03-09 13:15:13 +0000 UTC" firstStartedPulling="2026-03-09 13:15:14.08287937 +0000 UTC m=+988.097346910" lastFinishedPulling="2026-03-09 13:15:20.563713643 +0000 UTC m=+994.578181183" observedRunningTime="2026-03-09 13:15:21.056427036 +0000 UTC m=+995.070894606" watchObservedRunningTime="2026-03-09 13:15:21.060833014 +0000 UTC m=+995.075300554" Mar 09 13:15:22 crc kubenswrapper[4723]: I0309 13:15:22.019980 4723 generic.go:334] "Generic (PLEG): container finished" podID="9933fffa-cfbd-48de-85ca-0864644a9f0f" containerID="2b8808a88a3406654d3f6e3b98c4e63a7e574274a3cafbc3cc6869d81e535459" exitCode=0 Mar 09 13:15:22 crc kubenswrapper[4723]: I0309 13:15:22.020051 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdl8s" event={"ID":"9933fffa-cfbd-48de-85ca-0864644a9f0f","Type":"ContainerDied","Data":"2b8808a88a3406654d3f6e3b98c4e63a7e574274a3cafbc3cc6869d81e535459"} Mar 09 13:15:23 crc kubenswrapper[4723]: I0309 13:15:23.030292 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdl8s" event={"ID":"9933fffa-cfbd-48de-85ca-0864644a9f0f","Type":"ContainerStarted","Data":"8aedd1c9fe7b6e9b86dcbdd9e5d27de026392790c2f68359ed8b8c9f7e8e417b"} Mar 09 13:15:26 crc kubenswrapper[4723]: I0309 13:15:26.629169 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:26 crc kubenswrapper[4723]: I0309 13:15:26.629510 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:26 crc kubenswrapper[4723]: I0309 13:15:26.683468 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:26 crc kubenswrapper[4723]: I0309 13:15:26.700295 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tdl8s" podStartSLOduration=6.924544655 podStartE2EDuration="10.700279781s" podCreationTimestamp="2026-03-09 13:15:16 +0000 UTC" firstStartedPulling="2026-03-09 13:15:18.913521914 +0000 UTC m=+992.927989444" lastFinishedPulling="2026-03-09 13:15:22.68925702 +0000 UTC m=+996.703724570" observedRunningTime="2026-03-09 13:15:23.051533428 +0000 UTC m=+997.066000968" watchObservedRunningTime="2026-03-09 13:15:26.700279781 +0000 UTC m=+1000.714747321" Mar 09 13:15:27 crc kubenswrapper[4723]: I0309 13:15:27.112626 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:29 crc kubenswrapper[4723]: I0309 13:15:29.073987 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tdl8s"] Mar 09 13:15:29 crc kubenswrapper[4723]: I0309 13:15:29.089558 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tdl8s" podUID="9933fffa-cfbd-48de-85ca-0864644a9f0f" containerName="registry-server" 
containerID="cri-o://8aedd1c9fe7b6e9b86dcbdd9e5d27de026392790c2f68359ed8b8c9f7e8e417b" gracePeriod=2 Mar 09 13:15:29 crc kubenswrapper[4723]: I0309 13:15:29.619968 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:29 crc kubenswrapper[4723]: I0309 13:15:29.760091 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9933fffa-cfbd-48de-85ca-0864644a9f0f-utilities\") pod \"9933fffa-cfbd-48de-85ca-0864644a9f0f\" (UID: \"9933fffa-cfbd-48de-85ca-0864644a9f0f\") " Mar 09 13:15:29 crc kubenswrapper[4723]: I0309 13:15:29.760240 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9933fffa-cfbd-48de-85ca-0864644a9f0f-catalog-content\") pod \"9933fffa-cfbd-48de-85ca-0864644a9f0f\" (UID: \"9933fffa-cfbd-48de-85ca-0864644a9f0f\") " Mar 09 13:15:29 crc kubenswrapper[4723]: I0309 13:15:29.760355 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5d54\" (UniqueName: \"kubernetes.io/projected/9933fffa-cfbd-48de-85ca-0864644a9f0f-kube-api-access-h5d54\") pod \"9933fffa-cfbd-48de-85ca-0864644a9f0f\" (UID: \"9933fffa-cfbd-48de-85ca-0864644a9f0f\") " Mar 09 13:15:29 crc kubenswrapper[4723]: I0309 13:15:29.761008 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9933fffa-cfbd-48de-85ca-0864644a9f0f-utilities" (OuterVolumeSpecName: "utilities") pod "9933fffa-cfbd-48de-85ca-0864644a9f0f" (UID: "9933fffa-cfbd-48de-85ca-0864644a9f0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:15:29 crc kubenswrapper[4723]: I0309 13:15:29.778400 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9933fffa-cfbd-48de-85ca-0864644a9f0f-kube-api-access-h5d54" (OuterVolumeSpecName: "kube-api-access-h5d54") pod "9933fffa-cfbd-48de-85ca-0864644a9f0f" (UID: "9933fffa-cfbd-48de-85ca-0864644a9f0f"). InnerVolumeSpecName "kube-api-access-h5d54". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:15:29 crc kubenswrapper[4723]: I0309 13:15:29.850131 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9933fffa-cfbd-48de-85ca-0864644a9f0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9933fffa-cfbd-48de-85ca-0864644a9f0f" (UID: "9933fffa-cfbd-48de-85ca-0864644a9f0f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:15:29 crc kubenswrapper[4723]: I0309 13:15:29.867975 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9933fffa-cfbd-48de-85ca-0864644a9f0f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:15:29 crc kubenswrapper[4723]: I0309 13:15:29.868019 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5d54\" (UniqueName: \"kubernetes.io/projected/9933fffa-cfbd-48de-85ca-0864644a9f0f-kube-api-access-h5d54\") on node \"crc\" DevicePath \"\"" Mar 09 13:15:29 crc kubenswrapper[4723]: I0309 13:15:29.868036 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9933fffa-cfbd-48de-85ca-0864644a9f0f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.098162 4723 generic.go:334] "Generic (PLEG): container finished" podID="9933fffa-cfbd-48de-85ca-0864644a9f0f" containerID="8aedd1c9fe7b6e9b86dcbdd9e5d27de026392790c2f68359ed8b8c9f7e8e417b" exitCode=0 Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.098208 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdl8s" event={"ID":"9933fffa-cfbd-48de-85ca-0864644a9f0f","Type":"ContainerDied","Data":"8aedd1c9fe7b6e9b86dcbdd9e5d27de026392790c2f68359ed8b8c9f7e8e417b"} Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.098234 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdl8s" event={"ID":"9933fffa-cfbd-48de-85ca-0864644a9f0f","Type":"ContainerDied","Data":"d1c3c043ce6dde22b0b65349b6a440c93d53ddcd9ed100c6436b089a900c9f12"} Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.098251 4723 scope.go:117] "RemoveContainer" containerID="8aedd1c9fe7b6e9b86dcbdd9e5d27de026392790c2f68359ed8b8c9f7e8e417b" Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.098309 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tdl8s" Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.118601 4723 scope.go:117] "RemoveContainer" containerID="2b8808a88a3406654d3f6e3b98c4e63a7e574274a3cafbc3cc6869d81e535459" Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.170699 4723 scope.go:117] "RemoveContainer" containerID="d2872d0d2b7c386b1c868bd15466d79498e1d53e3d9cd8d76db3761afc7e7fb8" Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.176299 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tdl8s"] Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.184997 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tdl8s"] Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.191522 4723 scope.go:117] "RemoveContainer" containerID="8aedd1c9fe7b6e9b86dcbdd9e5d27de026392790c2f68359ed8b8c9f7e8e417b" Mar 09 13:15:30 crc kubenswrapper[4723]: E0309 13:15:30.192099 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aedd1c9fe7b6e9b86dcbdd9e5d27de026392790c2f68359ed8b8c9f7e8e417b\": container with ID starting with 8aedd1c9fe7b6e9b86dcbdd9e5d27de026392790c2f68359ed8b8c9f7e8e417b not found: ID does not exist" containerID="8aedd1c9fe7b6e9b86dcbdd9e5d27de026392790c2f68359ed8b8c9f7e8e417b" Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.192141 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aedd1c9fe7b6e9b86dcbdd9e5d27de026392790c2f68359ed8b8c9f7e8e417b"} err="failed to get container status \"8aedd1c9fe7b6e9b86dcbdd9e5d27de026392790c2f68359ed8b8c9f7e8e417b\": rpc error: code = NotFound desc = could not find container \"8aedd1c9fe7b6e9b86dcbdd9e5d27de026392790c2f68359ed8b8c9f7e8e417b\": container with ID starting with 8aedd1c9fe7b6e9b86dcbdd9e5d27de026392790c2f68359ed8b8c9f7e8e417b not found: ID does not exist" Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.192167 4723 scope.go:117] "RemoveContainer" containerID="2b8808a88a3406654d3f6e3b98c4e63a7e574274a3cafbc3cc6869d81e535459" Mar 09 13:15:30 crc kubenswrapper[4723]: E0309 13:15:30.192515 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b8808a88a3406654d3f6e3b98c4e63a7e574274a3cafbc3cc6869d81e535459\": container with ID starting with 2b8808a88a3406654d3f6e3b98c4e63a7e574274a3cafbc3cc6869d81e535459 not found: ID does not exist" containerID="2b8808a88a3406654d3f6e3b98c4e63a7e574274a3cafbc3cc6869d81e535459" Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.192552 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b8808a88a3406654d3f6e3b98c4e63a7e574274a3cafbc3cc6869d81e535459"} err="failed to get container status \"2b8808a88a3406654d3f6e3b98c4e63a7e574274a3cafbc3cc6869d81e535459\": rpc error: code = NotFound desc = could not find container \"2b8808a88a3406654d3f6e3b98c4e63a7e574274a3cafbc3cc6869d81e535459\": container with ID starting with 2b8808a88a3406654d3f6e3b98c4e63a7e574274a3cafbc3cc6869d81e535459 not found: ID does not exist" Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.192571 4723 scope.go:117] "RemoveContainer" containerID="d2872d0d2b7c386b1c868bd15466d79498e1d53e3d9cd8d76db3761afc7e7fb8" Mar 09 13:15:30 crc kubenswrapper[4723]: E0309 13:15:30.192885 4723 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d2872d0d2b7c386b1c868bd15466d79498e1d53e3d9cd8d76db3761afc7e7fb8\": container with ID starting with d2872d0d2b7c386b1c868bd15466d79498e1d53e3d9cd8d76db3761afc7e7fb8 not found: ID does not exist" containerID="d2872d0d2b7c386b1c868bd15466d79498e1d53e3d9cd8d76db3761afc7e7fb8" Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.192926 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2872d0d2b7c386b1c868bd15466d79498e1d53e3d9cd8d76db3761afc7e7fb8"} err="failed to get container status \"d2872d0d2b7c386b1c868bd15466d79498e1d53e3d9cd8d76db3761afc7e7fb8\": rpc error: code = NotFound desc = could not find container \"d2872d0d2b7c386b1c868bd15466d79498e1d53e3d9cd8d76db3761afc7e7fb8\": container with ID starting with d2872d0d2b7c386b1c868bd15466d79498e1d53e3d9cd8d76db3761afc7e7fb8 not found: ID does not exist" Mar 09 13:15:30 crc kubenswrapper[4723]: I0309 13:15:30.895001 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9933fffa-cfbd-48de-85ca-0864644a9f0f" path="/var/lib/kubelet/pods/9933fffa-cfbd-48de-85ca-0864644a9f0f/volumes" Mar 09 13:15:34 crc kubenswrapper[4723]: I0309 13:15:34.037400 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 13:15:53 crc kubenswrapper[4723]: I0309 13:15:53.570798 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.338568 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4"] Mar 09 13:15:54 crc kubenswrapper[4723]: E0309 13:15:54.342982 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9933fffa-cfbd-48de-85ca-0864644a9f0f" containerName="extract-utilities" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.343013 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9933fffa-cfbd-48de-85ca-0864644a9f0f" containerName="extract-utilities" Mar 09 13:15:54 crc kubenswrapper[4723]: E0309 13:15:54.343056 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9933fffa-cfbd-48de-85ca-0864644a9f0f" containerName="registry-server" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.343067 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9933fffa-cfbd-48de-85ca-0864644a9f0f" containerName="registry-server" Mar 09 13:15:54 crc kubenswrapper[4723]: E0309 13:15:54.343086 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9933fffa-cfbd-48de-85ca-0864644a9f0f" containerName="extract-content" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.343094 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9933fffa-cfbd-48de-85ca-0864644a9f0f" containerName="extract-content" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.343378 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="9933fffa-cfbd-48de-85ca-0864644a9f0f" containerName="registry-server" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.344282 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.347926 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4"] Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.348632 4723 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-2nr7b" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.348900 4723 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.371628 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-jqk9s"] Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.377931 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.383449 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.383599 4723 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.441343 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zpfhd"] Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.442711 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zpfhd" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.444049 4723 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-pl4f7" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.444211 4723 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.445050 4723 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.445111 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.452944 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-7hbxv"] Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.454120 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-7hbxv" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.456364 4723 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.469991 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-metrics-certs\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470051 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-frr-startup\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470070 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fab2f82-df4b-417b-8188-0c4f455df30c-cert\") pod \"frr-k8s-webhook-server-7f989f654f-t8zw4\" (UID: \"3fab2f82-df4b-417b-8188-0c4f455df30c\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470087 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ps6s\" (UniqueName: \"kubernetes.io/projected/3fab2f82-df4b-417b-8188-0c4f455df30c-kube-api-access-8ps6s\") pod \"frr-k8s-webhook-server-7f989f654f-t8zw4\" (UID: \"3fab2f82-df4b-417b-8188-0c4f455df30c\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470118 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-reloader\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470134 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/351b987c-4b9a-4bf6-8832-a0504c9c16ed-metrics-certs\") pod \"controller-86ddb6bd46-7hbxv\" (UID: \"351b987c-4b9a-4bf6-8832-a0504c9c16ed\") " pod="metallb-system/controller-86ddb6bd46-7hbxv" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470150 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xchxt\" (UniqueName: \"kubernetes.io/projected/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-kube-api-access-xchxt\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470168 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-metrics-certs\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470190 4723 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48cw8\" (UniqueName: \"kubernetes.io/projected/351b987c-4b9a-4bf6-8832-a0504c9c16ed-kube-api-access-48cw8\") pod \"controller-86ddb6bd46-7hbxv\" (UID: \"351b987c-4b9a-4bf6-8832-a0504c9c16ed\") " pod="metallb-system/controller-86ddb6bd46-7hbxv" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470221 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfkhf\" (UniqueName: \"kubernetes.io/projected/b8287b9e-89fc-417d-b98b-564e6acdbb25-kube-api-access-zfkhf\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470236 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-metrics\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470286 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/351b987c-4b9a-4bf6-8832-a0504c9c16ed-cert\") pod \"controller-86ddb6bd46-7hbxv\" (UID: \"351b987c-4b9a-4bf6-8832-a0504c9c16ed\") " pod="metallb-system/controller-86ddb6bd46-7hbxv" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470309 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-memberlist\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470342 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b8287b9e-89fc-417d-b98b-564e6acdbb25-metallb-excludel2\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470359 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-frr-conf\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.470385 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-frr-sockets\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.480435 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-7hbxv"] Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571450 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-frr-sockets\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc 
kubenswrapper[4723]: I0309 13:15:54.571524 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-metrics-certs\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571558 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-frr-startup\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571579 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fab2f82-df4b-417b-8188-0c4f455df30c-cert\") pod \"frr-k8s-webhook-server-7f989f654f-t8zw4\" (UID: \"3fab2f82-df4b-417b-8188-0c4f455df30c\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571600 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ps6s\" (UniqueName: \"kubernetes.io/projected/3fab2f82-df4b-417b-8188-0c4f455df30c-kube-api-access-8ps6s\") pod \"frr-k8s-webhook-server-7f989f654f-t8zw4\" (UID: \"3fab2f82-df4b-417b-8188-0c4f455df30c\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571630 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-reloader\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571647 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/351b987c-4b9a-4bf6-8832-a0504c9c16ed-metrics-certs\") pod \"controller-86ddb6bd46-7hbxv\" (UID: \"351b987c-4b9a-4bf6-8832-a0504c9c16ed\") " pod="metallb-system/controller-86ddb6bd46-7hbxv" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571666 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xchxt\" (UniqueName: \"kubernetes.io/projected/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-kube-api-access-xchxt\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571686 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-metrics-certs\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571709 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48cw8\" (UniqueName: \"kubernetes.io/projected/351b987c-4b9a-4bf6-8832-a0504c9c16ed-kube-api-access-48cw8\") pod \"controller-86ddb6bd46-7hbxv\" (UID: \"351b987c-4b9a-4bf6-8832-a0504c9c16ed\") " pod="metallb-system/controller-86ddb6bd46-7hbxv" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571740 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zfkhf\" (UniqueName: \"kubernetes.io/projected/b8287b9e-89fc-417d-b98b-564e6acdbb25-kube-api-access-zfkhf\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571759 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-metrics\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571783 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/351b987c-4b9a-4bf6-8832-a0504c9c16ed-cert\") pod \"controller-86ddb6bd46-7hbxv\" (UID: \"351b987c-4b9a-4bf6-8832-a0504c9c16ed\") " pod="metallb-system/controller-86ddb6bd46-7hbxv" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571806 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-memberlist\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571829 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b8287b9e-89fc-417d-b98b-564e6acdbb25-metallb-excludel2\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571847 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-frr-conf\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: E0309 13:15:54.571871 4723 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 09 13:15:54 crc kubenswrapper[4723]: E0309 13:15:54.571941 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/351b987c-4b9a-4bf6-8832-a0504c9c16ed-metrics-certs podName:351b987c-4b9a-4bf6-8832-a0504c9c16ed nodeName:}" failed. No retries permitted until 2026-03-09 13:15:55.071920628 +0000 UTC m=+1029.086388268 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/351b987c-4b9a-4bf6-8832-a0504c9c16ed-metrics-certs") pod "controller-86ddb6bd46-7hbxv" (UID: "351b987c-4b9a-4bf6-8832-a0504c9c16ed") : secret "controller-certs-secret" not found Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.571999 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-frr-sockets\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: E0309 13:15:54.572035 4723 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 09 13:15:54 crc kubenswrapper[4723]: E0309 13:15:54.572090 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-metrics-certs podName:b8287b9e-89fc-417d-b98b-564e6acdbb25 nodeName:}" failed. No retries permitted until 2026-03-09 13:15:55.072074122 +0000 UTC m=+1029.086541662 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-metrics-certs") pod "speaker-zpfhd" (UID: "b8287b9e-89fc-417d-b98b-564e6acdbb25") : secret "speaker-certs-secret" not found Mar 09 13:15:54 crc kubenswrapper[4723]: E0309 13:15:54.572099 4723 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 09 13:15:54 crc kubenswrapper[4723]: E0309 13:15:54.572131 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-memberlist podName:b8287b9e-89fc-417d-b98b-564e6acdbb25 nodeName:}" failed. No retries permitted until 2026-03-09 13:15:55.072121873 +0000 UTC m=+1029.086589533 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-memberlist") pod "speaker-zpfhd" (UID: "b8287b9e-89fc-417d-b98b-564e6acdbb25") : secret "metallb-memberlist" not found Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.572497 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-frr-startup\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.572723 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b8287b9e-89fc-417d-b98b-564e6acdbb25-metallb-excludel2\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.573528 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-frr-conf\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.574101 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-metrics\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.575756 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-reloader\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.580295 4723 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.580565 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-metrics-certs\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.580733 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fab2f82-df4b-417b-8188-0c4f455df30c-cert\") pod \"frr-k8s-webhook-server-7f989f654f-t8zw4\" (UID: \"3fab2f82-df4b-417b-8188-0c4f455df30c\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.588367 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/351b987c-4b9a-4bf6-8832-a0504c9c16ed-cert\") pod \"controller-86ddb6bd46-7hbxv\" (UID: \"351b987c-4b9a-4bf6-8832-a0504c9c16ed\") " pod="metallb-system/controller-86ddb6bd46-7hbxv" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.589789 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48cw8\" (UniqueName: \"kubernetes.io/projected/351b987c-4b9a-4bf6-8832-a0504c9c16ed-kube-api-access-48cw8\") pod \"controller-86ddb6bd46-7hbxv\" 
(UID: \"351b987c-4b9a-4bf6-8832-a0504c9c16ed\") " pod="metallb-system/controller-86ddb6bd46-7hbxv" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.593633 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ps6s\" (UniqueName: \"kubernetes.io/projected/3fab2f82-df4b-417b-8188-0c4f455df30c-kube-api-access-8ps6s\") pod \"frr-k8s-webhook-server-7f989f654f-t8zw4\" (UID: \"3fab2f82-df4b-417b-8188-0c4f455df30c\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.609296 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfkhf\" (UniqueName: \"kubernetes.io/projected/b8287b9e-89fc-417d-b98b-564e6acdbb25-kube-api-access-zfkhf\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.616479 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xchxt\" (UniqueName: \"kubernetes.io/projected/54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4-kube-api-access-xchxt\") pod \"frr-k8s-jqk9s\" (UID: \"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4\") " pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.675059 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" Mar 09 13:15:54 crc kubenswrapper[4723]: I0309 13:15:54.694379 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:15:55 crc kubenswrapper[4723]: E0309 13:15:55.079966 4723 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 09 13:15:55 crc kubenswrapper[4723]: I0309 13:15:55.080600 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-memberlist\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:55 crc kubenswrapper[4723]: E0309 13:15:55.080662 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-memberlist podName:b8287b9e-89fc-417d-b98b-564e6acdbb25 nodeName:}" failed. No retries permitted until 2026-03-09 13:15:56.08064563 +0000 UTC m=+1030.095113170 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-memberlist") pod "speaker-zpfhd" (UID: "b8287b9e-89fc-417d-b98b-564e6acdbb25") : secret "metallb-memberlist" not found Mar 09 13:15:55 crc kubenswrapper[4723]: I0309 13:15:55.082231 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/351b987c-4b9a-4bf6-8832-a0504c9c16ed-metrics-certs\") pod \"controller-86ddb6bd46-7hbxv\" (UID: \"351b987c-4b9a-4bf6-8832-a0504c9c16ed\") " pod="metallb-system/controller-86ddb6bd46-7hbxv" Mar 09 13:15:55 crc kubenswrapper[4723]: I0309 13:15:55.082998 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-metrics-certs\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:55 crc kubenswrapper[4723]: I0309 13:15:55.091764 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/351b987c-4b9a-4bf6-8832-a0504c9c16ed-metrics-certs\") pod \"controller-86ddb6bd46-7hbxv\" (UID: \"351b987c-4b9a-4bf6-8832-a0504c9c16ed\") " pod="metallb-system/controller-86ddb6bd46-7hbxv" Mar 09 13:15:55 crc kubenswrapper[4723]: I0309 13:15:55.096379 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-metrics-certs\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:55 crc kubenswrapper[4723]: I0309 13:15:55.141658 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4"] Mar 09 13:15:55 crc kubenswrapper[4723]: I0309 13:15:55.306753 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jqk9s" event={"ID":"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4","Type":"ContainerStarted","Data":"3075a0e847251254567e947f6353c8831d070c3aca00a7c68b96f7a4a04344a1"} Mar 09 13:15:55 crc kubenswrapper[4723]: I0309 13:15:55.308065 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" event={"ID":"3fab2f82-df4b-417b-8188-0c4f455df30c","Type":"ContainerStarted","Data":"343fe5ec1340c4c55434581db287a474638f94cff45da69207dd9cda89d0038e"} Mar 09 13:15:55 crc kubenswrapper[4723]: I0309 13:15:55.371368 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-7hbxv" Mar 09 13:15:55 crc kubenswrapper[4723]: W0309 13:15:55.788011 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod351b987c_4b9a_4bf6_8832_a0504c9c16ed.slice/crio-308d9ee78579cf2506db7793c25d42635d24db19c23a753e937b44ae595bf13a WatchSource:0}: Error finding container 308d9ee78579cf2506db7793c25d42635d24db19c23a753e937b44ae595bf13a: Status 404 returned error can't find the container with id 308d9ee78579cf2506db7793c25d42635d24db19c23a753e937b44ae595bf13a Mar 09 13:15:55 crc kubenswrapper[4723]: I0309 13:15:55.788724 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-7hbxv"] Mar 09 13:15:56 crc kubenswrapper[4723]: I0309 13:15:56.101098 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-memberlist\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:56 crc kubenswrapper[4723]: I0309 13:15:56.107001 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b8287b9e-89fc-417d-b98b-564e6acdbb25-memberlist\") pod \"speaker-zpfhd\" (UID: \"b8287b9e-89fc-417d-b98b-564e6acdbb25\") " pod="metallb-system/speaker-zpfhd" Mar 09 13:15:56 crc kubenswrapper[4723]: I0309 13:15:56.261408 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zpfhd" Mar 09 13:15:56 crc kubenswrapper[4723]: W0309 13:15:56.286420 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8287b9e_89fc_417d_b98b_564e6acdbb25.slice/crio-5c5dca20ebd72c69ba602ec4a5b6762e4b38ca29c76372a2a83d32998e303c6a WatchSource:0}: Error finding container 5c5dca20ebd72c69ba602ec4a5b6762e4b38ca29c76372a2a83d32998e303c6a: Status 404 returned error can't find the container with id 5c5dca20ebd72c69ba602ec4a5b6762e4b38ca29c76372a2a83d32998e303c6a Mar 09 13:15:56 crc kubenswrapper[4723]: I0309 13:15:56.318374 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zpfhd" event={"ID":"b8287b9e-89fc-417d-b98b-564e6acdbb25","Type":"ContainerStarted","Data":"5c5dca20ebd72c69ba602ec4a5b6762e4b38ca29c76372a2a83d32998e303c6a"} Mar 09 13:15:56 crc kubenswrapper[4723]: I0309 13:15:56.320634 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-7hbxv" event={"ID":"351b987c-4b9a-4bf6-8832-a0504c9c16ed","Type":"ContainerStarted","Data":"0a1c30dd15e67d7a571ddd5832bb61e39e6915e50ed82ab52c3d55ea6bd0b970"} Mar 09 13:15:56 crc kubenswrapper[4723]: I0309 13:15:56.320681 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-7hbxv" event={"ID":"351b987c-4b9a-4bf6-8832-a0504c9c16ed","Type":"ContainerStarted","Data":"8022f99124f6820a426b2b3e15682012c09e05cfed12893ee15bb1e7bad3cfbb"} Mar 09 13:15:56 crc kubenswrapper[4723]: I0309 13:15:56.320695 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-7hbxv" event={"ID":"351b987c-4b9a-4bf6-8832-a0504c9c16ed","Type":"ContainerStarted","Data":"308d9ee78579cf2506db7793c25d42635d24db19c23a753e937b44ae595bf13a"} Mar 09 13:15:56 crc kubenswrapper[4723]: I0309 13:15:56.321800 4723 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-7hbxv" Mar 09 13:15:56 crc kubenswrapper[4723]: I0309 13:15:56.905095 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-7hbxv" podStartSLOduration=2.905078718 podStartE2EDuration="2.905078718s" podCreationTimestamp="2026-03-09 13:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:15:56.347229519 +0000 UTC m=+1030.361697059" watchObservedRunningTime="2026-03-09 13:15:56.905078718 +0000 UTC m=+1030.919546258" Mar 09 13:15:57 crc kubenswrapper[4723]: I0309 13:15:57.339493 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zpfhd" event={"ID":"b8287b9e-89fc-417d-b98b-564e6acdbb25","Type":"ContainerStarted","Data":"3bf27eb2e9158612c8b416345554cbe82f0766184c73384bf2621994db567f17"} Mar 09 13:15:57 crc kubenswrapper[4723]: I0309 13:15:57.339824 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zpfhd" event={"ID":"b8287b9e-89fc-417d-b98b-564e6acdbb25","Type":"ContainerStarted","Data":"81bc867c5003ff9ae5d5e01240bc5fcbd0cb3b542af72a386792c4714a592f3e"} Mar 09 13:15:57 crc kubenswrapper[4723]: I0309 13:15:57.339840 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zpfhd" Mar 09 13:15:57 crc kubenswrapper[4723]: I0309 13:15:57.370329 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zpfhd" podStartSLOduration=3.370312745 podStartE2EDuration="3.370312745s" podCreationTimestamp="2026-03-09 13:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:15:57.363248215 +0000 UTC m=+1031.377715755" watchObservedRunningTime="2026-03-09 13:15:57.370312745 +0000 UTC m=+1031.384780285" Mar 09 13:16:00 crc kubenswrapper[4723]: I0309 13:16:00.121630 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551036-5g57t"] Mar 09 13:16:00 crc kubenswrapper[4723]: I0309 13:16:00.123355 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551036-5g57t" Mar 09 13:16:00 crc kubenswrapper[4723]: I0309 13:16:00.129698 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:16:00 crc kubenswrapper[4723]: I0309 13:16:00.129992 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:16:00 crc kubenswrapper[4723]: I0309 13:16:00.130054 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:16:00 crc kubenswrapper[4723]: I0309 13:16:00.132600 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551036-5g57t"] Mar 09 13:16:00 crc kubenswrapper[4723]: I0309 13:16:00.267997 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n5q6\" (UniqueName: \"kubernetes.io/projected/5ad9c8b7-3845-41fa-b73a-88cb02635900-kube-api-access-6n5q6\") pod \"auto-csr-approver-29551036-5g57t\" (UID: \"5ad9c8b7-3845-41fa-b73a-88cb02635900\") " pod="openshift-infra/auto-csr-approver-29551036-5g57t" Mar 09 13:16:00 crc kubenswrapper[4723]: I0309 13:16:00.370116 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n5q6\" (UniqueName: \"kubernetes.io/projected/5ad9c8b7-3845-41fa-b73a-88cb02635900-kube-api-access-6n5q6\") pod \"auto-csr-approver-29551036-5g57t\" (UID: \"5ad9c8b7-3845-41fa-b73a-88cb02635900\") " pod="openshift-infra/auto-csr-approver-29551036-5g57t" Mar 09 13:16:00 crc kubenswrapper[4723]: I0309 13:16:00.395519 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n5q6\" (UniqueName: \"kubernetes.io/projected/5ad9c8b7-3845-41fa-b73a-88cb02635900-kube-api-access-6n5q6\") pod \"auto-csr-approver-29551036-5g57t\" (UID: \"5ad9c8b7-3845-41fa-b73a-88cb02635900\") " pod="openshift-infra/auto-csr-approver-29551036-5g57t" Mar 09 13:16:00 crc kubenswrapper[4723]: I0309 13:16:00.579714 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551036-5g57t" Mar 09 13:16:02 crc kubenswrapper[4723]: I0309 13:16:02.908361 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551036-5g57t"] Mar 09 13:16:03 crc kubenswrapper[4723]: I0309 13:16:03.391400 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551036-5g57t" event={"ID":"5ad9c8b7-3845-41fa-b73a-88cb02635900","Type":"ContainerStarted","Data":"0dd45729f0da3fedd691b22be386a9c6918de81d6e83c2ea33a75320714f5ddc"} Mar 09 13:16:03 crc kubenswrapper[4723]: I0309 13:16:03.393086 4723 generic.go:334] "Generic (PLEG): container finished" podID="54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4" containerID="893f2b6038719d3eb81b3bf525b4e0dc2effc759844297699f6c8b5e8ab27444" exitCode=0 Mar 09 13:16:03 crc kubenswrapper[4723]: I0309 13:16:03.393209 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jqk9s" event={"ID":"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4","Type":"ContainerDied","Data":"893f2b6038719d3eb81b3bf525b4e0dc2effc759844297699f6c8b5e8ab27444"} Mar 09 13:16:03 crc kubenswrapper[4723]: I0309 13:16:03.397476 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" event={"ID":"3fab2f82-df4b-417b-8188-0c4f455df30c","Type":"ContainerStarted","Data":"cc65e84b691dc6553b6b78c6ec1952368e56f52dba6cf69da82f1a007251c121"} Mar 09 13:16:03 crc kubenswrapper[4723]: I0309 13:16:03.398126 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" Mar 09 13:16:03 crc kubenswrapper[4723]: I0309 13:16:03.441290 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" podStartSLOduration=1.980281371 podStartE2EDuration="9.441268191s" podCreationTimestamp="2026-03-09 13:15:54 +0000 UTC" firstStartedPulling="2026-03-09 13:15:55.150139752 +0000 UTC m=+1029.164607292" lastFinishedPulling="2026-03-09 13:16:02.611126572 +0000 UTC m=+1036.625594112" observedRunningTime="2026-03-09 13:16:03.430116242 +0000 UTC m=+1037.444583792" watchObservedRunningTime="2026-03-09 13:16:03.441268191 +0000 UTC m=+1037.455735731" Mar 09 13:16:04 crc kubenswrapper[4723]: I0309 13:16:04.406084 4723 generic.go:334] "Generic (PLEG): container finished" podID="54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4" containerID="4e492a89eff5cc8337591756c5aee595b6789aa89a92de27d99e0080dc4d1d1e" exitCode=0 Mar 09 13:16:04 crc kubenswrapper[4723]: I0309 13:16:04.406190 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jqk9s" event={"ID":"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4","Type":"ContainerDied","Data":"4e492a89eff5cc8337591756c5aee595b6789aa89a92de27d99e0080dc4d1d1e"} Mar 09 13:16:05 crc kubenswrapper[4723]: I0309 13:16:05.376953 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-7hbxv" Mar 09 13:16:05 crc kubenswrapper[4723]: I0309 13:16:05.418824 4723 generic.go:334] "Generic (PLEG): container finished" podID="5ad9c8b7-3845-41fa-b73a-88cb02635900" containerID="b70a1ab9ed74a98e75c02efbf4800c8b4082bf380fe425243c91da6c6eee89da" exitCode=0 Mar 09 13:16:05 crc kubenswrapper[4723]: I0309 13:16:05.418939 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551036-5g57t" 
event={"ID":"5ad9c8b7-3845-41fa-b73a-88cb02635900","Type":"ContainerDied","Data":"b70a1ab9ed74a98e75c02efbf4800c8b4082bf380fe425243c91da6c6eee89da"} Mar 09 13:16:05 crc kubenswrapper[4723]: I0309 13:16:05.422254 4723 generic.go:334] "Generic (PLEG): container finished" podID="54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4" containerID="bbe74166f441ea7f70d318f3332f052a859efacdca04f8fb8b640d0001d6cf8a" exitCode=0 Mar 09 13:16:05 crc kubenswrapper[4723]: I0309 13:16:05.422302 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jqk9s" event={"ID":"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4","Type":"ContainerDied","Data":"bbe74166f441ea7f70d318f3332f052a859efacdca04f8fb8b640d0001d6cf8a"} Mar 09 13:16:06 crc kubenswrapper[4723]: I0309 13:16:06.274583 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zpfhd" Mar 09 13:16:06 crc kubenswrapper[4723]: I0309 13:16:06.449140 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jqk9s" event={"ID":"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4","Type":"ContainerStarted","Data":"8b064a5eec5bf9199d5dd13178351a988cc99ca7abe502a9f11fc7b78cb8ff28"} Mar 09 13:16:06 crc kubenswrapper[4723]: I0309 13:16:06.449182 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jqk9s" event={"ID":"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4","Type":"ContainerStarted","Data":"1a8a59cef9626a5f1c57bc07ad7d1a76a66b3e7d7136213818a81e7dabbb8f7f"} Mar 09 13:16:06 crc kubenswrapper[4723]: I0309 13:16:06.449192 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jqk9s" event={"ID":"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4","Type":"ContainerStarted","Data":"5374ff2a69f435dced40628abe0ea8609bed315443e95f6d339c2ca83967e710"} Mar 09 13:16:06 crc kubenswrapper[4723]: I0309 13:16:06.449200 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jqk9s" event={"ID":"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4","Type":"ContainerStarted","Data":"29d7f0fa3f618fb5a43eb2a5b58d000b3dc342b14bb68f039188a5ea0fa32361"} Mar 09 13:16:06 crc kubenswrapper[4723]: I0309 13:16:06.449208 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jqk9s" event={"ID":"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4","Type":"ContainerStarted","Data":"18f8e1e97096657035bd9655990fc06601cdaa36300710eaba04c463639b7777"} Mar 09 13:16:06 crc kubenswrapper[4723]: I0309 13:16:06.782430 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551036-5g57t" Mar 09 13:16:06 crc kubenswrapper[4723]: I0309 13:16:06.886589 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n5q6\" (UniqueName: \"kubernetes.io/projected/5ad9c8b7-3845-41fa-b73a-88cb02635900-kube-api-access-6n5q6\") pod \"5ad9c8b7-3845-41fa-b73a-88cb02635900\" (UID: \"5ad9c8b7-3845-41fa-b73a-88cb02635900\") " Mar 09 13:16:06 crc kubenswrapper[4723]: I0309 13:16:06.894021 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ad9c8b7-3845-41fa-b73a-88cb02635900-kube-api-access-6n5q6" (OuterVolumeSpecName: "kube-api-access-6n5q6") pod "5ad9c8b7-3845-41fa-b73a-88cb02635900" (UID: "5ad9c8b7-3845-41fa-b73a-88cb02635900"). InnerVolumeSpecName "kube-api-access-6n5q6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:16:06 crc kubenswrapper[4723]: I0309 13:16:06.988465 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n5q6\" (UniqueName: \"kubernetes.io/projected/5ad9c8b7-3845-41fa-b73a-88cb02635900-kube-api-access-6n5q6\") on node \"crc\" DevicePath \"\"" Mar 09 13:16:07 crc kubenswrapper[4723]: I0309 13:16:07.459996 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551036-5g57t" Mar 09 13:16:07 crc kubenswrapper[4723]: I0309 13:16:07.460159 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551036-5g57t" event={"ID":"5ad9c8b7-3845-41fa-b73a-88cb02635900","Type":"ContainerDied","Data":"0dd45729f0da3fedd691b22be386a9c6918de81d6e83c2ea33a75320714f5ddc"} Mar 09 13:16:07 crc kubenswrapper[4723]: I0309 13:16:07.460613 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dd45729f0da3fedd691b22be386a9c6918de81d6e83c2ea33a75320714f5ddc" Mar 09 13:16:07 crc kubenswrapper[4723]: I0309 13:16:07.463333 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jqk9s" event={"ID":"54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4","Type":"ContainerStarted","Data":"4ba5a8632905a3b6596d1601556cb1b83f831c01baa27d720374ed2f6764b8d3"} Mar 09 13:16:07 crc kubenswrapper[4723]: I0309 13:16:07.464091 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:16:07 crc kubenswrapper[4723]: I0309 13:16:07.501982 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-jqk9s" podStartSLOduration=5.753887928 podStartE2EDuration="13.501963231s" podCreationTimestamp="2026-03-09 13:15:54 +0000 UTC" firstStartedPulling="2026-03-09 13:15:54.84517082 +0000 UTC m=+1028.859638380" lastFinishedPulling="2026-03-09 13:16:02.593246143 +0000 UTC m=+1036.607713683" observedRunningTime="2026-03-09 13:16:07.49520629 +0000 UTC m=+1041.509673830" watchObservedRunningTime="2026-03-09 13:16:07.501963231 +0000 UTC m=+1041.516430761" Mar 09 13:16:07 crc kubenswrapper[4723]: I0309 13:16:07.840703 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551030-lchm5"] Mar 09 13:16:07 crc kubenswrapper[4723]: I0309 13:16:07.846038 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551030-lchm5"] Mar 09 13:16:08 crc kubenswrapper[4723]: I0309 13:16:08.892331 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="619d3345-763a-4a5b-b038-1f4443d8b0c8" path="/var/lib/kubelet/pods/619d3345-763a-4a5b-b038-1f4443d8b0c8/volumes" Mar 09 13:16:09 crc kubenswrapper[4723]: I0309 13:16:09.694758 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:16:09 crc kubenswrapper[4723]: I0309 13:16:09.751396 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:16:12 crc kubenswrapper[4723]: I0309 13:16:12.874211 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-r8zdr"] Mar 09 13:16:12 crc kubenswrapper[4723]: E0309 13:16:12.874555 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ad9c8b7-3845-41fa-b73a-88cb02635900" containerName="oc" Mar 09 13:16:12 crc kubenswrapper[4723]: I0309 13:16:12.874568 4723 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5ad9c8b7-3845-41fa-b73a-88cb02635900" containerName="oc" Mar 09 13:16:12 crc kubenswrapper[4723]: I0309 13:16:12.874790 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ad9c8b7-3845-41fa-b73a-88cb02635900" containerName="oc" Mar 09 13:16:12 crc kubenswrapper[4723]: I0309 13:16:12.875773 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-r8zdr" Mar 09 13:16:12 crc kubenswrapper[4723]: I0309 13:16:12.878717 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 09 13:16:12 crc kubenswrapper[4723]: I0309 13:16:12.878723 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-tpj2h" Mar 09 13:16:12 crc kubenswrapper[4723]: I0309 13:16:12.880084 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 09 13:16:12 crc kubenswrapper[4723]: I0309 13:16:12.889099 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-r8zdr"] Mar 09 13:16:13 crc kubenswrapper[4723]: I0309 13:16:13.010981 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c956k\" (UniqueName: \"kubernetes.io/projected/28e5bb9f-e8f1-49eb-9adc-c36859e5b03f-kube-api-access-c956k\") pod \"openstack-operator-index-r8zdr\" (UID: \"28e5bb9f-e8f1-49eb-9adc-c36859e5b03f\") " pod="openstack-operators/openstack-operator-index-r8zdr" Mar 09 13:16:13 crc kubenswrapper[4723]: I0309 13:16:13.113495 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c956k\" (UniqueName: \"kubernetes.io/projected/28e5bb9f-e8f1-49eb-9adc-c36859e5b03f-kube-api-access-c956k\") pod \"openstack-operator-index-r8zdr\" (UID: \"28e5bb9f-e8f1-49eb-9adc-c36859e5b03f\") " pod="openstack-operators/openstack-operator-index-r8zdr" Mar 09 13:16:13 crc kubenswrapper[4723]: I0309 13:16:13.140498 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c956k\" (UniqueName: \"kubernetes.io/projected/28e5bb9f-e8f1-49eb-9adc-c36859e5b03f-kube-api-access-c956k\") pod \"openstack-operator-index-r8zdr\" (UID: \"28e5bb9f-e8f1-49eb-9adc-c36859e5b03f\") " pod="openstack-operators/openstack-operator-index-r8zdr" Mar 09 13:16:13 crc kubenswrapper[4723]: I0309 13:16:13.200851 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-r8zdr" Mar 09 13:16:13 crc kubenswrapper[4723]: W0309 13:16:13.702694 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28e5bb9f_e8f1_49eb_9adc_c36859e5b03f.slice/crio-3bab23f6cf77ceb1ad4fc9b071031147ef504122a9448f2584104da8a2a6ed09 WatchSource:0}: Error finding container 3bab23f6cf77ceb1ad4fc9b071031147ef504122a9448f2584104da8a2a6ed09: Status 404 returned error can't find the container with id 3bab23f6cf77ceb1ad4fc9b071031147ef504122a9448f2584104da8a2a6ed09 Mar 09 13:16:13 crc kubenswrapper[4723]: I0309 13:16:13.705874 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-r8zdr"] Mar 09 13:16:14 crc kubenswrapper[4723]: I0309 13:16:14.524626 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r8zdr" event={"ID":"28e5bb9f-e8f1-49eb-9adc-c36859e5b03f","Type":"ContainerStarted","Data":"3bab23f6cf77ceb1ad4fc9b071031147ef504122a9448f2584104da8a2a6ed09"} Mar 09 13:16:14 crc kubenswrapper[4723]: I0309 13:16:14.681782 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" Mar 09 13:16:17 crc kubenswrapper[4723]: I0309 13:16:17.548544 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r8zdr" event={"ID":"28e5bb9f-e8f1-49eb-9adc-c36859e5b03f","Type":"ContainerStarted","Data":"0a98d6896f37901e433e4c9b9e7d981cc5996e451799b17d4d5d9e0827fbc686"} Mar 09 13:16:17 crc kubenswrapper[4723]: I0309 13:16:17.569813 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-r8zdr" podStartSLOduration=2.8574561899999997 podStartE2EDuration="5.569791754s" podCreationTimestamp="2026-03-09 13:16:12 +0000 UTC" firstStartedPulling="2026-03-09 13:16:13.705229076 +0000 UTC m=+1047.719696616" lastFinishedPulling="2026-03-09 13:16:16.41756464 +0000 UTC m=+1050.432032180" observedRunningTime="2026-03-09 13:16:17.563938678 +0000 UTC m=+1051.578406208" watchObservedRunningTime="2026-03-09 13:16:17.569791754 +0000 UTC m=+1051.584259294" Mar 09 13:16:18 crc kubenswrapper[4723]: I0309 13:16:18.064041 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-r8zdr"] Mar 09 13:16:18 crc kubenswrapper[4723]: I0309 13:16:18.677053 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9kl8t"] Mar 09 13:16:18 crc kubenswrapper[4723]: I0309 13:16:18.678640 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9kl8t" Mar 09 13:16:18 crc kubenswrapper[4723]: I0309 13:16:18.703812 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9kl8t"] Mar 09 13:16:18 crc kubenswrapper[4723]: I0309 13:16:18.849358 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jflwx\" (UniqueName: \"kubernetes.io/projected/194a48e1-f053-4aa1-bdfe-07aa2a8a208e-kube-api-access-jflwx\") pod \"openstack-operator-index-9kl8t\" (UID: \"194a48e1-f053-4aa1-bdfe-07aa2a8a208e\") " pod="openstack-operators/openstack-operator-index-9kl8t" Mar 09 13:16:18 crc kubenswrapper[4723]: I0309 13:16:18.951209 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jflwx\" (UniqueName: \"kubernetes.io/projected/194a48e1-f053-4aa1-bdfe-07aa2a8a208e-kube-api-access-jflwx\") pod \"openstack-operator-index-9kl8t\" (UID: \"194a48e1-f053-4aa1-bdfe-07aa2a8a208e\") " pod="openstack-operators/openstack-operator-index-9kl8t" Mar 09 13:16:18 crc kubenswrapper[4723]: I0309 13:16:18.978951 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jflwx\" (UniqueName: \"kubernetes.io/projected/194a48e1-f053-4aa1-bdfe-07aa2a8a208e-kube-api-access-jflwx\") pod \"openstack-operator-index-9kl8t\" (UID: \"194a48e1-f053-4aa1-bdfe-07aa2a8a208e\") " pod="openstack-operators/openstack-operator-index-9kl8t" Mar 09 13:16:19 crc kubenswrapper[4723]: I0309 13:16:19.011072 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9kl8t" Mar 09 13:16:19 crc kubenswrapper[4723]: I0309 13:16:19.423976 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9kl8t"] Mar 09 13:16:19 crc kubenswrapper[4723]: W0309 13:16:19.428803 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194a48e1_f053_4aa1_bdfe_07aa2a8a208e.slice/crio-4b3d56726d8991d454b1e467a058c4e42cade791a8d8d35e8372e34e821501a5 WatchSource:0}: Error finding container 4b3d56726d8991d454b1e467a058c4e42cade791a8d8d35e8372e34e821501a5: Status 404 returned error can't find the container with id 4b3d56726d8991d454b1e467a058c4e42cade791a8d8d35e8372e34e821501a5 Mar 09 13:16:19 crc kubenswrapper[4723]: I0309 13:16:19.563049 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9kl8t" event={"ID":"194a48e1-f053-4aa1-bdfe-07aa2a8a208e","Type":"ContainerStarted","Data":"4b3d56726d8991d454b1e467a058c4e42cade791a8d8d35e8372e34e821501a5"} Mar 09 13:16:19 crc kubenswrapper[4723]: I0309 13:16:19.563460 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-r8zdr" podUID="28e5bb9f-e8f1-49eb-9adc-c36859e5b03f" containerName="registry-server" containerID="cri-o://0a98d6896f37901e433e4c9b9e7d981cc5996e451799b17d4d5d9e0827fbc686" gracePeriod=2 Mar 09 13:16:19 crc kubenswrapper[4723]: I0309 13:16:19.951539 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-r8zdr" Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.072231 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c956k\" (UniqueName: \"kubernetes.io/projected/28e5bb9f-e8f1-49eb-9adc-c36859e5b03f-kube-api-access-c956k\") pod \"28e5bb9f-e8f1-49eb-9adc-c36859e5b03f\" (UID: \"28e5bb9f-e8f1-49eb-9adc-c36859e5b03f\") " Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.077507 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e5bb9f-e8f1-49eb-9adc-c36859e5b03f-kube-api-access-c956k" (OuterVolumeSpecName: "kube-api-access-c956k") pod "28e5bb9f-e8f1-49eb-9adc-c36859e5b03f" (UID: "28e5bb9f-e8f1-49eb-9adc-c36859e5b03f"). InnerVolumeSpecName "kube-api-access-c956k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.173994 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c956k\" (UniqueName: \"kubernetes.io/projected/28e5bb9f-e8f1-49eb-9adc-c36859e5b03f-kube-api-access-c956k\") on node \"crc\" DevicePath \"\"" Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.573957 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9kl8t" event={"ID":"194a48e1-f053-4aa1-bdfe-07aa2a8a208e","Type":"ContainerStarted","Data":"7559ddc8b7991b0c679cbfec6d32fd009c22d91705d78205ee2ac0574eb2131f"} Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.575765 4723 generic.go:334] "Generic (PLEG): container finished" podID="28e5bb9f-e8f1-49eb-9adc-c36859e5b03f" containerID="0a98d6896f37901e433e4c9b9e7d981cc5996e451799b17d4d5d9e0827fbc686" exitCode=0 Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.575803 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-r8zdr" Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.575817 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r8zdr" event={"ID":"28e5bb9f-e8f1-49eb-9adc-c36859e5b03f","Type":"ContainerDied","Data":"0a98d6896f37901e433e4c9b9e7d981cc5996e451799b17d4d5d9e0827fbc686"} Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.575852 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r8zdr" event={"ID":"28e5bb9f-e8f1-49eb-9adc-c36859e5b03f","Type":"ContainerDied","Data":"3bab23f6cf77ceb1ad4fc9b071031147ef504122a9448f2584104da8a2a6ed09"} Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.575892 4723 scope.go:117] "RemoveContainer" containerID="0a98d6896f37901e433e4c9b9e7d981cc5996e451799b17d4d5d9e0827fbc686" Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.603405 4723 scope.go:117] "RemoveContainer" containerID="0a98d6896f37901e433e4c9b9e7d981cc5996e451799b17d4d5d9e0827fbc686" Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.604071 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9kl8t" podStartSLOduration=2.557983726 podStartE2EDuration="2.604053059s" podCreationTimestamp="2026-03-09 13:16:18 +0000 UTC" firstStartedPulling="2026-03-09 13:16:19.434782327 +0000 UTC m=+1053.449249867" lastFinishedPulling="2026-03-09 13:16:19.48085166 +0000 UTC m=+1053.495319200" observedRunningTime="2026-03-09 13:16:20.59887444 +0000 UTC m=+1054.613341990" watchObservedRunningTime="2026-03-09 13:16:20.604053059 +0000 UTC m=+1054.618520599" Mar 09 13:16:20 crc kubenswrapper[4723]: E0309 13:16:20.604167 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a98d6896f37901e433e4c9b9e7d981cc5996e451799b17d4d5d9e0827fbc686\": container with ID starting with 0a98d6896f37901e433e4c9b9e7d981cc5996e451799b17d4d5d9e0827fbc686 not found: ID does not exist" containerID="0a98d6896f37901e433e4c9b9e7d981cc5996e451799b17d4d5d9e0827fbc686" Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.604235 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a98d6896f37901e433e4c9b9e7d981cc5996e451799b17d4d5d9e0827fbc686"} err="failed to get container status \"0a98d6896f37901e433e4c9b9e7d981cc5996e451799b17d4d5d9e0827fbc686\": rpc error: code = NotFound desc = could not find container \"0a98d6896f37901e433e4c9b9e7d981cc5996e451799b17d4d5d9e0827fbc686\": container with ID starting with 0a98d6896f37901e433e4c9b9e7d981cc5996e451799b17d4d5d9e0827fbc686 not found: ID does not exist" Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.615979 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-r8zdr"] Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.621403 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-r8zdr"] Mar 09 13:16:20 crc kubenswrapper[4723]: I0309 13:16:20.889392 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e5bb9f-e8f1-49eb-9adc-c36859e5b03f" path="/var/lib/kubelet/pods/28e5bb9f-e8f1-49eb-9adc-c36859e5b03f/volumes" Mar 09 13:16:24 crc kubenswrapper[4723]: I0309 13:16:24.698621 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-jqk9s" Mar 09 13:16:29 crc kubenswrapper[4723]: I0309 13:16:29.012482 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-9kl8t" Mar 09 13:16:29 crc kubenswrapper[4723]: I0309 13:16:29.012905 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-9kl8t" Mar 09 13:16:29 crc kubenswrapper[4723]: I0309 13:16:29.056494 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-9kl8t" Mar 09 13:16:29 crc kubenswrapper[4723]: I0309 13:16:29.703881 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-9kl8t" Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.709218 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9"] Mar 09 13:16:30 crc kubenswrapper[4723]: E0309 13:16:30.709838 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e5bb9f-e8f1-49eb-9adc-c36859e5b03f" containerName="registry-server" Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.709853 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e5bb9f-e8f1-49eb-9adc-c36859e5b03f" containerName="registry-server" Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.710058 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e5bb9f-e8f1-49eb-9adc-c36859e5b03f" containerName="registry-server" Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.711528 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.712979 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-d89zw" Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.732471 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9"] Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.764587 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2b5701-dca6-45fa-8962-1d72ae98fe97-bundle\") pod \"21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9\" (UID: \"8f2b5701-dca6-45fa-8962-1d72ae98fe97\") " pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.764699 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2chbn\" (UniqueName: \"kubernetes.io/projected/8f2b5701-dca6-45fa-8962-1d72ae98fe97-kube-api-access-2chbn\") pod \"21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9\" (UID: \"8f2b5701-dca6-45fa-8962-1d72ae98fe97\") " pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.764762 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2b5701-dca6-45fa-8962-1d72ae98fe97-util\") pod \"21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9\" (UID: 
\"8f2b5701-dca6-45fa-8962-1d72ae98fe97\") " pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.866525 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2chbn\" (UniqueName: \"kubernetes.io/projected/8f2b5701-dca6-45fa-8962-1d72ae98fe97-kube-api-access-2chbn\") pod \"21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9\" (UID: \"8f2b5701-dca6-45fa-8962-1d72ae98fe97\") " pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.866607 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2b5701-dca6-45fa-8962-1d72ae98fe97-util\") pod \"21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9\" (UID: \"8f2b5701-dca6-45fa-8962-1d72ae98fe97\") " pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.866708 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2b5701-dca6-45fa-8962-1d72ae98fe97-bundle\") pod \"21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9\" (UID: \"8f2b5701-dca6-45fa-8962-1d72ae98fe97\") " pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.867224 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2b5701-dca6-45fa-8962-1d72ae98fe97-bundle\") pod \"21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9\" (UID: \"8f2b5701-dca6-45fa-8962-1d72ae98fe97\") " pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.867898 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2b5701-dca6-45fa-8962-1d72ae98fe97-util\") pod \"21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9\" (UID: \"8f2b5701-dca6-45fa-8962-1d72ae98fe97\") " pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" Mar 09 13:16:30 crc kubenswrapper[4723]: I0309 13:16:30.899537 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2chbn\" (UniqueName: \"kubernetes.io/projected/8f2b5701-dca6-45fa-8962-1d72ae98fe97-kube-api-access-2chbn\") pod \"21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9\" (UID: \"8f2b5701-dca6-45fa-8962-1d72ae98fe97\") " pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" Mar 09 13:16:31 crc kubenswrapper[4723]: I0309 13:16:31.033011 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" Mar 09 13:16:31 crc kubenswrapper[4723]: I0309 13:16:31.482892 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9"] Mar 09 13:16:31 crc kubenswrapper[4723]: I0309 13:16:31.687452 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" event={"ID":"8f2b5701-dca6-45fa-8962-1d72ae98fe97","Type":"ContainerStarted","Data":"97bc63d4ba31da9a77d28854053cb4ef41920c3846e0596aaf034c8b57eb6971"} Mar 09 13:16:31 crc kubenswrapper[4723]: I0309 13:16:31.687545 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" event={"ID":"8f2b5701-dca6-45fa-8962-1d72ae98fe97","Type":"ContainerStarted","Data":"8fb4399dcd62bbf2825345c61a1be79655967ce96580d5dc4c45d40e98058cea"} Mar 09 13:16:32 crc kubenswrapper[4723]: I0309 13:16:32.700574 4723 generic.go:334] "Generic (PLEG): container finished" podID="8f2b5701-dca6-45fa-8962-1d72ae98fe97" containerID="97bc63d4ba31da9a77d28854053cb4ef41920c3846e0596aaf034c8b57eb6971" exitCode=0 Mar 09 13:16:32 crc kubenswrapper[4723]: I0309 13:16:32.700676 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" event={"ID":"8f2b5701-dca6-45fa-8962-1d72ae98fe97","Type":"ContainerDied","Data":"97bc63d4ba31da9a77d28854053cb4ef41920c3846e0596aaf034c8b57eb6971"} Mar 09 13:16:33 crc kubenswrapper[4723]: I0309 13:16:33.715626 4723 generic.go:334] "Generic (PLEG): container finished" podID="8f2b5701-dca6-45fa-8962-1d72ae98fe97" containerID="6da96ca0e8cb75a5e70f76af94644e113e2eaaa83414048f9a8b66fac9e1b669" exitCode=0 Mar 09 13:16:33 crc kubenswrapper[4723]: I0309 13:16:33.715699 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" event={"ID":"8f2b5701-dca6-45fa-8962-1d72ae98fe97","Type":"ContainerDied","Data":"6da96ca0e8cb75a5e70f76af94644e113e2eaaa83414048f9a8b66fac9e1b669"} Mar 09 13:16:34 crc kubenswrapper[4723]: I0309 13:16:34.729140 4723 generic.go:334] "Generic (PLEG): container finished" podID="8f2b5701-dca6-45fa-8962-1d72ae98fe97" containerID="ea8fef47fc5a5a97d0e2d7b3d00c7b6ba78ed6b5fc44c1d8a748a972c25bc1a8" exitCode=0 Mar 09 13:16:34 crc kubenswrapper[4723]: I0309 13:16:34.729367 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" event={"ID":"8f2b5701-dca6-45fa-8962-1d72ae98fe97","Type":"ContainerDied","Data":"ea8fef47fc5a5a97d0e2d7b3d00c7b6ba78ed6b5fc44c1d8a748a972c25bc1a8"} Mar 09 13:16:36 crc kubenswrapper[4723]: I0309 13:16:36.112463 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" Mar 09 13:16:36 crc kubenswrapper[4723]: I0309 13:16:36.167202 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2chbn\" (UniqueName: \"kubernetes.io/projected/8f2b5701-dca6-45fa-8962-1d72ae98fe97-kube-api-access-2chbn\") pod \"8f2b5701-dca6-45fa-8962-1d72ae98fe97\" (UID: \"8f2b5701-dca6-45fa-8962-1d72ae98fe97\") " Mar 09 13:16:36 crc kubenswrapper[4723]: I0309 13:16:36.167482 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2b5701-dca6-45fa-8962-1d72ae98fe97-bundle\") pod \"8f2b5701-dca6-45fa-8962-1d72ae98fe97\" (UID: \"8f2b5701-dca6-45fa-8962-1d72ae98fe97\") " Mar 09 13:16:36 crc kubenswrapper[4723]: I0309 13:16:36.167588 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2b5701-dca6-45fa-8962-1d72ae98fe97-util\") pod \"8f2b5701-dca6-45fa-8962-1d72ae98fe97\" (UID: \"8f2b5701-dca6-45fa-8962-1d72ae98fe97\") " Mar 09 13:16:36 crc kubenswrapper[4723]: I0309 13:16:36.173481 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f2b5701-dca6-45fa-8962-1d72ae98fe97-bundle" (OuterVolumeSpecName: "bundle") pod "8f2b5701-dca6-45fa-8962-1d72ae98fe97" (UID: "8f2b5701-dca6-45fa-8962-1d72ae98fe97"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:16:36 crc kubenswrapper[4723]: I0309 13:16:36.175563 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f2b5701-dca6-45fa-8962-1d72ae98fe97-kube-api-access-2chbn" (OuterVolumeSpecName: "kube-api-access-2chbn") pod "8f2b5701-dca6-45fa-8962-1d72ae98fe97" (UID: "8f2b5701-dca6-45fa-8962-1d72ae98fe97"). InnerVolumeSpecName "kube-api-access-2chbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:16:36 crc kubenswrapper[4723]: I0309 13:16:36.185587 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f2b5701-dca6-45fa-8962-1d72ae98fe97-util" (OuterVolumeSpecName: "util") pod "8f2b5701-dca6-45fa-8962-1d72ae98fe97" (UID: "8f2b5701-dca6-45fa-8962-1d72ae98fe97"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:16:36 crc kubenswrapper[4723]: I0309 13:16:36.270433 4723 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2b5701-dca6-45fa-8962-1d72ae98fe97-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:16:36 crc kubenswrapper[4723]: I0309 13:16:36.270491 4723 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2b5701-dca6-45fa-8962-1d72ae98fe97-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:16:36 crc kubenswrapper[4723]: I0309 13:16:36.270504 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2chbn\" (UniqueName: \"kubernetes.io/projected/8f2b5701-dca6-45fa-8962-1d72ae98fe97-kube-api-access-2chbn\") on node \"crc\" DevicePath \"\"" Mar 09 13:16:36 crc kubenswrapper[4723]: I0309 13:16:36.756141 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" event={"ID":"8f2b5701-dca6-45fa-8962-1d72ae98fe97","Type":"ContainerDied","Data":"8fb4399dcd62bbf2825345c61a1be79655967ce96580d5dc4c45d40e98058cea"} Mar 09 13:16:36 crc kubenswrapper[4723]: I0309 13:16:36.756185 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fb4399dcd62bbf2825345c61a1be79655967ce96580d5dc4c45d40e98058cea" Mar 09 13:16:36 crc kubenswrapper[4723]: I0309 13:16:36.756259 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9" Mar 09 13:16:42 crc kubenswrapper[4723]: I0309 13:16:42.924301 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7d644d7fb7-r7swc"] Mar 09 13:16:42 crc kubenswrapper[4723]: E0309 13:16:42.925162 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f2b5701-dca6-45fa-8962-1d72ae98fe97" containerName="util" Mar 09 13:16:42 crc kubenswrapper[4723]: I0309 13:16:42.925176 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2b5701-dca6-45fa-8962-1d72ae98fe97" containerName="util" Mar 09 13:16:42 crc kubenswrapper[4723]: E0309 13:16:42.925201 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f2b5701-dca6-45fa-8962-1d72ae98fe97" containerName="extract" Mar 09 13:16:42 crc kubenswrapper[4723]: I0309 13:16:42.925210 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2b5701-dca6-45fa-8962-1d72ae98fe97" containerName="extract" Mar 09 13:16:42 crc kubenswrapper[4723]: E0309 13:16:42.925228 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f2b5701-dca6-45fa-8962-1d72ae98fe97" containerName="pull" Mar 09 13:16:42 crc kubenswrapper[4723]: I0309 13:16:42.925236 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2b5701-dca6-45fa-8962-1d72ae98fe97" containerName="pull" Mar 09 13:16:42 crc kubenswrapper[4723]: I0309 13:16:42.925424 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f2b5701-dca6-45fa-8962-1d72ae98fe97" containerName="extract" Mar 09 13:16:42 crc kubenswrapper[4723]: I0309 13:16:42.926088 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7d644d7fb7-r7swc" Mar 09 13:16:42 crc kubenswrapper[4723]: I0309 13:16:42.927567 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-f9c59" Mar 09 13:16:43 crc kubenswrapper[4723]: I0309 13:16:43.012449 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7d644d7fb7-r7swc"] Mar 09 13:16:43 crc kubenswrapper[4723]: I0309 13:16:43.078285 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpq4r\" (UniqueName: \"kubernetes.io/projected/b223943e-1394-48af-8f5c-78a9d370b602-kube-api-access-qpq4r\") pod \"openstack-operator-controller-init-7d644d7fb7-r7swc\" (UID: \"b223943e-1394-48af-8f5c-78a9d370b602\") " pod="openstack-operators/openstack-operator-controller-init-7d644d7fb7-r7swc" Mar 09 13:16:43 crc kubenswrapper[4723]: I0309 13:16:43.180609 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpq4r\" (UniqueName: \"kubernetes.io/projected/b223943e-1394-48af-8f5c-78a9d370b602-kube-api-access-qpq4r\") pod \"openstack-operator-controller-init-7d644d7fb7-r7swc\" (UID: \"b223943e-1394-48af-8f5c-78a9d370b602\") " pod="openstack-operators/openstack-operator-controller-init-7d644d7fb7-r7swc" Mar 09 13:16:43 crc kubenswrapper[4723]: I0309 13:16:43.200565 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpq4r\" (UniqueName: \"kubernetes.io/projected/b223943e-1394-48af-8f5c-78a9d370b602-kube-api-access-qpq4r\") pod \"openstack-operator-controller-init-7d644d7fb7-r7swc\" (UID: \"b223943e-1394-48af-8f5c-78a9d370b602\") " pod="openstack-operators/openstack-operator-controller-init-7d644d7fb7-r7swc" Mar 09 13:16:43 crc kubenswrapper[4723]: I0309 13:16:43.243192 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7d644d7fb7-r7swc" Mar 09 13:16:43 crc kubenswrapper[4723]: I0309 13:16:43.733957 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7d644d7fb7-r7swc"] Mar 09 13:16:43 crc kubenswrapper[4723]: I0309 13:16:43.844271 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7d644d7fb7-r7swc" event={"ID":"b223943e-1394-48af-8f5c-78a9d370b602","Type":"ContainerStarted","Data":"ad73460cb9bdb4196b501c36e9e6d6b29329df1797dba9b76e209f0f0aff51d8"} Mar 09 13:16:47 crc kubenswrapper[4723]: I0309 13:16:47.728588 4723 scope.go:117] "RemoveContainer" containerID="a4f3f7b89c97320355e6b1d445b6fe38c1eea5fbdd02b89e050f53194759a45f" Mar 09 13:16:47 crc kubenswrapper[4723]: I0309 13:16:47.887063 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7d644d7fb7-r7swc" event={"ID":"b223943e-1394-48af-8f5c-78a9d370b602","Type":"ContainerStarted","Data":"c3084a1970d35edc9a8299d144c191d18820e0d4f40f2d1c918dfff048851c7e"} Mar 09 13:16:47 crc kubenswrapper[4723]: I0309 13:16:47.887527 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7d644d7fb7-r7swc" Mar 09 13:16:47 crc kubenswrapper[4723]: I0309 13:16:47.913821 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7d644d7fb7-r7swc" podStartSLOduration=2.036545156 podStartE2EDuration="5.913806954s" podCreationTimestamp="2026-03-09 13:16:42 +0000 UTC" firstStartedPulling="2026-03-09 13:16:43.730726456 +0000 UTC m=+1077.745194046" lastFinishedPulling="2026-03-09 13:16:47.607988304 +0000 UTC m=+1081.622455844" observedRunningTime="2026-03-09 13:16:47.911103842 +0000 UTC m=+1081.925571382" watchObservedRunningTime="2026-03-09 13:16:47.913806954 +0000 UTC m=+1081.928274494" Mar 09 13:16:53 crc kubenswrapper[4723]: I0309 13:16:53.246546 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7d644d7fb7-r7swc" Mar 09 13:17:03 crc kubenswrapper[4723]: I0309 13:17:03.947543 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:17:03 crc kubenswrapper[4723]: I0309 13:17:03.948186 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.194111 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.196070 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.199521 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-vnmpr" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.204618 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.205896 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.207944 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-hcfh6" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.213106 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.214202 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.216380 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-w7ktj" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.224066 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.233912 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.235239 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qmjc\" (UniqueName: \"kubernetes.io/projected/6bb6b3ee-7923-42ce-b36d-dabdaa42f829-kube-api-access-2qmjc\") pod \"designate-operator-controller-manager-5d87c9d997-dwlzx\" (UID: \"6bb6b3ee-7923-42ce-b36d-dabdaa42f829\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.235278 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttzpv\" (UniqueName: \"kubernetes.io/projected/01b1451d-b917-4176-abf6-fd84021ba30d-kube-api-access-ttzpv\") pod \"cinder-operator-controller-manager-55d77d7b5c-4zwkg\" (UID: \"01b1451d-b917-4176-abf6-fd84021ba30d\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.238033 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j4h4\" (UniqueName: \"kubernetes.io/projected/6e192922-8050-41f1-bf25-33a12ace409b-kube-api-access-9j4h4\") pod \"barbican-operator-controller-manager-6db6876945-mtmcb\" (UID: \"6e192922-8050-41f1-bf25-33a12ace409b\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.243966 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.264950 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.274199 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.305444 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-x6qnt" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.310467 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.351824 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.363358 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.363662 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttzpv\" (UniqueName: \"kubernetes.io/projected/01b1451d-b917-4176-abf6-fd84021ba30d-kube-api-access-ttzpv\") pod \"cinder-operator-controller-manager-55d77d7b5c-4zwkg\" (UID: \"01b1451d-b917-4176-abf6-fd84021ba30d\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.363751 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j4h4\" (UniqueName: \"kubernetes.io/projected/6e192922-8050-41f1-bf25-33a12ace409b-kube-api-access-9j4h4\") pod \"barbican-operator-controller-manager-6db6876945-mtmcb\" (UID: \"6e192922-8050-41f1-bf25-33a12ace409b\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.363886 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qmjc\" (UniqueName: \"kubernetes.io/projected/6bb6b3ee-7923-42ce-b36d-dabdaa42f829-kube-api-access-2qmjc\") pod \"designate-operator-controller-manager-5d87c9d997-dwlzx\" (UID: \"6bb6b3ee-7923-42ce-b36d-dabdaa42f829\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.375288 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-njftd" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.383524 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.392100 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j4h4\" (UniqueName: \"kubernetes.io/projected/6e192922-8050-41f1-bf25-33a12ace409b-kube-api-access-9j4h4\") pod \"barbican-operator-controller-manager-6db6876945-mtmcb\" (UID: \"6e192922-8050-41f1-bf25-33a12ace409b\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb" Mar 09 
13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.392730 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttzpv\" (UniqueName: \"kubernetes.io/projected/01b1451d-b917-4176-abf6-fd84021ba30d-kube-api-access-ttzpv\") pod \"cinder-operator-controller-manager-55d77d7b5c-4zwkg\" (UID: \"01b1451d-b917-4176-abf6-fd84021ba30d\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.410850 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qmjc\" (UniqueName: \"kubernetes.io/projected/6bb6b3ee-7923-42ce-b36d-dabdaa42f829-kube-api-access-2qmjc\") pod \"designate-operator-controller-manager-5d87c9d997-dwlzx\" (UID: \"6bb6b3ee-7923-42ce-b36d-dabdaa42f829\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.456638 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.457720 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.463246 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-pq9w6" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.464747 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldbj2\" (UniqueName: \"kubernetes.io/projected/6bf9afff-37d5-41e4-9270-8994fc65deda-kube-api-access-ldbj2\") pod \"heat-operator-controller-manager-cf99c678f-9btqb\" (UID: \"6bf9afff-37d5-41e4-9270-8994fc65deda\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.464907 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55949\" (UniqueName: \"kubernetes.io/projected/9646c273-606f-4551-82dd-39e09007dc17-kube-api-access-55949\") pod \"glance-operator-controller-manager-64db6967f8-89rtm\" (UID: \"9646c273-606f-4551-82dd-39e09007dc17\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.467801 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.495029 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.498586 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.501461 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.501661 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-jnbzx" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.510234 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.524365 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.525709 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.526631 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.529417 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-gg56z" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.531231 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.532349 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.552826 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-tfllm" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.553266 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.557951 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.565530 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.567151 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55949\" (UniqueName: \"kubernetes.io/projected/9646c273-606f-4551-82dd-39e09007dc17-kube-api-access-55949\") pod \"glance-operator-controller-manager-64db6967f8-89rtm\" (UID: \"9646c273-606f-4551-82dd-39e09007dc17\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.567242 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wx9l\" (UniqueName: \"kubernetes.io/projected/2fc3d688-53db-4d4e-9555-54c047570ae5-kube-api-access-8wx9l\") pod \"horizon-operator-controller-manager-78bc7f9bd9-5wrr9\" (UID: \"2fc3d688-53db-4d4e-9555-54c047570ae5\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.567275 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldbj2\" (UniqueName: \"kubernetes.io/projected/6bf9afff-37d5-41e4-9270-8994fc65deda-kube-api-access-ldbj2\") pod \"heat-operator-controller-manager-cf99c678f-9btqb\" (UID: \"6bf9afff-37d5-41e4-9270-8994fc65deda\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.588949 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-96b5g"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.590275 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-96b5g" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.598366 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-ln4fc" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.607677 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55949\" (UniqueName: \"kubernetes.io/projected/9646c273-606f-4551-82dd-39e09007dc17-kube-api-access-55949\") pod \"glance-operator-controller-manager-64db6967f8-89rtm\" (UID: \"9646c273-606f-4551-82dd-39e09007dc17\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.609727 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.628412 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-96b5g"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.633648 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldbj2\" (UniqueName: \"kubernetes.io/projected/6bf9afff-37d5-41e4-9270-8994fc65deda-kube-api-access-ldbj2\") pod \"heat-operator-controller-manager-cf99c678f-9btqb\" (UID: \"6bf9afff-37d5-41e4-9270-8994fc65deda\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.651851 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-czjxc"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.653134 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-czjxc" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.661421 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.667245 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-hn74r" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.668211 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wx9l\" (UniqueName: \"kubernetes.io/projected/2fc3d688-53db-4d4e-9555-54c047570ae5-kube-api-access-8wx9l\") pod \"horizon-operator-controller-manager-78bc7f9bd9-5wrr9\" (UID: \"2fc3d688-53db-4d4e-9555-54c047570ae5\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.668250 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hlrg\" (UniqueName: \"kubernetes.io/projected/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-kube-api-access-5hlrg\") pod \"infra-operator-controller-manager-f7fcc58b9-v6s9h\" (UID: \"5ea4f771-5b0c-410d-8a6c-a45b039edb6a\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.668302 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxs8g\" (UniqueName: \"kubernetes.io/projected/02c2f97c-15b6-4c33-8be5-c61cc982e989-kube-api-access-rxs8g\") pod \"keystone-operator-controller-manager-7c789f89c6-wrqqw\" (UID: \"02c2f97c-15b6-4c33-8be5-c61cc982e989\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.668328 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-v6s9h\" (UID: \"5ea4f771-5b0c-410d-8a6c-a45b039edb6a\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.668376 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9psd\" (UniqueName: \"kubernetes.io/projected/76830983-65b6-495a-8283-c9e2df80562b-kube-api-access-w9psd\") pod \"ironic-operator-controller-manager-545456dc4-rwzzl\" (UID: \"76830983-65b6-495a-8283-c9e2df80562b\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.674663 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.676741 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.685414 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-fkfp2" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.698746 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-czjxc"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.704177 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wx9l\" (UniqueName: \"kubernetes.io/projected/2fc3d688-53db-4d4e-9555-54c047570ae5-kube-api-access-8wx9l\") pod \"horizon-operator-controller-manager-78bc7f9bd9-5wrr9\" (UID: \"2fc3d688-53db-4d4e-9555-54c047570ae5\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.734919 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.734978 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.737811 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.742519 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-wk2f2" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.743124 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.744582 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.756283 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-5blll" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.756492 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.761316 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.771950 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d6x9\" (UniqueName: \"kubernetes.io/projected/49f841ea-0808-406e-a0d0-671f5db13f93-kube-api-access-4d6x9\") pod \"mariadb-operator-controller-manager-7b6bfb6475-b2fx2\" (UID: \"49f841ea-0808-406e-a0d0-671f5db13f93\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.772001 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b998f\" (UniqueName: \"kubernetes.io/projected/afe8d0e8-415a-4f80-8b5a-c3eb45e585cd-kube-api-access-b998f\") pod \"manila-operator-controller-manager-67d996989d-96b5g\" (UID: \"afe8d0e8-415a-4f80-8b5a-c3eb45e585cd\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-96b5g" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.772033 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hlrg\" (UniqueName: \"kubernetes.io/projected/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-kube-api-access-5hlrg\") pod \"infra-operator-controller-manager-f7fcc58b9-v6s9h\" (UID: \"5ea4f771-5b0c-410d-8a6c-a45b039edb6a\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.772104 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxs8g\" (UniqueName: \"kubernetes.io/projected/02c2f97c-15b6-4c33-8be5-c61cc982e989-kube-api-access-rxs8g\") pod \"keystone-operator-controller-manager-7c789f89c6-wrqqw\" (UID: \"02c2f97c-15b6-4c33-8be5-c61cc982e989\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.772141 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-v6s9h\" (UID: \"5ea4f771-5b0c-410d-8a6c-a45b039edb6a\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.772207 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sspp\" (UniqueName: \"kubernetes.io/projected/c4d1a44c-121a-4326-9920-af7e6f87a031-kube-api-access-4sspp\") pod \"neutron-operator-controller-manager-54688575f-czjxc\" (UID: \"c4d1a44c-121a-4326-9920-af7e6f87a031\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-czjxc" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.772234 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9psd\" (UniqueName: \"kubernetes.io/projected/76830983-65b6-495a-8283-c9e2df80562b-kube-api-access-w9psd\") pod \"ironic-operator-controller-manager-545456dc4-rwzzl\" (UID: \"76830983-65b6-495a-8283-c9e2df80562b\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl" Mar 09 13:17:31 crc kubenswrapper[4723]: E0309 13:17:31.772654 4723 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 
13:17:31 crc kubenswrapper[4723]: E0309 13:17:31.772706 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert podName:5ea4f771-5b0c-410d-8a6c-a45b039edb6a nodeName:}" failed. No retries permitted until 2026-03-09 13:17:32.272685993 +0000 UTC m=+1126.287153533 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert") pod "infra-operator-controller-manager-f7fcc58b9-v6s9h" (UID: "5ea4f771-5b0c-410d-8a6c-a45b039edb6a") : secret "infra-operator-webhook-server-cert" not found Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.780566 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.781834 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.785699 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-6j65m" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.798431 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.798856 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.808958 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.810379 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.814092 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.815660 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.824974 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.842604 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-nqk2g" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.846916 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.858727 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.873114 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxhvf\" (UniqueName: \"kubernetes.io/projected/eb08b38d-0624-4bd5-a3ba-9447cdbc80fb-kube-api-access-jxhvf\") pod \"nova-operator-controller-manager-74b6b5dc96-6npjh\" (UID: \"eb08b38d-0624-4bd5-a3ba-9447cdbc80fb\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.873185 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcqsk\" (UniqueName: \"kubernetes.io/projected/1e62b006-449e-440b-b425-d56fbb171cd5-kube-api-access-lcqsk\") pod \"ovn-operator-controller-manager-75684d597f-5wrg7\" (UID: \"1e62b006-449e-440b-b425-d56fbb171cd5\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.873250 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svzh6\" (UniqueName: \"kubernetes.io/projected/a8a23c57-bff5-4820-955c-441521c1e8f2-kube-api-access-svzh6\") pod \"octavia-operator-controller-manager-5d86c7ddb7-lbqm8\" (UID: \"a8a23c57-bff5-4820-955c-441521c1e8f2\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.873296 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sspp\" (UniqueName: \"kubernetes.io/projected/c4d1a44c-121a-4326-9920-af7e6f87a031-kube-api-access-4sspp\") pod \"neutron-operator-controller-manager-54688575f-czjxc\" (UID: \"c4d1a44c-121a-4326-9920-af7e6f87a031\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-czjxc" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.873358 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d6x9\" (UniqueName: \"kubernetes.io/projected/49f841ea-0808-406e-a0d0-671f5db13f93-kube-api-access-4d6x9\") pod \"mariadb-operator-controller-manager-7b6bfb6475-b2fx2\" (UID: \"49f841ea-0808-406e-a0d0-671f5db13f93\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.873377 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b998f\" (UniqueName: \"kubernetes.io/projected/afe8d0e8-415a-4f80-8b5a-c3eb45e585cd-kube-api-access-b998f\") pod 
\"manila-operator-controller-manager-67d996989d-96b5g\" (UID: \"afe8d0e8-415a-4f80-8b5a-c3eb45e585cd\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-96b5g" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.874378 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.874556 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-w4tc2" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.883719 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.883953 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxs8g\" (UniqueName: \"kubernetes.io/projected/02c2f97c-15b6-4c33-8be5-c61cc982e989-kube-api-access-rxs8g\") pod \"keystone-operator-controller-manager-7c789f89c6-wrqqw\" (UID: \"02c2f97c-15b6-4c33-8be5-c61cc982e989\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.885030 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.895943 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.899381 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rpftx" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.923736 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hlrg\" (UniqueName: \"kubernetes.io/projected/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-kube-api-access-5hlrg\") pod \"infra-operator-controller-manager-f7fcc58b9-v6s9h\" (UID: \"5ea4f771-5b0c-410d-8a6c-a45b039edb6a\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.924136 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9psd\" (UniqueName: \"kubernetes.io/projected/76830983-65b6-495a-8283-c9e2df80562b-kube-api-access-w9psd\") pod \"ironic-operator-controller-manager-545456dc4-rwzzl\" (UID: \"76830983-65b6-495a-8283-c9e2df80562b\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.943851 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sspp\" (UniqueName: \"kubernetes.io/projected/c4d1a44c-121a-4326-9920-af7e6f87a031-kube-api-access-4sspp\") pod \"neutron-operator-controller-manager-54688575f-czjxc\" (UID: \"c4d1a44c-121a-4326-9920-af7e6f87a031\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-czjxc" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.968157 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b998f\" (UniqueName: \"kubernetes.io/projected/afe8d0e8-415a-4f80-8b5a-c3eb45e585cd-kube-api-access-b998f\") pod \"manila-operator-controller-manager-67d996989d-96b5g\" (UID: 
\"afe8d0e8-415a-4f80-8b5a-c3eb45e585cd\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-96b5g" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.969358 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d6x9\" (UniqueName: \"kubernetes.io/projected/49f841ea-0808-406e-a0d0-671f5db13f93-kube-api-access-4d6x9\") pod \"mariadb-operator-controller-manager-7b6bfb6475-b2fx2\" (UID: \"49f841ea-0808-406e-a0d0-671f5db13f93\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.976907 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxhvf\" (UniqueName: \"kubernetes.io/projected/eb08b38d-0624-4bd5-a3ba-9447cdbc80fb-kube-api-access-jxhvf\") pod \"nova-operator-controller-manager-74b6b5dc96-6npjh\" (UID: \"eb08b38d-0624-4bd5-a3ba-9447cdbc80fb\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.977008 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l5gl\" (UniqueName: \"kubernetes.io/projected/c3f17509-7e0b-452d-b3ca-0a3210159f17-kube-api-access-6l5gl\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f\" (UID: \"c3f17509-7e0b-452d-b3ca-0a3210159f17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.977039 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcqsk\" (UniqueName: \"kubernetes.io/projected/1e62b006-449e-440b-b425-d56fbb171cd5-kube-api-access-lcqsk\") pod \"ovn-operator-controller-manager-75684d597f-5wrg7\" (UID: \"1e62b006-449e-440b-b425-d56fbb171cd5\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.977093 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svzh6\" (UniqueName: \"kubernetes.io/projected/a8a23c57-bff5-4820-955c-441521c1e8f2-kube-api-access-svzh6\") pod \"octavia-operator-controller-manager-5d86c7ddb7-lbqm8\" (UID: \"a8a23c57-bff5-4820-955c-441521c1e8f2\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.977125 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnn4w\" (UniqueName: \"kubernetes.io/projected/57b972b8-b38f-4bc5-8cb5-cb2d949ff3b8-kube-api-access-lnn4w\") pod \"swift-operator-controller-manager-9b9ff9f4d-8w2sj\" (UID: \"57b972b8-b38f-4bc5-8cb5-cb2d949ff3b8\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.977181 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pqxq\" (UniqueName: \"kubernetes.io/projected/f1620e57-58ba-4313-bba4-f5ece039f9f7-kube-api-access-4pqxq\") pod \"placement-operator-controller-manager-648564c9fc-mt7hb\" (UID: \"f1620e57-58ba-4313-bba4-f5ece039f9f7\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.977222 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f\" (UID: \"c3f17509-7e0b-452d-b3ca-0a3210159f17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.977434 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2"] Mar 09 13:17:31 crc kubenswrapper[4723]: I0309 13:17:31.980626 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.004931 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-trsq8" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.011397 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.033657 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-96b5g" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.047558 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcqsk\" (UniqueName: \"kubernetes.io/projected/1e62b006-449e-440b-b425-d56fbb171cd5-kube-api-access-lcqsk\") pod \"ovn-operator-controller-manager-75684d597f-5wrg7\" (UID: \"1e62b006-449e-440b-b425-d56fbb171cd5\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.059688 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svzh6\" (UniqueName: \"kubernetes.io/projected/a8a23c57-bff5-4820-955c-441521c1e8f2-kube-api-access-svzh6\") pod \"octavia-operator-controller-manager-5d86c7ddb7-lbqm8\" (UID: \"a8a23c57-bff5-4820-955c-441521c1e8f2\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.069124 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxhvf\" (UniqueName: \"kubernetes.io/projected/eb08b38d-0624-4bd5-a3ba-9447cdbc80fb-kube-api-access-jxhvf\") pod \"nova-operator-controller-manager-74b6b5dc96-6npjh\" (UID: \"eb08b38d-0624-4bd5-a3ba-9447cdbc80fb\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.075690 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-czjxc" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.079060 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmrnp\" (UniqueName: \"kubernetes.io/projected/36e23b55-a129-4c5f-8938-26f58742541b-kube-api-access-wmrnp\") pod \"telemetry-operator-controller-manager-5b9fbd87f-ssps2\" (UID: \"36e23b55-a129-4c5f-8938-26f58742541b\") " pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.079236 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l5gl\" (UniqueName: \"kubernetes.io/projected/c3f17509-7e0b-452d-b3ca-0a3210159f17-kube-api-access-6l5gl\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f\" (UID: \"c3f17509-7e0b-452d-b3ca-0a3210159f17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.079309 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnn4w\" (UniqueName: \"kubernetes.io/projected/57b972b8-b38f-4bc5-8cb5-cb2d949ff3b8-kube-api-access-lnn4w\") pod \"swift-operator-controller-manager-9b9ff9f4d-8w2sj\" (UID: \"57b972b8-b38f-4bc5-8cb5-cb2d949ff3b8\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.079359 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pqxq\" (UniqueName: \"kubernetes.io/projected/f1620e57-58ba-4313-bba4-f5ece039f9f7-kube-api-access-4pqxq\") pod \"placement-operator-controller-manager-648564c9fc-mt7hb\" (UID: \"f1620e57-58ba-4313-bba4-f5ece039f9f7\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.079391 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f\" (UID: \"c3f17509-7e0b-452d-b3ca-0a3210159f17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:17:32 crc kubenswrapper[4723]: E0309 13:17:32.079549 4723 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:17:32 crc kubenswrapper[4723]: E0309 13:17:32.079599 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert podName:c3f17509-7e0b-452d-b3ca-0a3210159f17 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:32.579584722 +0000 UTC m=+1126.594052262 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" (UID: "c3f17509-7e0b-452d-b3ca-0a3210159f17") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.090973 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2"] Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.096487 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.125401 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pqxq\" (UniqueName: \"kubernetes.io/projected/f1620e57-58ba-4313-bba4-f5ece039f9f7-kube-api-access-4pqxq\") pod \"placement-operator-controller-manager-648564c9fc-mt7hb\" (UID: \"f1620e57-58ba-4313-bba4-f5ece039f9f7\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.126184 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnn4w\" (UniqueName: \"kubernetes.io/projected/57b972b8-b38f-4bc5-8cb5-cb2d949ff3b8-kube-api-access-lnn4w\") pod \"swift-operator-controller-manager-9b9ff9f4d-8w2sj\" (UID: \"57b972b8-b38f-4bc5-8cb5-cb2d949ff3b8\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.126282 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l5gl\" (UniqueName: \"kubernetes.io/projected/c3f17509-7e0b-452d-b3ca-0a3210159f17-kube-api-access-6l5gl\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f\" (UID: \"c3f17509-7e0b-452d-b3ca-0a3210159f17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.130873 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.151279 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.173099 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.181070 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmrnp\" (UniqueName: \"kubernetes.io/projected/36e23b55-a129-4c5f-8938-26f58742541b-kube-api-access-wmrnp\") pod \"telemetry-operator-controller-manager-5b9fbd87f-ssps2\" (UID: \"36e23b55-a129-4c5f-8938-26f58742541b\") " pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.195091 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb"] Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.197348 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.201038 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb"] Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.203813 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-md5rv" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.234462 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.235331 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmrnp\" (UniqueName: \"kubernetes.io/projected/36e23b55-a129-4c5f-8938-26f58742541b-kube-api-access-wmrnp\") pod \"telemetry-operator-controller-manager-5b9fbd87f-ssps2\" (UID: \"36e23b55-a129-4c5f-8938-26f58742541b\") " pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.263060 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc"] Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.264500 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.266897 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.272436 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-v8r2m" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.283165 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-v6s9h\" (UID: \"5ea4f771-5b0c-410d-8a6c-a45b039edb6a\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.283477 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm9jw\" (UniqueName: \"kubernetes.io/projected/8554b7c9-0bd7-4326-b906-fe07dcdce9da-kube-api-access-xm9jw\") pod \"test-operator-controller-manager-55b5ff4dbb-6qstb\" (UID: \"8554b7c9-0bd7-4326-b906-fe07dcdce9da\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" Mar 09 13:17:32 crc kubenswrapper[4723]: E0309 13:17:32.283823 4723 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 13:17:32 crc kubenswrapper[4723]: E0309 13:17:32.285620 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert podName:5ea4f771-5b0c-410d-8a6c-a45b039edb6a nodeName:}" failed. No retries permitted until 2026-03-09 13:17:33.285590468 +0000 UTC m=+1127.300058068 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert") pod "infra-operator-controller-manager-f7fcc58b9-v6s9h" (UID: "5ea4f771-5b0c-410d-8a6c-a45b039edb6a") : secret "infra-operator-webhook-server-cert" not found Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.298659 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc"] Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.308936 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.328052 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.384048 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r"] Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.385625 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.385788 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm9jw\" (UniqueName: \"kubernetes.io/projected/8554b7c9-0bd7-4326-b906-fe07dcdce9da-kube-api-access-xm9jw\") pod \"test-operator-controller-manager-55b5ff4dbb-6qstb\" (UID: \"8554b7c9-0bd7-4326-b906-fe07dcdce9da\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.385967 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwsv5\" (UniqueName: \"kubernetes.io/projected/e93b778c-c10f-4da5-a3c2-91010b4b3aab-kube-api-access-vwsv5\") pod \"watcher-operator-controller-manager-bccc79885-4czcc\" (UID: \"e93b778c-c10f-4da5-a3c2-91010b4b3aab\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.388687 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.389053 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.389390 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nrmt8" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.409635 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r"] Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.417922 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm9jw\" (UniqueName: \"kubernetes.io/projected/8554b7c9-0bd7-4326-b906-fe07dcdce9da-kube-api-access-xm9jw\") pod \"test-operator-controller-manager-55b5ff4dbb-6qstb\" (UID: \"8554b7c9-0bd7-4326-b906-fe07dcdce9da\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.427064 4723 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k56p4"] Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.428092 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k56p4" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.431230 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-gvfbm" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.441308 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k56p4"] Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.487361 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.487687 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwsv5\" (UniqueName: \"kubernetes.io/projected/e93b778c-c10f-4da5-a3c2-91010b4b3aab-kube-api-access-vwsv5\") pod \"watcher-operator-controller-manager-bccc79885-4czcc\" (UID: \"e93b778c-c10f-4da5-a3c2-91010b4b3aab\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.487724 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z58dp\" (UniqueName: \"kubernetes.io/projected/b9b75469-0c5d-47b4-b75c-28cdf8316167-kube-api-access-z58dp\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.487771 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.527402 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb"] Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.547355 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwsv5\" (UniqueName: \"kubernetes.io/projected/e93b778c-c10f-4da5-a3c2-91010b4b3aab-kube-api-access-vwsv5\") pod \"watcher-operator-controller-manager-bccc79885-4czcc\" (UID: \"e93b778c-c10f-4da5-a3c2-91010b4b3aab\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.589566 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hbsl\" (UniqueName: 
\"kubernetes.io/projected/d748e45b-6515-4e15-a776-61bbe83179c0-kube-api-access-4hbsl\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k56p4\" (UID: \"d748e45b-6515-4e15-a776-61bbe83179c0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k56p4" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.589629 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f\" (UID: \"c3f17509-7e0b-452d-b3ca-0a3210159f17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.589668 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z58dp\" (UniqueName: \"kubernetes.io/projected/b9b75469-0c5d-47b4-b75c-28cdf8316167-kube-api-access-z58dp\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.589699 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.589735 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:32 crc kubenswrapper[4723]: E0309 13:17:32.589897 4723 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:17:32 crc kubenswrapper[4723]: E0309 13:17:32.589943 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs podName:b9b75469-0c5d-47b4-b75c-28cdf8316167 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:33.089928097 +0000 UTC m=+1127.104395637 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs") pod "openstack-operator-controller-manager-76577b8ddd-8748r" (UID: "b9b75469-0c5d-47b4-b75c-28cdf8316167") : secret "webhook-server-cert" not found Mar 09 13:17:32 crc kubenswrapper[4723]: E0309 13:17:32.590191 4723 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:17:32 crc kubenswrapper[4723]: E0309 13:17:32.590213 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert podName:c3f17509-7e0b-452d-b3ca-0a3210159f17 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:33.590206775 +0000 UTC m=+1127.604674315 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" (UID: "c3f17509-7e0b-452d-b3ca-0a3210159f17") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:17:32 crc kubenswrapper[4723]: E0309 13:17:32.590370 4723 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:17:32 crc kubenswrapper[4723]: E0309 13:17:32.590390 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs podName:b9b75469-0c5d-47b4-b75c-28cdf8316167 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:33.090383889 +0000 UTC m=+1127.104851429 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs") pod "openstack-operator-controller-manager-76577b8ddd-8748r" (UID: "b9b75469-0c5d-47b4-b75c-28cdf8316167") : secret "metrics-server-cert" not found Mar 09 13:17:32 crc kubenswrapper[4723]: W0309 13:17:32.613265 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e192922_8050_41f1_bf25_33a12ace409b.slice/crio-272287b976f9c1b93b21dc736e25f75fd9a5f0c046493da877ce9bc9c5f4b352 WatchSource:0}: Error finding container 272287b976f9c1b93b21dc736e25f75fd9a5f0c046493da877ce9bc9c5f4b352: Status 404 returned error can't find the container with id 272287b976f9c1b93b21dc736e25f75fd9a5f0c046493da877ce9bc9c5f4b352 Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.617132 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z58dp\" (UniqueName: \"kubernetes.io/projected/b9b75469-0c5d-47b4-b75c-28cdf8316167-kube-api-access-z58dp\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.664815 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.690327 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.692123 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hbsl\" (UniqueName: \"kubernetes.io/projected/d748e45b-6515-4e15-a776-61bbe83179c0-kube-api-access-4hbsl\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k56p4\" (UID: \"d748e45b-6515-4e15-a776-61bbe83179c0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k56p4" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.712640 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hbsl\" (UniqueName: \"kubernetes.io/projected/d748e45b-6515-4e15-a776-61bbe83179c0-kube-api-access-4hbsl\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k56p4\" (UID: \"d748e45b-6515-4e15-a776-61bbe83179c0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k56p4" Mar 09 13:17:32 crc kubenswrapper[4723]: I0309 13:17:32.779680 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k56p4" Mar 09 13:17:33 crc kubenswrapper[4723]: W0309 13:17:33.062788 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01b1451d_b917_4176_abf6_fd84021ba30d.slice/crio-6e43be8c2eb18958d154d923c1034bfa0d2a0b1854645bb876a53cb869041951 WatchSource:0}: Error finding container 6e43be8c2eb18958d154d923c1034bfa0d2a0b1854645bb876a53cb869041951: Status 404 returned error can't find the container with id 6e43be8c2eb18958d154d923c1034bfa0d2a0b1854645bb876a53cb869041951 Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.062983 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm"] Mar 09 13:17:33 crc kubenswrapper[4723]: W0309 13:17:33.069211 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bb6b3ee_7923_42ce_b36d_dabdaa42f829.slice/crio-41ed9237bf37853c8973c08fd30167d5060f4d9a398bd4894d31fc84b94fe462 WatchSource:0}: Error finding container 41ed9237bf37853c8973c08fd30167d5060f4d9a398bd4894d31fc84b94fe462: Status 404 returned error can't find the container with id 41ed9237bf37853c8973c08fd30167d5060f4d9a398bd4894d31fc84b94fe462 Mar 09 13:17:33 crc kubenswrapper[4723]: W0309 13:17:33.069440 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9646c273_606f_4551_82dd_39e09007dc17.slice/crio-9958fa90c6416e0c2119b0e4905ef9bfb28d1ea38af15dcc791daa79adb0b348 WatchSource:0}: Error finding container 9958fa90c6416e0c2119b0e4905ef9bfb28d1ea38af15dcc791daa79adb0b348: Status 404 returned error can't find the container with id 9958fa90c6416e0c2119b0e4905ef9bfb28d1ea38af15dcc791daa79adb0b348 Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.074102 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg"] Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.081752 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx"] Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.099401 4723 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.099459 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:33 crc kubenswrapper[4723]: E0309 13:17:33.099534 4723 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:17:33 crc kubenswrapper[4723]: E0309 13:17:33.099588 4723 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:17:33 crc kubenswrapper[4723]: E0309 13:17:33.099632 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs podName:b9b75469-0c5d-47b4-b75c-28cdf8316167 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:34.099614546 +0000 UTC m=+1128.114082086 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs") pod "openstack-operator-controller-manager-76577b8ddd-8748r" (UID: "b9b75469-0c5d-47b4-b75c-28cdf8316167") : secret "metrics-server-cert" not found Mar 09 13:17:33 crc kubenswrapper[4723]: E0309 13:17:33.099649 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs podName:b9b75469-0c5d-47b4-b75c-28cdf8316167 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:34.099642627 +0000 UTC m=+1128.114110167 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs") pod "openstack-operator-controller-manager-76577b8ddd-8748r" (UID: "b9b75469-0c5d-47b4-b75c-28cdf8316167") : secret "webhook-server-cert" not found Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.303475 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-v6s9h\" (UID: \"5ea4f771-5b0c-410d-8a6c-a45b039edb6a\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:17:33 crc kubenswrapper[4723]: E0309 13:17:33.303734 4723 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 13:17:33 crc kubenswrapper[4723]: E0309 13:17:33.303955 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert podName:5ea4f771-5b0c-410d-8a6c-a45b039edb6a nodeName:}" failed. No retries permitted until 2026-03-09 13:17:35.303933638 +0000 UTC m=+1129.318401178 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert") pod "infra-operator-controller-manager-f7fcc58b9-v6s9h" (UID: "5ea4f771-5b0c-410d-8a6c-a45b039edb6a") : secret "infra-operator-webhook-server-cert" not found Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.450879 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb" event={"ID":"6e192922-8050-41f1-bf25-33a12ace409b","Type":"ContainerStarted","Data":"272287b976f9c1b93b21dc736e25f75fd9a5f0c046493da877ce9bc9c5f4b352"} Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.455372 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg" event={"ID":"01b1451d-b917-4176-abf6-fd84021ba30d","Type":"ContainerStarted","Data":"6e43be8c2eb18958d154d923c1034bfa0d2a0b1854645bb876a53cb869041951"} Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.457570 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx" event={"ID":"6bb6b3ee-7923-42ce-b36d-dabdaa42f829","Type":"ContainerStarted","Data":"41ed9237bf37853c8973c08fd30167d5060f4d9a398bd4894d31fc84b94fe462"} Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.458562 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" event={"ID":"9646c273-606f-4551-82dd-39e09007dc17","Type":"ContainerStarted","Data":"9958fa90c6416e0c2119b0e4905ef9bfb28d1ea38af15dcc791daa79adb0b348"} Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.620490 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f\" (UID: \"c3f17509-7e0b-452d-b3ca-0a3210159f17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:17:33 crc kubenswrapper[4723]: E0309 13:17:33.621213 4723 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:17:33 crc kubenswrapper[4723]: E0309 13:17:33.621316 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert podName:c3f17509-7e0b-452d-b3ca-0a3210159f17 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:35.621292646 +0000 UTC m=+1129.635760196 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" (UID: "c3f17509-7e0b-452d-b3ca-0a3210159f17") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.707033 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw"] Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.714434 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl"] Mar 09 13:17:33 crc kubenswrapper[4723]: W0309 13:17:33.720929 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76830983_65b6_495a_8283_c9e2df80562b.slice/crio-672a3c16564c17928c884ce03f26e0f18d9126a37ae6432176aa0cb1b0163edc WatchSource:0}: Error finding container 672a3c16564c17928c884ce03f26e0f18d9126a37ae6432176aa0cb1b0163edc: Status 404 returned error can't find the container with id 672a3c16564c17928c884ce03f26e0f18d9126a37ae6432176aa0cb1b0163edc Mar 09 13:17:33 crc kubenswrapper[4723]: W0309 13:17:33.722560 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafe8d0e8_415a_4f80_8b5a_c3eb45e585cd.slice/crio-11f31e535f086d4672997e1813f418e2ca7d9fa745bdc689f28be22cf8a25d12 WatchSource:0}: Error finding container 11f31e535f086d4672997e1813f418e2ca7d9fa745bdc689f28be22cf8a25d12: Status 404 returned error can't find the container with id 11f31e535f086d4672997e1813f418e2ca7d9fa745bdc689f28be22cf8a25d12 Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.724448 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-96b5g"] Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.751195 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2"] Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.779164 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9"] Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.786383 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-czjxc"] Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.797662 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb"] Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.946630 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:17:33 crc kubenswrapper[4723]: I0309 13:17:33.946680 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:17:34 crc 
kubenswrapper[4723]: I0309 13:17:34.131892 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.132034 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:34 crc kubenswrapper[4723]: E0309 13:17:34.134896 4723 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:17:34 crc kubenswrapper[4723]: E0309 13:17:34.135107 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs podName:b9b75469-0c5d-47b4-b75c-28cdf8316167 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:36.135082085 +0000 UTC m=+1130.149549625 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs") pod "openstack-operator-controller-manager-76577b8ddd-8748r" (UID: "b9b75469-0c5d-47b4-b75c-28cdf8316167") : secret "webhook-server-cert" not found Mar 09 13:17:34 crc kubenswrapper[4723]: E0309 13:17:34.141242 4723 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:17:34 crc kubenswrapper[4723]: E0309 13:17:34.141335 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs podName:b9b75469-0c5d-47b4-b75c-28cdf8316167 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:36.141312502 +0000 UTC m=+1130.155780042 (durationBeforeRetry 2s). 
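The Liveness probe failure for machine-config-daemon logged just above ("connect: connection refused" on 127.0.0.1:8798) means nothing was listening on the probed port at that instant, which usually indicates the container was still starting or mid-restart rather than a probe misconfiguration. For reference, a passing HTTP liveness target only has to answer 200 on the probed path; a minimal sketch, with the port and path taken from the log and the handler itself hypothetical:

    package main

    import (
        "log"
        "net/http"
    )

    func main() {
        // Answer the kubelet's GET /health with 200 OK.
        http.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
            w.WriteHeader(http.StatusOK)
        })
        // The probe above dials 127.0.0.1:8798.
        log.Fatal(http.ListenAndServe("127.0.0.1:8798", nil))
    }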
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs") pod "openstack-operator-controller-manager-76577b8ddd-8748r" (UID: "b9b75469-0c5d-47b4-b75c-28cdf8316167") : secret "metrics-server-cert" not found Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.193981 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8"] Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.220481 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7"] Mar 09 13:17:34 crc kubenswrapper[4723]: W0309 13:17:34.227835 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8554b7c9_0bd7_4326_b906_fe07dcdce9da.slice/crio-8685491d998f9036f35ebe4e5ff5ffb4bbe1aa41cd9b6eed567c9f27781c788c WatchSource:0}: Error finding container 8685491d998f9036f35ebe4e5ff5ffb4bbe1aa41cd9b6eed567c9f27781c788c: Status 404 returned error can't find the container with id 8685491d998f9036f35ebe4e5ff5ffb4bbe1aa41cd9b6eed567c9f27781c788c Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.235727 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb"] Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.251580 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2"] Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.275003 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh"] Mar 09 13:17:34 crc kubenswrapper[4723]: E0309 13:17:34.276069 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vwsv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-4czcc_openstack-operators(e93b778c-c10f-4da5-a3c2-91010b4b3aab): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 13:17:34 crc kubenswrapper[4723]: E0309 13:17:34.277354 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" podUID="e93b778c-c10f-4da5-a3c2-91010b4b3aab" Mar 09 13:17:34 crc kubenswrapper[4723]: E0309 13:17:34.280555 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4pqxq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-mt7hb_openstack-operators(f1620e57-58ba-4313-bba4-f5ece039f9f7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 13:17:34 crc kubenswrapper[4723]: E0309 13:17:34.282923 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" podUID="f1620e57-58ba-4313-bba4-f5ece039f9f7" Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.287955 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj"] Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.296080 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb"] Mar 09 13:17:34 crc kubenswrapper[4723]: E0309 13:17:34.297181 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jxhvf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-74b6b5dc96-6npjh_openstack-operators(eb08b38d-0624-4bd5-a3ba-9447cdbc80fb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 13:17:34 crc kubenswrapper[4723]: E0309 13:17:34.298361 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" podUID="eb08b38d-0624-4bd5-a3ba-9447cdbc80fb" Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.302454 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc"] Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.465055 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k56p4"] Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.474723 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" event={"ID":"8554b7c9-0bd7-4326-b906-fe07dcdce9da","Type":"ContainerStarted","Data":"8685491d998f9036f35ebe4e5ff5ffb4bbe1aa41cd9b6eed567c9f27781c788c"} Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.477101 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" event={"ID":"e93b778c-c10f-4da5-a3c2-91010b4b3aab","Type":"ContainerStarted","Data":"3805d2b8d4e7351ba4650ce0d0ad5e47dc8f0d3de7226622279b9f931f3c8598"} Mar 09 13:17:34 crc kubenswrapper[4723]: E0309 13:17:34.480679 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" podUID="e93b778c-c10f-4da5-a3c2-91010b4b3aab" Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.487652 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8" event={"ID":"a8a23c57-bff5-4820-955c-441521c1e8f2","Type":"ContainerStarted","Data":"527ceb7cd4be497e290d4621ee7da0e6aee788a0b4578437e2bd8b70d5615504"} Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.495628 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl" 
event={"ID":"76830983-65b6-495a-8283-c9e2df80562b","Type":"ContainerStarted","Data":"672a3c16564c17928c884ce03f26e0f18d9126a37ae6432176aa0cb1b0163edc"} Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.498012 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7" event={"ID":"1e62b006-449e-440b-b425-d56fbb171cd5","Type":"ContainerStarted","Data":"1f079bf37ab03b200eec218a4c2a061114928f018b3911e5cceb4898d21ff300"} Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.499793 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-96b5g" event={"ID":"afe8d0e8-415a-4f80-8b5a-c3eb45e585cd","Type":"ContainerStarted","Data":"11f31e535f086d4672997e1813f418e2ca7d9fa745bdc689f28be22cf8a25d12"} Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.501488 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" event={"ID":"f1620e57-58ba-4313-bba4-f5ece039f9f7","Type":"ContainerStarted","Data":"d85dad8d1af9d14559d78d125d6b6f95b773be4b27a1b445d63bf9c1b071926f"} Mar 09 13:17:34 crc kubenswrapper[4723]: E0309 13:17:34.518167 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" podUID="f1620e57-58ba-4313-bba4-f5ece039f9f7" Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.524035 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw" event={"ID":"02c2f97c-15b6-4c33-8be5-c61cc982e989","Type":"ContainerStarted","Data":"17d859e35e2ac822e192a8b6d3e150cbfae59da9661931d962f120aa073bd04f"} Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.528436 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2" event={"ID":"49f841ea-0808-406e-a0d0-671f5db13f93","Type":"ContainerStarted","Data":"ef46773cced3bb55fa4c12a4ef4d4c9f5aff73d7fb10860627b4e2b4bdbc0fdf"} Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.531264 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-czjxc" event={"ID":"c4d1a44c-121a-4326-9920-af7e6f87a031","Type":"ContainerStarted","Data":"9c88563bb46e4fe01b16fb667dbfb8214246ace4ca39aa26f6d0cf80b0596cb9"} Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.532448 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" event={"ID":"eb08b38d-0624-4bd5-a3ba-9447cdbc80fb","Type":"ContainerStarted","Data":"090cbdf96b37e001120123756e9e990a275d60537a4afefa3a736f858270ec5c"} Mar 09 13:17:34 crc kubenswrapper[4723]: E0309 13:17:34.538941 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" podUID="eb08b38d-0624-4bd5-a3ba-9447cdbc80fb" Mar 09 13:17:34 crc 
kubenswrapper[4723]: I0309 13:17:34.542557 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb" event={"ID":"6bf9afff-37d5-41e4-9270-8994fc65deda","Type":"ContainerStarted","Data":"1ee49af10eba745c6be4ece011f0b747879ecb15e38d10e6c68c9b53f1dc27d0"} Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.546670 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj" event={"ID":"57b972b8-b38f-4bc5-8cb5-cb2d949ff3b8","Type":"ContainerStarted","Data":"2e7285189126491e90bd4f6bf3370ff58bf694e0db0e812c30f3acfb87ef0bb1"} Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.558084 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2" event={"ID":"36e23b55-a129-4c5f-8938-26f58742541b","Type":"ContainerStarted","Data":"3e8974ce2cb45846bf5fbeb06990838338dfb617c87c68d37afbe066389a823f"} Mar 09 13:17:34 crc kubenswrapper[4723]: I0309 13:17:34.559602 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9" event={"ID":"2fc3d688-53db-4d4e-9555-54c047570ae5","Type":"ContainerStarted","Data":"7e4d28857470a9f9a1351729234e4b64e7bba4ea095f1ab1f87f1f90ffbb3778"} Mar 09 13:17:35 crc kubenswrapper[4723]: I0309 13:17:35.359339 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-v6s9h\" (UID: \"5ea4f771-5b0c-410d-8a6c-a45b039edb6a\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:17:35 crc kubenswrapper[4723]: E0309 13:17:35.359458 4723 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 13:17:35 crc kubenswrapper[4723]: E0309 13:17:35.359537 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert podName:5ea4f771-5b0c-410d-8a6c-a45b039edb6a nodeName:}" failed. No retries permitted until 2026-03-09 13:17:39.359519304 +0000 UTC m=+1133.373986844 (durationBeforeRetry 4s). 
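The ErrImagePull: "pull QPS exceeded" records a few entries above are the kubelet's own rate limit on image pulls, not a registry error: with this many operator pods starting at once, the pull budget (configured by the KubeletConfiguration fields registryPullQPS and registryBurst, defaulting to 5 and 10) is exhausted and the pods fall into ImagePullBackOff until a later retry succeeds. A sketch under the assumption of a simple token bucket, for illustration only and not the kubelet's actual implementation:

    package main

    import (
        "errors"
        "fmt"

        "golang.org/x/time/rate"
    )

    var errQPS = errors.New("pull QPS exceeded")

    // pullImage consumes one token per attempted pull; with no token
    // available it fails immediately, as in the log entries above.
    func pullImage(lim *rate.Limiter, image string) error {
        if !lim.Allow() {
            return errQPS
        }
        // ... the actual registry pull would happen here ...
        return nil
    }

    func main() {
        // 5 pulls/s with a burst of 10, mirroring the kubelet defaults.
        lim := rate.NewLimiter(rate.Limit(5), 10)
        for i := 0; i < 12; i++ {
            fmt.Println(i, pullImage(lim, "quay.io/example/image"))
        }
    }

The burst of simultaneous pulls drains the bucket, the excess attempts fail fast, and the periodic retries later in the log go through once tokens have refilled.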
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert") pod "infra-operator-controller-manager-f7fcc58b9-v6s9h" (UID: "5ea4f771-5b0c-410d-8a6c-a45b039edb6a") : secret "infra-operator-webhook-server-cert" not found Mar 09 13:17:35 crc kubenswrapper[4723]: I0309 13:17:35.577022 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k56p4" event={"ID":"d748e45b-6515-4e15-a776-61bbe83179c0","Type":"ContainerStarted","Data":"b76e7d9d9e5dc72fd3ee72057d710d77864ff1259d6b029a9e164fed17b79245"} Mar 09 13:17:35 crc kubenswrapper[4723]: E0309 13:17:35.578748 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" podUID="f1620e57-58ba-4313-bba4-f5ece039f9f7" Mar 09 13:17:35 crc kubenswrapper[4723]: E0309 13:17:35.578841 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" podUID="e93b778c-c10f-4da5-a3c2-91010b4b3aab" Mar 09 13:17:35 crc kubenswrapper[4723]: E0309 13:17:35.586128 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" podUID="eb08b38d-0624-4bd5-a3ba-9447cdbc80fb" Mar 09 13:17:35 crc kubenswrapper[4723]: I0309 13:17:35.664047 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f\" (UID: \"c3f17509-7e0b-452d-b3ca-0a3210159f17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:17:35 crc kubenswrapper[4723]: E0309 13:17:35.664197 4723 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:17:35 crc kubenswrapper[4723]: E0309 13:17:35.664257 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert podName:c3f17509-7e0b-452d-b3ca-0a3210159f17 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:39.664243044 +0000 UTC m=+1133.678710584 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" (UID: "c3f17509-7e0b-452d-b3ca-0a3210159f17") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:17:36 crc kubenswrapper[4723]: I0309 13:17:36.175750 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:36 crc kubenswrapper[4723]: E0309 13:17:36.175946 4723 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:17:36 crc kubenswrapper[4723]: E0309 13:17:36.176527 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs podName:b9b75469-0c5d-47b4-b75c-28cdf8316167 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:40.17650472 +0000 UTC m=+1134.190972260 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs") pod "openstack-operator-controller-manager-76577b8ddd-8748r" (UID: "b9b75469-0c5d-47b4-b75c-28cdf8316167") : secret "webhook-server-cert" not found Mar 09 13:17:36 crc kubenswrapper[4723]: I0309 13:17:36.176561 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:36 crc kubenswrapper[4723]: E0309 13:17:36.176802 4723 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:17:36 crc kubenswrapper[4723]: E0309 13:17:36.176912 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs podName:b9b75469-0c5d-47b4-b75c-28cdf8316167 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:40.176893701 +0000 UTC m=+1134.191361241 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs") pod "openstack-operator-controller-manager-76577b8ddd-8748r" (UID: "b9b75469-0c5d-47b4-b75c-28cdf8316167") : secret "metrics-server-cert" not found Mar 09 13:17:39 crc kubenswrapper[4723]: I0309 13:17:39.447014 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-v6s9h\" (UID: \"5ea4f771-5b0c-410d-8a6c-a45b039edb6a\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:17:39 crc kubenswrapper[4723]: E0309 13:17:39.447254 4723 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 13:17:39 crc kubenswrapper[4723]: E0309 13:17:39.447578 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert podName:5ea4f771-5b0c-410d-8a6c-a45b039edb6a nodeName:}" failed. No retries permitted until 2026-03-09 13:17:47.447553306 +0000 UTC m=+1141.462020856 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert") pod "infra-operator-controller-manager-f7fcc58b9-v6s9h" (UID: "5ea4f771-5b0c-410d-8a6c-a45b039edb6a") : secret "infra-operator-webhook-server-cert" not found Mar 09 13:17:39 crc kubenswrapper[4723]: I0309 13:17:39.753686 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f\" (UID: \"c3f17509-7e0b-452d-b3ca-0a3210159f17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:17:39 crc kubenswrapper[4723]: E0309 13:17:39.753849 4723 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:17:39 crc kubenswrapper[4723]: E0309 13:17:39.754384 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert podName:c3f17509-7e0b-452d-b3ca-0a3210159f17 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:47.754365441 +0000 UTC m=+1141.768832981 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" (UID: "c3f17509-7e0b-452d-b3ca-0a3210159f17") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:17:40 crc kubenswrapper[4723]: I0309 13:17:40.265152 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:40 crc kubenswrapper[4723]: I0309 13:17:40.265824 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:40 crc kubenswrapper[4723]: E0309 13:17:40.266033 4723 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:17:40 crc kubenswrapper[4723]: E0309 13:17:40.266080 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs podName:b9b75469-0c5d-47b4-b75c-28cdf8316167 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:48.266065463 +0000 UTC m=+1142.280533003 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs") pod "openstack-operator-controller-manager-76577b8ddd-8748r" (UID: "b9b75469-0c5d-47b4-b75c-28cdf8316167") : secret "webhook-server-cert" not found Mar 09 13:17:40 crc kubenswrapper[4723]: E0309 13:17:40.266333 4723 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:17:40 crc kubenswrapper[4723]: E0309 13:17:40.266415 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs podName:b9b75469-0c5d-47b4-b75c-28cdf8316167 nodeName:}" failed. No retries permitted until 2026-03-09 13:17:48.266398102 +0000 UTC m=+1142.280865642 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs") pod "openstack-operator-controller-manager-76577b8ddd-8748r" (UID: "b9b75469-0c5d-47b4-b75c-28cdf8316167") : secret "metrics-server-cert" not found Mar 09 13:17:45 crc kubenswrapper[4723]: E0309 13:17:45.613231 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3" Mar 09 13:17:45 crc kubenswrapper[4723]: E0309 13:17:45.614352 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8wx9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-78bc7f9bd9-5wrr9_openstack-operators(2fc3d688-53db-4d4e-9555-54c047570ae5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:17:45 crc kubenswrapper[4723]: E0309 13:17:45.615520 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9" 
podUID="2fc3d688-53db-4d4e-9555-54c047570ae5" Mar 09 13:17:45 crc kubenswrapper[4723]: E0309 13:17:45.686632 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9" podUID="2fc3d688-53db-4d4e-9555-54c047570ae5" Mar 09 13:17:46 crc kubenswrapper[4723]: E0309 13:17:46.662526 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505" Mar 09 13:17:46 crc kubenswrapper[4723]: E0309 13:17:46.662740 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4d6x9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-7b6bfb6475-b2fx2_openstack-operators(49f841ea-0808-406e-a0d0-671f5db13f93): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:17:46 crc kubenswrapper[4723]: E0309 13:17:46.665055 4723 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2" podUID="49f841ea-0808-406e-a0d0-671f5db13f93" Mar 09 13:17:46 crc kubenswrapper[4723]: E0309 13:17:46.697091 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2" podUID="49f841ea-0808-406e-a0d0-671f5db13f93" Mar 09 13:17:47 crc kubenswrapper[4723]: E0309 13:17:47.301793 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051" Mar 09 13:17:47 crc kubenswrapper[4723]: E0309 13:17:47.302359 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-55949,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-64db6967f8-89rtm_openstack-operators(9646c273-606f-4551-82dd-39e09007dc17): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:17:47 crc kubenswrapper[4723]: E0309 13:17:47.303562 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" podUID="9646c273-606f-4551-82dd-39e09007dc17" Mar 09 13:17:47 crc kubenswrapper[4723]: I0309 13:17:47.508864 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-v6s9h\" (UID: \"5ea4f771-5b0c-410d-8a6c-a45b039edb6a\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:17:47 crc kubenswrapper[4723]: E0309 13:17:47.509495 4723 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 13:17:47 crc kubenswrapper[4723]: E0309 13:17:47.510013 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert podName:5ea4f771-5b0c-410d-8a6c-a45b039edb6a nodeName:}" failed. No retries permitted until 2026-03-09 13:18:03.509832212 +0000 UTC m=+1157.524299832 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert") pod "infra-operator-controller-manager-f7fcc58b9-v6s9h" (UID: "5ea4f771-5b0c-410d-8a6c-a45b039edb6a") : secret "infra-operator-webhook-server-cert" not found Mar 09 13:17:47 crc kubenswrapper[4723]: E0309 13:17:47.701152 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051\\\"\"" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" podUID="9646c273-606f-4551-82dd-39e09007dc17" Mar 09 13:17:47 crc kubenswrapper[4723]: I0309 13:17:47.814318 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f\" (UID: \"c3f17509-7e0b-452d-b3ca-0a3210159f17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:17:47 crc kubenswrapper[4723]: E0309 13:17:47.814531 4723 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:17:47 crc kubenswrapper[4723]: E0309 13:17:47.814615 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert podName:c3f17509-7e0b-452d-b3ca-0a3210159f17 nodeName:}" failed. No retries permitted until 2026-03-09 13:18:03.814594143 +0000 UTC m=+1157.829061683 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" (UID: "c3f17509-7e0b-452d-b3ca-0a3210159f17") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:17:48 crc kubenswrapper[4723]: I0309 13:17:48.323400 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:48 crc kubenswrapper[4723]: I0309 13:17:48.323854 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:17:48 crc kubenswrapper[4723]: E0309 13:17:48.323610 4723 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:17:48 crc kubenswrapper[4723]: E0309 13:17:48.324029 4723 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:17:48 crc kubenswrapper[4723]: E0309 13:17:48.324075 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs podName:b9b75469-0c5d-47b4-b75c-28cdf8316167 nodeName:}" failed. No retries permitted until 2026-03-09 13:18:04.324054576 +0000 UTC m=+1158.338522216 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs") pod "openstack-operator-controller-manager-76577b8ddd-8748r" (UID: "b9b75469-0c5d-47b4-b75c-28cdf8316167") : secret "webhook-server-cert" not found Mar 09 13:17:48 crc kubenswrapper[4723]: E0309 13:17:48.324477 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs podName:b9b75469-0c5d-47b4-b75c-28cdf8316167 nodeName:}" failed. No retries permitted until 2026-03-09 13:18:04.324460487 +0000 UTC m=+1158.338928117 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs") pod "openstack-operator-controller-manager-76577b8ddd-8748r" (UID: "b9b75469-0c5d-47b4-b75c-28cdf8316167") : secret "metrics-server-cert" not found Mar 09 13:17:48 crc kubenswrapper[4723]: E0309 13:17:48.926826 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3" Mar 09 13:17:48 crc kubenswrapper[4723]: E0309 13:17:48.927100 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ttzpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-55d77d7b5c-4zwkg_openstack-operators(01b1451d-b917-4176-abf6-fd84021ba30d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:17:48 crc kubenswrapper[4723]: E0309 13:17:48.928329 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg" 
podUID="01b1451d-b917-4176-abf6-fd84021ba30d" Mar 09 13:17:49 crc kubenswrapper[4723]: E0309 13:17:49.628184 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26" Mar 09 13:17:49 crc kubenswrapper[4723]: E0309 13:17:49.628627 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b998f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-67d996989d-96b5g_openstack-operators(afe8d0e8-415a-4f80-8b5a-c3eb45e585cd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:17:49 crc kubenswrapper[4723]: E0309 13:17:49.629772 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-96b5g" podUID="afe8d0e8-415a-4f80-8b5a-c3eb45e585cd" Mar 09 13:17:49 crc kubenswrapper[4723]: E0309 13:17:49.719800 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26\\\"\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-96b5g" podUID="afe8d0e8-415a-4f80-8b5a-c3eb45e585cd" Mar 09 13:17:49 crc kubenswrapper[4723]: E0309 13:17:49.719812 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg" podUID="01b1451d-b917-4176-abf6-fd84021ba30d" Mar 09 13:17:50 crc kubenswrapper[4723]: E0309 13:17:50.957025 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7" Mar 09 13:17:50 crc kubenswrapper[4723]: E0309 13:17:50.957535 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lnn4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9b9ff9f4d-8w2sj_openstack-operators(57b972b8-b38f-4bc5-8cb5-cb2d949ff3b8): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:17:50 crc kubenswrapper[4723]: E0309 13:17:50.958829 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj" podUID="57b972b8-b38f-4bc5-8cb5-cb2d949ff3b8" Mar 09 13:17:51 crc kubenswrapper[4723]: E0309 13:17:51.739029 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj" podUID="57b972b8-b38f-4bc5-8cb5-cb2d949ff3b8" Mar 09 13:17:52 crc kubenswrapper[4723]: E0309 13:17:52.298286 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c" Mar 09 13:17:52 crc kubenswrapper[4723]: E0309 13:17:52.299177 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lcqsk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-75684d597f-5wrg7_openstack-operators(1e62b006-449e-440b-b425-d56fbb171cd5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:17:52 crc kubenswrapper[4723]: E0309 13:17:52.300490 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7" podUID="1e62b006-449e-440b-b425-d56fbb171cd5" Mar 09 13:17:52 crc kubenswrapper[4723]: E0309 13:17:52.746103 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7" podUID="1e62b006-449e-440b-b425-d56fbb171cd5" Mar 09 13:17:52 crc kubenswrapper[4723]: E0309 13:17:52.943280 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd" Mar 09 13:17:52 crc kubenswrapper[4723]: E0309 13:17:52.943553 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-svzh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5d86c7ddb7-lbqm8_openstack-operators(a8a23c57-bff5-4820-955c-441521c1e8f2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:17:52 crc kubenswrapper[4723]: E0309 13:17:52.945709 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8" podUID="a8a23c57-bff5-4820-955c-441521c1e8f2" Mar 09 13:17:53 crc kubenswrapper[4723]: E0309 13:17:53.557620 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c" Mar 09 13:17:53 crc kubenswrapper[4723]: E0309 13:17:53.558164 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rxs8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7c789f89c6-wrqqw_openstack-operators(02c2f97c-15b6-4c33-8be5-c61cc982e989): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:17:53 crc kubenswrapper[4723]: E0309 13:17:53.559402 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw" podUID="02c2f97c-15b6-4c33-8be5-c61cc982e989" Mar 09 13:17:53 crc kubenswrapper[4723]: E0309 13:17:53.752806 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw" podUID="02c2f97c-15b6-4c33-8be5-c61cc982e989" Mar 09 13:17:53 crc kubenswrapper[4723]: E0309 13:17:53.753276 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8" podUID="a8a23c57-bff5-4820-955c-441521c1e8f2" Mar 09 13:17:55 crc kubenswrapper[4723]: E0309 13:17:55.183549 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968" Mar 09 13:17:55 crc kubenswrapper[4723]: E0309 13:17:55.183706 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xm9jw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-55b5ff4dbb-6qstb_openstack-operators(8554b7c9-0bd7-4326-b906-fe07dcdce9da): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:17:55 crc kubenswrapper[4723]: E0309 13:17:55.185271 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" podUID="8554b7c9-0bd7-4326-b906-fe07dcdce9da" Mar 09 13:17:55 crc kubenswrapper[4723]: E0309 13:17:55.270172 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:8ccfcdb23140e93b912bb23903a7d6fafb754e30" Mar 09 13:17:55 crc kubenswrapper[4723]: E0309 13:17:55.270221 4723 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:8ccfcdb23140e93b912bb23903a7d6fafb754e30" Mar 09 13:17:55 crc kubenswrapper[4723]: E0309 13:17:55.270343 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:8ccfcdb23140e93b912bb23903a7d6fafb754e30,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wmrnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5b9fbd87f-ssps2_openstack-operators(36e23b55-a129-4c5f-8938-26f58742541b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:17:55 crc kubenswrapper[4723]: E0309 13:17:55.278038 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2" podUID="36e23b55-a129-4c5f-8938-26f58742541b" Mar 09 13:17:55 crc kubenswrapper[4723]: E0309 13:17:55.768137 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" podUID="8554b7c9-0bd7-4326-b906-fe07dcdce9da" Mar 09 13:17:55 crc kubenswrapper[4723]: E0309 13:17:55.768266 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:8ccfcdb23140e93b912bb23903a7d6fafb754e30\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2" 
podUID="36e23b55-a129-4c5f-8938-26f58742541b" Mar 09 13:17:56 crc kubenswrapper[4723]: E0309 13:17:56.984236 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 09 13:17:56 crc kubenswrapper[4723]: E0309 13:17:56.984837 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4hbsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-k56p4_openstack-operators(d748e45b-6515-4e15-a776-61bbe83179c0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:17:56 crc kubenswrapper[4723]: E0309 13:17:56.986029 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k56p4" podUID="d748e45b-6515-4e15-a776-61bbe83179c0" Mar 09 13:17:57 crc kubenswrapper[4723]: E0309 13:17:57.788835 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k56p4" podUID="d748e45b-6515-4e15-a776-61bbe83179c0" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.792021 4723 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-czjxc" event={"ID":"c4d1a44c-121a-4326-9920-af7e6f87a031","Type":"ContainerStarted","Data":"5c35fd19105b460331a1e6a445879b8d1fe09d02e7916d500b022437204af60e"} Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.793339 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-czjxc" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.794727 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx" event={"ID":"6bb6b3ee-7923-42ce-b36d-dabdaa42f829","Type":"ContainerStarted","Data":"63ff4492e21464c82527d71d1415ea71ec591c9d1dcae83bf94d7422453f17f6"} Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.795277 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.796674 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" event={"ID":"f1620e57-58ba-4313-bba4-f5ece039f9f7","Type":"ContainerStarted","Data":"8ad4c959f8dbecd98ae7321f156084dd9d444a3585d403dc97a7594c5ff09d38"} Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.797228 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.799054 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" event={"ID":"eb08b38d-0624-4bd5-a3ba-9447cdbc80fb","Type":"ContainerStarted","Data":"958f6ffe05e868abab4ddbae630f85f9fc791c6831a28aef4066fa19d43db817"} Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.799229 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.800289 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb" event={"ID":"6e192922-8050-41f1-bf25-33a12ace409b","Type":"ContainerStarted","Data":"6dda417b94c7abbadf617fff706260da7d5d5f333132b8bd7f4bb6b21e1d1fa5"} Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.800790 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.802380 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb" event={"ID":"6bf9afff-37d5-41e4-9270-8994fc65deda","Type":"ContainerStarted","Data":"1071acd68a73100a44a94704c88bc3c3a5224c46c203d3d947900aa271884ad9"} Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.802486 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.803988 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" 
event={"ID":"e93b778c-c10f-4da5-a3c2-91010b4b3aab","Type":"ContainerStarted","Data":"eeb00507cd9bc181481808834da541051e92c09a0e8e7b7147ba1ed0a2558a4f"} Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.804605 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.806278 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl" event={"ID":"76830983-65b6-495a-8283-c9e2df80562b","Type":"ContainerStarted","Data":"1c14795ee2341bcf74fa37c10784f46ef9d51a8a58a94a19e2e1dcddd12c94c4"} Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.806427 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.825624 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-czjxc" podStartSLOduration=4.6218149969999995 podStartE2EDuration="27.825604164s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:33.758149931 +0000 UTC m=+1127.772617471" lastFinishedPulling="2026-03-09 13:17:56.961939098 +0000 UTC m=+1150.976406638" observedRunningTime="2026-03-09 13:17:58.819986113 +0000 UTC m=+1152.834453673" watchObservedRunningTime="2026-03-09 13:17:58.825604164 +0000 UTC m=+1152.840071724" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.850529 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" podStartSLOduration=4.553459438 podStartE2EDuration="27.850507961s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:34.297055542 +0000 UTC m=+1128.311523082" lastFinishedPulling="2026-03-09 13:17:57.594104065 +0000 UTC m=+1151.608571605" observedRunningTime="2026-03-09 13:17:58.842130926 +0000 UTC m=+1152.856598486" watchObservedRunningTime="2026-03-09 13:17:58.850507961 +0000 UTC m=+1152.864975511" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.868206 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx" podStartSLOduration=5.6937225609999995 podStartE2EDuration="27.868184034s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:33.073592749 +0000 UTC m=+1127.088060279" lastFinishedPulling="2026-03-09 13:17:55.248054222 +0000 UTC m=+1149.262521752" observedRunningTime="2026-03-09 13:17:58.862464911 +0000 UTC m=+1152.876932451" watchObservedRunningTime="2026-03-09 13:17:58.868184034 +0000 UTC m=+1152.882651584" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.879241 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" podStartSLOduration=4.476107447 podStartE2EDuration="27.87922639s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:34.275931727 +0000 UTC m=+1128.290399267" lastFinishedPulling="2026-03-09 13:17:57.67905067 +0000 UTC m=+1151.693518210" observedRunningTime="2026-03-09 13:17:58.873521317 +0000 UTC m=+1152.887988857" watchObservedRunningTime="2026-03-09 13:17:58.87922639 +0000 UTC 
m=+1152.893693930" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.920364 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl" podStartSLOduration=4.695384757 podStartE2EDuration="27.920341501s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:33.73720818 +0000 UTC m=+1127.751675720" lastFinishedPulling="2026-03-09 13:17:56.962164914 +0000 UTC m=+1150.976632464" observedRunningTime="2026-03-09 13:17:58.890209304 +0000 UTC m=+1152.904676844" watchObservedRunningTime="2026-03-09 13:17:58.920341501 +0000 UTC m=+1152.934809061" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.961250 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" podStartSLOduration=4.52853289 podStartE2EDuration="27.961229626s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:34.280434927 +0000 UTC m=+1128.294902467" lastFinishedPulling="2026-03-09 13:17:57.713131663 +0000 UTC m=+1151.727599203" observedRunningTime="2026-03-09 13:17:58.916636611 +0000 UTC m=+1152.931104141" watchObservedRunningTime="2026-03-09 13:17:58.961229626 +0000 UTC m=+1152.975697166" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.972451 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb" podStartSLOduration=6.556043603 podStartE2EDuration="27.972409965s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:33.759948309 +0000 UTC m=+1127.774415849" lastFinishedPulling="2026-03-09 13:17:55.176314661 +0000 UTC m=+1149.190782211" observedRunningTime="2026-03-09 13:17:58.941759244 +0000 UTC m=+1152.956226784" watchObservedRunningTime="2026-03-09 13:17:58.972409965 +0000 UTC m=+1152.986877505" Mar 09 13:17:58 crc kubenswrapper[4723]: I0309 13:17:58.989945 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb" podStartSLOduration=5.435589109 podStartE2EDuration="27.989926584s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:32.621979116 +0000 UTC m=+1126.636446646" lastFinishedPulling="2026-03-09 13:17:55.176316581 +0000 UTC m=+1149.190784121" observedRunningTime="2026-03-09 13:17:58.957161327 +0000 UTC m=+1152.971628867" watchObservedRunningTime="2026-03-09 13:17:58.989926584 +0000 UTC m=+1153.004394124" Mar 09 13:17:59 crc kubenswrapper[4723]: I0309 13:17:59.816743 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2" event={"ID":"49f841ea-0808-406e-a0d0-671f5db13f93","Type":"ContainerStarted","Data":"ed414b9e883c5a83dab862b8b6ce709e77421728163788c9a8952caf36a6486c"} Mar 09 13:17:59 crc kubenswrapper[4723]: I0309 13:17:59.817425 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2" Mar 09 13:17:59 crc kubenswrapper[4723]: I0309 13:17:59.818946 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9" 
event={"ID":"2fc3d688-53db-4d4e-9555-54c047570ae5","Type":"ContainerStarted","Data":"2aae66a42f39e3d943c924bcf2b049beb151c7241e47b38d3f7bbc8fb0209809"} Mar 09 13:17:59 crc kubenswrapper[4723]: I0309 13:17:59.839191 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2" podStartSLOduration=3.120241718 podStartE2EDuration="28.839173656s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:33.729304449 +0000 UTC m=+1127.743771989" lastFinishedPulling="2026-03-09 13:17:59.448236387 +0000 UTC m=+1153.462703927" observedRunningTime="2026-03-09 13:17:59.835496347 +0000 UTC m=+1153.849963887" watchObservedRunningTime="2026-03-09 13:17:59.839173656 +0000 UTC m=+1153.853641196" Mar 09 13:17:59 crc kubenswrapper[4723]: I0309 13:17:59.853105 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9" podStartSLOduration=3.166881086 podStartE2EDuration="28.853084648s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:33.76071623 +0000 UTC m=+1127.775183770" lastFinishedPulling="2026-03-09 13:17:59.446919782 +0000 UTC m=+1153.461387332" observedRunningTime="2026-03-09 13:17:59.850835388 +0000 UTC m=+1153.865302928" watchObservedRunningTime="2026-03-09 13:17:59.853084648 +0000 UTC m=+1153.867552178" Mar 09 13:18:00 crc kubenswrapper[4723]: I0309 13:18:00.141970 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551038-49mwf"] Mar 09 13:18:00 crc kubenswrapper[4723]: I0309 13:18:00.142966 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551038-49mwf" Mar 09 13:18:00 crc kubenswrapper[4723]: I0309 13:18:00.146259 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:18:00 crc kubenswrapper[4723]: I0309 13:18:00.146317 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:18:00 crc kubenswrapper[4723]: I0309 13:18:00.146476 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:18:00 crc kubenswrapper[4723]: I0309 13:18:00.153209 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551038-49mwf"] Mar 09 13:18:00 crc kubenswrapper[4723]: I0309 13:18:00.266232 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfwn7\" (UniqueName: \"kubernetes.io/projected/75404ce0-2053-477b-8324-d967c2dff0e9-kube-api-access-cfwn7\") pod \"auto-csr-approver-29551038-49mwf\" (UID: \"75404ce0-2053-477b-8324-d967c2dff0e9\") " pod="openshift-infra/auto-csr-approver-29551038-49mwf" Mar 09 13:18:00 crc kubenswrapper[4723]: I0309 13:18:00.367125 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfwn7\" (UniqueName: \"kubernetes.io/projected/75404ce0-2053-477b-8324-d967c2dff0e9-kube-api-access-cfwn7\") pod \"auto-csr-approver-29551038-49mwf\" (UID: \"75404ce0-2053-477b-8324-d967c2dff0e9\") " pod="openshift-infra/auto-csr-approver-29551038-49mwf" Mar 09 13:18:00 crc kubenswrapper[4723]: I0309 13:18:00.387828 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cfwn7\" (UniqueName: \"kubernetes.io/projected/75404ce0-2053-477b-8324-d967c2dff0e9-kube-api-access-cfwn7\") pod \"auto-csr-approver-29551038-49mwf\" (UID: \"75404ce0-2053-477b-8324-d967c2dff0e9\") " pod="openshift-infra/auto-csr-approver-29551038-49mwf" Mar 09 13:18:00 crc kubenswrapper[4723]: I0309 13:18:00.458735 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551038-49mwf" Mar 09 13:18:01 crc kubenswrapper[4723]: I0309 13:18:01.799065 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9" Mar 09 13:18:02 crc kubenswrapper[4723]: I0309 13:18:02.122582 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551038-49mwf"] Mar 09 13:18:02 crc kubenswrapper[4723]: I0309 13:18:02.867762 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551038-49mwf" event={"ID":"75404ce0-2053-477b-8324-d967c2dff0e9","Type":"ContainerStarted","Data":"a175cc5b5883541a89ed9375ebb9dd7977e059161420482673e77a09754bcd7a"} Mar 09 13:18:03 crc kubenswrapper[4723]: I0309 13:18:03.537192 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-v6s9h\" (UID: \"5ea4f771-5b0c-410d-8a6c-a45b039edb6a\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:18:03 crc kubenswrapper[4723]: I0309 13:18:03.543577 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea4f771-5b0c-410d-8a6c-a45b039edb6a-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-v6s9h\" (UID: \"5ea4f771-5b0c-410d-8a6c-a45b039edb6a\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:18:03 crc kubenswrapper[4723]: I0309 13:18:03.627318 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-jnbzx" Mar 09 13:18:03 crc kubenswrapper[4723]: I0309 13:18:03.635237 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.851681 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f\" (UID: \"c3f17509-7e0b-452d-b3ca-0a3210159f17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.859388 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f17509-7e0b-452d-b3ca-0a3210159f17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f\" (UID: \"c3f17509-7e0b-452d-b3ca-0a3210159f17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.877942 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg" event={"ID":"01b1451d-b917-4176-abf6-fd84021ba30d","Type":"ContainerStarted","Data":"ea1cfaf1b42205e5232643357b1959141be5c218b95aab35a40c0fb5d83545d7"} Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.878156 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.881593 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" event={"ID":"9646c273-606f-4551-82dd-39e09007dc17","Type":"ContainerStarted","Data":"2a911db2e9c8a8ec8d405c7ee347dd067bb3e4c97c63cb99532cedb03e9219ea"} Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.882498 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.885636 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-96b5g" event={"ID":"afe8d0e8-415a-4f80-8b5a-c3eb45e585cd","Type":"ContainerStarted","Data":"b1087f142c97ec62a8d132c00054962e2c28af777864a855fcc7e2be4dc531d8"} Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.885831 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-96b5g" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.944411 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" podStartSLOduration=2.8833886140000002 podStartE2EDuration="32.944396768s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:33.071537254 +0000 UTC m=+1127.086004784" lastFinishedPulling="2026-03-09 13:18:03.132545398 +0000 UTC m=+1157.147012938" observedRunningTime="2026-03-09 13:18:03.940048402 +0000 UTC m=+1157.954515942" watchObservedRunningTime="2026-03-09 13:18:03.944396768 +0000 UTC m=+1157.958864308" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.946324 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.946352 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.946381 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.947075 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a07909afd95b9a1ee1329ff07b5736e303acd66573a389c66b14a13e53a70f9f"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.947122 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://a07909afd95b9a1ee1329ff07b5736e303acd66573a389c66b14a13e53a70f9f" gracePeriod=600 Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.947133 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg" podStartSLOduration=3.044986002 podStartE2EDuration="32.947126011s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:33.065186244 +0000 UTC m=+1127.079653784" lastFinishedPulling="2026-03-09 13:18:02.967326253 +0000 UTC m=+1156.981793793" observedRunningTime="2026-03-09 13:18:03.900229145 +0000 UTC m=+1157.914696685" watchObservedRunningTime="2026-03-09 13:18:03.947126011 +0000 UTC m=+1157.961593551" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:03.959902 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-96b5g" podStartSLOduration=3.662782656 podStartE2EDuration="32.959887793s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:33.729050472 +0000 UTC m=+1127.743518002" lastFinishedPulling="2026-03-09 13:18:03.026155579 +0000 UTC m=+1157.040623139" observedRunningTime="2026-03-09 13:18:03.956964925 +0000 UTC m=+1157.971432475" watchObservedRunningTime="2026-03-09 13:18:03.959887793 +0000 UTC m=+1157.974355323" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.091817 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-nqk2g" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.100436 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.366549 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.367557 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.372253 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-metrics-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.372969 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9b75469-0c5d-47b4-b75c-28cdf8316167-webhook-certs\") pod \"openstack-operator-controller-manager-76577b8ddd-8748r\" (UID: \"b9b75469-0c5d-47b4-b75c-28cdf8316167\") " pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.572233 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nrmt8" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.576001 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.853342 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h"] Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.910154 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f"] Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.914612 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="a07909afd95b9a1ee1329ff07b5736e303acd66573a389c66b14a13e53a70f9f" exitCode=0 Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.914700 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"a07909afd95b9a1ee1329ff07b5736e303acd66573a389c66b14a13e53a70f9f"} Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.914741 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"37a6af4ad4a336694755a90bae29c7dad0bac535fc07da2bdf95f50123da1b17"} Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.914767 4723 scope.go:117] "RemoveContainer" containerID="4d25501b3a9b23fada0109d3a471f491cc22bbb00f111c3efbddd551e1408485" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.918556 4723 generic.go:334] "Generic (PLEG): container finished" podID="75404ce0-2053-477b-8324-d967c2dff0e9" containerID="a6f50c1b5e999531bcc92924f162bdd30b9206002c69fcaff5e650142d092799" exitCode=0 Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.918662 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551038-49mwf" event={"ID":"75404ce0-2053-477b-8324-d967c2dff0e9","Type":"ContainerDied","Data":"a6f50c1b5e999531bcc92924f162bdd30b9206002c69fcaff5e650142d092799"} Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.920270 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" event={"ID":"5ea4f771-5b0c-410d-8a6c-a45b039edb6a","Type":"ContainerStarted","Data":"89c88dec36d85fd8ea4521a09485264b0c93eaff1e1434de1f8c57fe52542387"} Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.922678 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj" event={"ID":"57b972b8-b38f-4bc5-8cb5-cb2d949ff3b8","Type":"ContainerStarted","Data":"d44e53d912b1be7a1ab2052112e2a8416b8bc7051d623412a3999cfed99d14ad"} Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.923234 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj" Mar 09 13:18:04 crc kubenswrapper[4723]: I0309 13:18:04.965578 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj" podStartSLOduration=4.870564969 podStartE2EDuration="33.965561043s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:34.271705604 +0000 UTC m=+1128.286173144" lastFinishedPulling="2026-03-09 
13:18:03.366701678 +0000 UTC m=+1157.381169218" observedRunningTime="2026-03-09 13:18:04.961244347 +0000 UTC m=+1158.975711887" watchObservedRunningTime="2026-03-09 13:18:04.965561043 +0000 UTC m=+1158.980028583" Mar 09 13:18:05 crc kubenswrapper[4723]: I0309 13:18:05.070584 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r"] Mar 09 13:18:05 crc kubenswrapper[4723]: I0309 13:18:05.934989 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" event={"ID":"b9b75469-0c5d-47b4-b75c-28cdf8316167","Type":"ContainerStarted","Data":"92a1c172f2e22c3df2504610f1472dd7f767995a8e370a1556dbd855477eabcf"} Mar 09 13:18:05 crc kubenswrapper[4723]: I0309 13:18:05.937409 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" event={"ID":"b9b75469-0c5d-47b4-b75c-28cdf8316167","Type":"ContainerStarted","Data":"f9679d143c395c380d78edc5e2852d702eead154c5bcf045cb0c293cb47b34ed"} Mar 09 13:18:05 crc kubenswrapper[4723]: I0309 13:18:05.937439 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7" event={"ID":"1e62b006-449e-440b-b425-d56fbb171cd5","Type":"ContainerStarted","Data":"7d6d2b7dd0b30b5d76819b770b5806aaaa5364356829b126efdb17abebb86c61"} Mar 09 13:18:05 crc kubenswrapper[4723]: I0309 13:18:05.937473 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:18:05 crc kubenswrapper[4723]: I0309 13:18:05.937817 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7" Mar 09 13:18:05 crc kubenswrapper[4723]: I0309 13:18:05.939500 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" event={"ID":"c3f17509-7e0b-452d-b3ca-0a3210159f17","Type":"ContainerStarted","Data":"18bfd8f141a8403085ba0916b34e8ffda1a90838f83125d6851e17822f398f5a"} Mar 09 13:18:05 crc kubenswrapper[4723]: I0309 13:18:05.969839 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" podStartSLOduration=33.969816886 podStartE2EDuration="33.969816886s" podCreationTimestamp="2026-03-09 13:17:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:18:05.96326293 +0000 UTC m=+1159.977730470" watchObservedRunningTime="2026-03-09 13:18:05.969816886 +0000 UTC m=+1159.984284426" Mar 09 13:18:05 crc kubenswrapper[4723]: I0309 13:18:05.982458 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7" podStartSLOduration=3.950189003 podStartE2EDuration="34.982443044s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:34.248493792 +0000 UTC m=+1128.262961332" lastFinishedPulling="2026-03-09 13:18:05.280747833 +0000 UTC m=+1159.295215373" observedRunningTime="2026-03-09 13:18:05.981210211 +0000 UTC m=+1159.995677751" watchObservedRunningTime="2026-03-09 13:18:05.982443044 +0000 UTC m=+1159.996910584" Mar 09 13:18:06 crc kubenswrapper[4723]: I0309 
13:18:06.346580 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551038-49mwf" Mar 09 13:18:06 crc kubenswrapper[4723]: I0309 13:18:06.417199 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfwn7\" (UniqueName: \"kubernetes.io/projected/75404ce0-2053-477b-8324-d967c2dff0e9-kube-api-access-cfwn7\") pod \"75404ce0-2053-477b-8324-d967c2dff0e9\" (UID: \"75404ce0-2053-477b-8324-d967c2dff0e9\") " Mar 09 13:18:06 crc kubenswrapper[4723]: I0309 13:18:06.425646 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75404ce0-2053-477b-8324-d967c2dff0e9-kube-api-access-cfwn7" (OuterVolumeSpecName: "kube-api-access-cfwn7") pod "75404ce0-2053-477b-8324-d967c2dff0e9" (UID: "75404ce0-2053-477b-8324-d967c2dff0e9"). InnerVolumeSpecName "kube-api-access-cfwn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:18:06 crc kubenswrapper[4723]: I0309 13:18:06.519729 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfwn7\" (UniqueName: \"kubernetes.io/projected/75404ce0-2053-477b-8324-d967c2dff0e9-kube-api-access-cfwn7\") on node \"crc\" DevicePath \"\"" Mar 09 13:18:06 crc kubenswrapper[4723]: I0309 13:18:06.952781 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw" event={"ID":"02c2f97c-15b6-4c33-8be5-c61cc982e989","Type":"ContainerStarted","Data":"752dac870857085e840b069c3f4c98b59b8b7861cf546c1fecdf29185cf0bd3e"} Mar 09 13:18:06 crc kubenswrapper[4723]: I0309 13:18:06.953929 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw" Mar 09 13:18:06 crc kubenswrapper[4723]: I0309 13:18:06.955934 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551038-49mwf" event={"ID":"75404ce0-2053-477b-8324-d967c2dff0e9","Type":"ContainerDied","Data":"a175cc5b5883541a89ed9375ebb9dd7977e059161420482673e77a09754bcd7a"} Mar 09 13:18:06 crc kubenswrapper[4723]: I0309 13:18:06.955976 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a175cc5b5883541a89ed9375ebb9dd7977e059161420482673e77a09754bcd7a" Mar 09 13:18:06 crc kubenswrapper[4723]: I0309 13:18:06.956030 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551038-49mwf" Mar 09 13:18:06 crc kubenswrapper[4723]: I0309 13:18:06.984624 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw" podStartSLOduration=3.355788885 podStartE2EDuration="35.984603381s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:33.737801856 +0000 UTC m=+1127.752269396" lastFinishedPulling="2026-03-09 13:18:06.366616352 +0000 UTC m=+1160.381083892" observedRunningTime="2026-03-09 13:18:06.978140068 +0000 UTC m=+1160.992607628" watchObservedRunningTime="2026-03-09 13:18:06.984603381 +0000 UTC m=+1160.999070911" Mar 09 13:18:07 crc kubenswrapper[4723]: I0309 13:18:07.403949 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551032-l8bn7"] Mar 09 13:18:07 crc kubenswrapper[4723]: I0309 13:18:07.410338 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551032-l8bn7"] Mar 09 13:18:08 crc kubenswrapper[4723]: I0309 13:18:08.896568 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfc2f049-83e8-4cbe-a04e-d02db1274094" path="/var/lib/kubelet/pods/bfc2f049-83e8-4cbe-a04e-d02db1274094/volumes" Mar 09 13:18:08 crc kubenswrapper[4723]: I0309 13:18:08.973485 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2" event={"ID":"36e23b55-a129-4c5f-8938-26f58742541b","Type":"ContainerStarted","Data":"919a7440d19dba3cd52fdbd2c4c67e8a154bb3fd8128c3d6298cc080d027b5f5"} Mar 09 13:18:08 crc kubenswrapper[4723]: I0309 13:18:08.973710 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2" Mar 09 13:18:08 crc kubenswrapper[4723]: I0309 13:18:08.975586 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" event={"ID":"c3f17509-7e0b-452d-b3ca-0a3210159f17","Type":"ContainerStarted","Data":"08eb01411981627401918c04bf7d6a2e7869e6d0123dac7e359d019753dc1d3b"} Mar 09 13:18:08 crc kubenswrapper[4723]: I0309 13:18:08.975699 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:18:08 crc kubenswrapper[4723]: I0309 13:18:08.977256 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8" event={"ID":"a8a23c57-bff5-4820-955c-441521c1e8f2","Type":"ContainerStarted","Data":"735442de47778db60c1d73a2787fa181a04a8d03b9de68703ca81877ed1bc060"} Mar 09 13:18:08 crc kubenswrapper[4723]: I0309 13:18:08.977440 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8" Mar 09 13:18:08 crc kubenswrapper[4723]: I0309 13:18:08.978885 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" event={"ID":"5ea4f771-5b0c-410d-8a6c-a45b039edb6a","Type":"ContainerStarted","Data":"a807180fab717513527cded7df15da1b04bbe163ea03264e69d8a23ffe5c7760"} Mar 09 13:18:08 crc kubenswrapper[4723]: I0309 13:18:08.979003 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:18:09 crc kubenswrapper[4723]: I0309 13:18:09.004603 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2" podStartSLOduration=4.041412254 podStartE2EDuration="38.004572382s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:34.184427066 +0000 UTC m=+1128.198894606" lastFinishedPulling="2026-03-09 13:18:08.147587194 +0000 UTC m=+1162.162054734" observedRunningTime="2026-03-09 13:18:09.000775461 +0000 UTC m=+1163.015243021" watchObservedRunningTime="2026-03-09 13:18:09.004572382 +0000 UTC m=+1163.019039922" Mar 09 13:18:09 crc kubenswrapper[4723]: I0309 13:18:09.037429 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" podStartSLOduration=34.752608628 podStartE2EDuration="38.037413142s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:18:04.858699041 +0000 UTC m=+1158.873166581" lastFinishedPulling="2026-03-09 13:18:08.143503545 +0000 UTC m=+1162.157971095" observedRunningTime="2026-03-09 13:18:09.035954683 +0000 UTC m=+1163.050422243" watchObservedRunningTime="2026-03-09 13:18:09.037413142 +0000 UTC m=+1163.051880682" Mar 09 13:18:09 crc kubenswrapper[4723]: I0309 13:18:09.069916 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" podStartSLOduration=34.831792439 podStartE2EDuration="38.069897842s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:18:04.908475844 +0000 UTC m=+1158.922943384" lastFinishedPulling="2026-03-09 13:18:08.146581237 +0000 UTC m=+1162.161048787" observedRunningTime="2026-03-09 13:18:09.066577363 +0000 UTC m=+1163.081044913" watchObservedRunningTime="2026-03-09 13:18:09.069897842 +0000 UTC m=+1163.084365402" Mar 09 13:18:09 crc kubenswrapper[4723]: I0309 13:18:09.091309 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8" podStartSLOduration=3.907278934 podStartE2EDuration="38.091294035s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:34.229157194 +0000 UTC m=+1128.243624734" lastFinishedPulling="2026-03-09 13:18:08.413172295 +0000 UTC m=+1162.427639835" observedRunningTime="2026-03-09 13:18:09.085434038 +0000 UTC m=+1163.099901578" watchObservedRunningTime="2026-03-09 13:18:09.091294035 +0000 UTC m=+1163.105761575" Mar 09 13:18:11 crc kubenswrapper[4723]: I0309 13:18:11.530539 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg" Mar 09 13:18:11 crc kubenswrapper[4723]: I0309 13:18:11.563213 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb" Mar 09 13:18:11 crc kubenswrapper[4723]: I0309 13:18:11.583474 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx" Mar 09 13:18:11 crc kubenswrapper[4723]: I0309 13:18:11.665005 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" Mar 09 13:18:11 crc kubenswrapper[4723]: I0309 13:18:11.763886 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb" Mar 09 13:18:11 crc kubenswrapper[4723]: I0309 13:18:11.807374 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5wrr9" Mar 09 13:18:12 crc kubenswrapper[4723]: I0309 13:18:12.003671 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" event={"ID":"8554b7c9-0bd7-4326-b906-fe07dcdce9da","Type":"ContainerStarted","Data":"c6089ed847db5857cbac0ea4abca21834bfa909f6b3c484343f4335a37cdbb28"} Mar 09 13:18:12 crc kubenswrapper[4723]: I0309 13:18:12.005067 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" Mar 09 13:18:12 crc kubenswrapper[4723]: I0309 13:18:12.015499 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wrqqw" Mar 09 13:18:12 crc kubenswrapper[4723]: I0309 13:18:12.025644 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" podStartSLOduration=3.7898278850000002 podStartE2EDuration="41.025624915s" podCreationTimestamp="2026-03-09 13:17:31 +0000 UTC" firstStartedPulling="2026-03-09 13:17:34.239699286 +0000 UTC m=+1128.254166826" lastFinishedPulling="2026-03-09 13:18:11.475496306 +0000 UTC m=+1165.489963856" observedRunningTime="2026-03-09 13:18:12.018290829 +0000 UTC m=+1166.032758379" watchObservedRunningTime="2026-03-09 13:18:12.025624915 +0000 UTC m=+1166.040092455" Mar 09 13:18:12 crc kubenswrapper[4723]: I0309 13:18:12.037626 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-96b5g" Mar 09 13:18:12 crc kubenswrapper[4723]: I0309 13:18:12.079317 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-czjxc" Mar 09 13:18:12 crc kubenswrapper[4723]: I0309 13:18:12.106943 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2" Mar 09 13:18:12 crc kubenswrapper[4723]: I0309 13:18:12.134731 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" Mar 09 13:18:12 crc kubenswrapper[4723]: I0309 13:18:12.154509 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl" Mar 09 13:18:12 crc kubenswrapper[4723]: I0309 13:18:12.237373 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7" Mar 09 13:18:12 crc kubenswrapper[4723]: I0309 13:18:12.269715 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" Mar 09 13:18:12 crc kubenswrapper[4723]: I0309 13:18:12.320654 4723 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj" Mar 09 13:18:12 crc kubenswrapper[4723]: I0309 13:18:12.693393 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" Mar 09 13:18:13 crc kubenswrapper[4723]: I0309 13:18:13.013955 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k56p4" event={"ID":"d748e45b-6515-4e15-a776-61bbe83179c0","Type":"ContainerStarted","Data":"27a73fb79529a729193a7e8106f82ce46590981246e6397e3e9119ae8e6a78c8"} Mar 09 13:18:13 crc kubenswrapper[4723]: I0309 13:18:13.032571 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k56p4" podStartSLOduration=3.099376065 podStartE2EDuration="41.032543038s" podCreationTimestamp="2026-03-09 13:17:32 +0000 UTC" firstStartedPulling="2026-03-09 13:17:34.482015055 +0000 UTC m=+1128.496482595" lastFinishedPulling="2026-03-09 13:18:12.415182038 +0000 UTC m=+1166.429649568" observedRunningTime="2026-03-09 13:18:13.027854683 +0000 UTC m=+1167.042322233" watchObservedRunningTime="2026-03-09 13:18:13.032543038 +0000 UTC m=+1167.047010578" Mar 09 13:18:13 crc kubenswrapper[4723]: I0309 13:18:13.642554 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" Mar 09 13:18:14 crc kubenswrapper[4723]: I0309 13:18:14.106885 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" Mar 09 13:18:14 crc kubenswrapper[4723]: I0309 13:18:14.582920 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" Mar 09 13:18:22 crc kubenswrapper[4723]: I0309 13:18:22.197953 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8" Mar 09 13:18:22 crc kubenswrapper[4723]: I0309 13:18:22.333647 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2" Mar 09 13:18:22 crc kubenswrapper[4723]: I0309 13:18:22.668586 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.305016 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gwv9x"] Mar 09 13:18:44 crc kubenswrapper[4723]: E0309 13:18:44.305896 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75404ce0-2053-477b-8324-d967c2dff0e9" containerName="oc" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.305910 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="75404ce0-2053-477b-8324-d967c2dff0e9" containerName="oc" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.306096 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="75404ce0-2053-477b-8324-d967c2dff0e9" containerName="oc" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.307125 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-gwv9x" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.309244 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-c4gbn" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.309357 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.309409 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.309645 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.331767 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gwv9x"] Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.395654 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fclpq"] Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.399021 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.400731 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.404732 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55420bf1-44a4-4313-8570-cf2d1f9784ba-config\") pod \"dnsmasq-dns-675f4bcbfc-gwv9x\" (UID: \"55420bf1-44a4-4313-8570-cf2d1f9784ba\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gwv9x" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.404790 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrtvp\" (UniqueName: \"kubernetes.io/projected/55420bf1-44a4-4313-8570-cf2d1f9784ba-kube-api-access-qrtvp\") pod \"dnsmasq-dns-675f4bcbfc-gwv9x\" (UID: \"55420bf1-44a4-4313-8570-cf2d1f9784ba\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gwv9x" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.416873 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fclpq"] Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.506824 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt46l\" (UniqueName: \"kubernetes.io/projected/4ab9a1db-b005-4830-8421-6caf5a04048b-kube-api-access-vt46l\") pod \"dnsmasq-dns-78dd6ddcc-fclpq\" (UID: \"4ab9a1db-b005-4830-8421-6caf5a04048b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.506959 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55420bf1-44a4-4313-8570-cf2d1f9784ba-config\") pod \"dnsmasq-dns-675f4bcbfc-gwv9x\" (UID: \"55420bf1-44a4-4313-8570-cf2d1f9784ba\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gwv9x" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.506987 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrtvp\" (UniqueName: \"kubernetes.io/projected/55420bf1-44a4-4313-8570-cf2d1f9784ba-kube-api-access-qrtvp\") pod \"dnsmasq-dns-675f4bcbfc-gwv9x\" (UID: \"55420bf1-44a4-4313-8570-cf2d1f9784ba\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-gwv9x" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.507030 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ab9a1db-b005-4830-8421-6caf5a04048b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fclpq\" (UID: \"4ab9a1db-b005-4830-8421-6caf5a04048b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.507070 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab9a1db-b005-4830-8421-6caf5a04048b-config\") pod \"dnsmasq-dns-78dd6ddcc-fclpq\" (UID: \"4ab9a1db-b005-4830-8421-6caf5a04048b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.508005 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55420bf1-44a4-4313-8570-cf2d1f9784ba-config\") pod \"dnsmasq-dns-675f4bcbfc-gwv9x\" (UID: \"55420bf1-44a4-4313-8570-cf2d1f9784ba\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gwv9x" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.533002 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrtvp\" (UniqueName: \"kubernetes.io/projected/55420bf1-44a4-4313-8570-cf2d1f9784ba-kube-api-access-qrtvp\") pod \"dnsmasq-dns-675f4bcbfc-gwv9x\" (UID: \"55420bf1-44a4-4313-8570-cf2d1f9784ba\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gwv9x" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.608547 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt46l\" (UniqueName: \"kubernetes.io/projected/4ab9a1db-b005-4830-8421-6caf5a04048b-kube-api-access-vt46l\") pod \"dnsmasq-dns-78dd6ddcc-fclpq\" (UID: \"4ab9a1db-b005-4830-8421-6caf5a04048b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.608655 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ab9a1db-b005-4830-8421-6caf5a04048b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fclpq\" (UID: \"4ab9a1db-b005-4830-8421-6caf5a04048b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.608715 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab9a1db-b005-4830-8421-6caf5a04048b-config\") pod \"dnsmasq-dns-78dd6ddcc-fclpq\" (UID: \"4ab9a1db-b005-4830-8421-6caf5a04048b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.610126 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab9a1db-b005-4830-8421-6caf5a04048b-config\") pod \"dnsmasq-dns-78dd6ddcc-fclpq\" (UID: \"4ab9a1db-b005-4830-8421-6caf5a04048b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.610149 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ab9a1db-b005-4830-8421-6caf5a04048b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fclpq\" (UID: \"4ab9a1db-b005-4830-8421-6caf5a04048b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.631265 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt46l\" (UniqueName: \"kubernetes.io/projected/4ab9a1db-b005-4830-8421-6caf5a04048b-kube-api-access-vt46l\") pod \"dnsmasq-dns-78dd6ddcc-fclpq\" (UID: \"4ab9a1db-b005-4830-8421-6caf5a04048b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.638032 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-gwv9x" Mar 09 13:18:44 crc kubenswrapper[4723]: I0309 13:18:44.716010 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq" Mar 09 13:18:45 crc kubenswrapper[4723]: I0309 13:18:45.140584 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gwv9x"] Mar 09 13:18:45 crc kubenswrapper[4723]: I0309 13:18:45.148615 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:18:45 crc kubenswrapper[4723]: I0309 13:18:45.281299 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fclpq"] Mar 09 13:18:45 crc kubenswrapper[4723]: W0309 13:18:45.284268 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ab9a1db_b005_4830_8421_6caf5a04048b.slice/crio-e86dfc291134ec8211b01e91fede2a62846b33ef5303cb4dee7ea541e66520f8 WatchSource:0}: Error finding container e86dfc291134ec8211b01e91fede2a62846b33ef5303cb4dee7ea541e66520f8: Status 404 returned error can't find the container with id e86dfc291134ec8211b01e91fede2a62846b33ef5303cb4dee7ea541e66520f8 Mar 09 13:18:45 crc kubenswrapper[4723]: I0309 13:18:45.317201 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-gwv9x" event={"ID":"55420bf1-44a4-4313-8570-cf2d1f9784ba","Type":"ContainerStarted","Data":"e54e5e14366e01fe98d15117dcebec88b14dd7d429e9fdc75f6bcb5686e61a93"} Mar 09 13:18:45 crc kubenswrapper[4723]: I0309 13:18:45.319346 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq" event={"ID":"4ab9a1db-b005-4830-8421-6caf5a04048b","Type":"ContainerStarted","Data":"e86dfc291134ec8211b01e91fede2a62846b33ef5303cb4dee7ea541e66520f8"} Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.139922 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gwv9x"] Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.174790 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-sf8lq"] Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.176237 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.187149 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-sf8lq"] Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.259172 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-sf8lq\" (UID: \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.259247 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-config\") pod \"dnsmasq-dns-5ccc8479f9-sf8lq\" (UID: \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.259277 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nmj8\" (UniqueName: \"kubernetes.io/projected/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-kube-api-access-9nmj8\") pod \"dnsmasq-dns-5ccc8479f9-sf8lq\" (UID: \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.361486 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-config\") pod \"dnsmasq-dns-5ccc8479f9-sf8lq\" (UID: \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.361533 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nmj8\" (UniqueName: \"kubernetes.io/projected/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-kube-api-access-9nmj8\") pod \"dnsmasq-dns-5ccc8479f9-sf8lq\" (UID: \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.361878 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-sf8lq\" (UID: \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.362713 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-sf8lq\" (UID: \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.362766 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-config\") pod \"dnsmasq-dns-5ccc8479f9-sf8lq\" (UID: \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.388079 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nmj8\" (UniqueName: 
\"kubernetes.io/projected/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-kube-api-access-9nmj8\") pod \"dnsmasq-dns-5ccc8479f9-sf8lq\" (UID: \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.501627 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.605050 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fclpq"] Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.692828 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg872"] Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.708074 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gg872" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.739389 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg872"] Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.772564 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43506b49-01ab-4ea9-bcd6-4ce950b7815f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gg872\" (UID: \"43506b49-01ab-4ea9-bcd6-4ce950b7815f\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg872" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.772615 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43506b49-01ab-4ea9-bcd6-4ce950b7815f-config\") pod \"dnsmasq-dns-57d769cc4f-gg872\" (UID: \"43506b49-01ab-4ea9-bcd6-4ce950b7815f\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg872" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.772637 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l9dk\" (UniqueName: \"kubernetes.io/projected/43506b49-01ab-4ea9-bcd6-4ce950b7815f-kube-api-access-9l9dk\") pod \"dnsmasq-dns-57d769cc4f-gg872\" (UID: \"43506b49-01ab-4ea9-bcd6-4ce950b7815f\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg872" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.847339 4723 scope.go:117] "RemoveContainer" containerID="50e29fcc16f3b2ad7526b05e9045d0c5bf3140d8f910d9d261087cf836a20464" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.874768 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43506b49-01ab-4ea9-bcd6-4ce950b7815f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gg872\" (UID: \"43506b49-01ab-4ea9-bcd6-4ce950b7815f\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg872" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.874830 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43506b49-01ab-4ea9-bcd6-4ce950b7815f-config\") pod \"dnsmasq-dns-57d769cc4f-gg872\" (UID: \"43506b49-01ab-4ea9-bcd6-4ce950b7815f\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg872" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.874847 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l9dk\" (UniqueName: \"kubernetes.io/projected/43506b49-01ab-4ea9-bcd6-4ce950b7815f-kube-api-access-9l9dk\") pod \"dnsmasq-dns-57d769cc4f-gg872\" (UID: 
\"43506b49-01ab-4ea9-bcd6-4ce950b7815f\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg872" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.877468 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43506b49-01ab-4ea9-bcd6-4ce950b7815f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gg872\" (UID: \"43506b49-01ab-4ea9-bcd6-4ce950b7815f\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg872" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.877774 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43506b49-01ab-4ea9-bcd6-4ce950b7815f-config\") pod \"dnsmasq-dns-57d769cc4f-gg872\" (UID: \"43506b49-01ab-4ea9-bcd6-4ce950b7815f\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg872" Mar 09 13:18:47 crc kubenswrapper[4723]: I0309 13:18:47.909786 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l9dk\" (UniqueName: \"kubernetes.io/projected/43506b49-01ab-4ea9-bcd6-4ce950b7815f-kube-api-access-9l9dk\") pod \"dnsmasq-dns-57d769cc4f-gg872\" (UID: \"43506b49-01ab-4ea9-bcd6-4ce950b7815f\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg872" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.047438 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gg872" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.279352 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-sf8lq"] Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.326419 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.328813 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.332363 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.336954 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.337045 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d6ws7" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.336957 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.336954 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.337262 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.337322 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.366119 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.374513 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" event={"ID":"85ff66cb-dbf2-4501-bb5d-7f1142896dc3","Type":"ContainerStarted","Data":"7644429a190da8943e24b0bc770e8f08f4a94fbddf495a022389669472679608"} Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.382501 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.382579 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.382615 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.382640 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.382662 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.382683 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.382702 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srpvs\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-kube-api-access-srpvs\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.382737 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.382754 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.382768 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.382803 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.486102 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.486185 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.486222 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.486246 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.486265 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.486284 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.486306 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srpvs\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-kube-api-access-srpvs\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.486340 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.486358 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.486561 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.486596 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.486837 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.487594 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.488002 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.488022 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.488143 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.492598 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.493009 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.493080 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9904cf58384ea95f97211cb12cec7ec77900b8743fdc44d4dd98a8f5e64d4499/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.497513 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.497662 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.504804 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.508953 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srpvs\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-kube-api-access-srpvs\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.563008 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.610708 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg872"]
Mar 09 13:18:48 crc kubenswrapper[4723]: W0309 13:18:48.615851 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43506b49_01ab_4ea9_bcd6_4ce950b7815f.slice/crio-3c7efa8a971d3ff09c3bb0908930a621d4d600f2661cb7100bd5ae11095a24a5 WatchSource:0}: Error finding container 3c7efa8a971d3ff09c3bb0908930a621d4d600f2661cb7100bd5ae11095a24a5: Status 404 returned error can't find the container with id 3c7efa8a971d3ff09c3bb0908930a621d4d600f2661cb7100bd5ae11095a24a5
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.675584 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.780557 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.783253 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.785340 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.793965 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.794136 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.794277 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.794283 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.794578 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-sp6vr"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.794719 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.804312 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.830778 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"]
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.832234 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2"
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.867941 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"]
Mar 09 13:18:48 crc kubenswrapper[4723]: I0309 13:18:48.869999 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.009911 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.009988 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010020 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/daa528e2-bcd7-43a8-bfea-a0911b3020c5-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010053 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-server-conf\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010109 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010139 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22vct\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-kube-api-access-22vct\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010179 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010224 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010255 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19f91c12-b482-46ab-a6e1-20164abe2ee4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010289 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010327 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010347 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4n48\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-kube-api-access-r4n48\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010365 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/daa528e2-bcd7-43a8-bfea-a0911b3020c5-pod-info\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010410 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010460 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19f91c12-b482-46ab-a6e1-20164abe2ee4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010494 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010526 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-config-data\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010557 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.010588 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.015978 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"]
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.016058 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"]
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.016076 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-config-data\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.016176 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tjrf\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-kube-api-access-6tjrf\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.016402 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.016535 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.016627 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/54210e7b-b34d-411d-93e1-e8cc3448c4b0-pod-info\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.016682 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-config-data\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.016758 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.049385 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.049466 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.049490 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.049512 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.049600 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-server-conf\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.049624 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.049640 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54210e7b-b34d-411d-93e1-e8cc3448c4b0-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.152899 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.152963 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19f91c12-b482-46ab-a6e1-20164abe2ee4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.152990 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.153014 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.153033 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4n48\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-kube-api-access-r4n48\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.153051 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/daa528e2-bcd7-43a8-bfea-a0911b3020c5-pod-info\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.153080 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.153120 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19f91c12-b482-46ab-a6e1-20164abe2ee4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.153157 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.153190 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-config-data\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.153225 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154142 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154187 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-config-data\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154210 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tjrf\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-kube-api-access-6tjrf\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154245 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154278 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154311 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/54210e7b-b34d-411d-93e1-e8cc3448c4b0-pod-info\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154345 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-config-data\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154383 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154449 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154469 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-config-data\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154474 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154517 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154536 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154554 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-server-conf\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154570 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154584 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54210e7b-b34d-411d-93e1-e8cc3448c4b0-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154602 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154638 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154652 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/daa528e2-bcd7-43a8-bfea-a0911b3020c5-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154674 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-server-conf\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154697 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154722 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22vct\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-kube-api-access-22vct\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154746 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154977 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-config-data\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.155352 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.155597 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.155801 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.156660 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-server-conf\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.156910 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.157382 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.154238 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.157657 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.159286 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.161241 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.161383 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.161889 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.162297 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.163168 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-server-conf\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.163669 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.163696 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b95d2f7d3a9f884abc2cc747032dbde014948f5683a3dc24108da2959777a268/globalmount\"" pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.164116 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.164122 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.164170 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7ed97ef2ccf927dc4b78d1b5ddfee572a0deac37d7e58eca401e08357de5559e/globalmount\"" pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.164332 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/daa528e2-bcd7-43a8-bfea-a0911b3020c5-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.167619 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-config-data\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.170447 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.170563 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.170606 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1c8821f4c993ac83904f8093fe48c4871280d085aeb9e957927750fc4aaf55ee/globalmount\"" pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.170737 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19f91c12-b482-46ab-a6e1-20164abe2ee4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.173155 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/54210e7b-b34d-411d-93e1-e8cc3448c4b0-pod-info\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.173221 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19f91c12-b482-46ab-a6e1-20164abe2ee4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.173632 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/daa528e2-bcd7-43a8-bfea-a0911b3020c5-pod-info\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.173723 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.178793 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54210e7b-b34d-411d-93e1-e8cc3448c4b0-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.178996 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.181446 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tjrf\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-kube-api-access-6tjrf\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.183368 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4n48\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-kube-api-access-r4n48\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.191814 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22vct\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-kube-api-access-22vct\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.233644 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\") pod \"rabbitmq-server-2\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.247568 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\") pod \"rabbitmq-server-1\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.258313 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.283012 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\") pod \"rabbitmq-server-0\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.305993 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.306286 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1"
Mar 09 13:18:49 crc kubenswrapper[4723]: W0309 13:18:49.314830 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a39acc6_3d02_4b5a_957f_eb4e3d578aeb.slice/crio-2eebd4ad662e6e17733e7e49525e73dcd42ab261dbfdc490140f01c21f8e4ec9 WatchSource:0}: Error finding container 2eebd4ad662e6e17733e7e49525e73dcd42ab261dbfdc490140f01c21f8e4ec9: Status 404 returned error can't find the container with id 2eebd4ad662e6e17733e7e49525e73dcd42ab261dbfdc490140f01c21f8e4ec9
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.421362 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb","Type":"ContainerStarted","Data":"2eebd4ad662e6e17733e7e49525e73dcd42ab261dbfdc490140f01c21f8e4ec9"}
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.425146 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gg872" event={"ID":"43506b49-01ab-4ea9-bcd6-4ce950b7815f","Type":"ContainerStarted","Data":"3c7efa8a971d3ff09c3bb0908930a621d4d600f2661cb7100bd5ae11095a24a5"}
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.436191 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.796678 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.804828 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.812064 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.812280 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.812417 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.813287 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-85gn8"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.815931 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.818523 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.870310 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.870699 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-config-data-default\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.870737 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.870786 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f50e6551-1897-4b14-97aa-8e2eb1ce6bcb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f50e6551-1897-4b14-97aa-8e2eb1ce6bcb\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.870814 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.870848 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.870908 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-kolla-config\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.871006 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjrxl\" (UniqueName: \"kubernetes.io/projected/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-kube-api-access-hjrxl\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.975212 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.975294 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f50e6551-1897-4b14-97aa-8e2eb1ce6bcb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f50e6551-1897-4b14-97aa-8e2eb1ce6bcb\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0"
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.975322 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0"
Mar 09 13:18:49 crc
kubenswrapper[4723]: I0309 13:18:49.975451 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0" Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.975489 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-kolla-config\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0" Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.975566 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjrxl\" (UniqueName: \"kubernetes.io/projected/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-kube-api-access-hjrxl\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0" Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.975602 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0" Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.975664 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-config-data-default\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0" Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.976733 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-config-data-default\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0" Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.977246 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-kolla-config\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0" Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.979616 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0" Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.980725 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0" Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.982481 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.982507 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f50e6551-1897-4b14-97aa-8e2eb1ce6bcb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f50e6551-1897-4b14-97aa-8e2eb1ce6bcb\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1eb22189d331ffaf191d2b5f485eee1e432b6a707c3ff134690467aa854aef6d/globalmount\"" pod="openstack/openstack-galera-0" Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.986325 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0" Mar 09 13:18:49 crc kubenswrapper[4723]: I0309 13:18:49.989970 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0" Mar 09 13:18:50 crc kubenswrapper[4723]: I0309 13:18:50.021622 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjrxl\" (UniqueName: \"kubernetes.io/projected/5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17-kube-api-access-hjrxl\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0" Mar 09 13:18:50 crc kubenswrapper[4723]: I0309 13:18:50.027152 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 09 13:18:50 crc kubenswrapper[4723]: I0309 13:18:50.055539 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f50e6551-1897-4b14-97aa-8e2eb1ce6bcb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f50e6551-1897-4b14-97aa-8e2eb1ce6bcb\") pod \"openstack-galera-0\" (UID: \"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17\") " pod="openstack/openstack-galera-0" Mar 09 13:18:50 crc kubenswrapper[4723]: I0309 13:18:50.147529 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 13:18:50 crc kubenswrapper[4723]: I0309 13:18:50.198197 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 13:18:50 crc kubenswrapper[4723]: I0309 13:18:50.280791 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:18:50 crc kubenswrapper[4723]: I0309 13:18:50.466235 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"54210e7b-b34d-411d-93e1-e8cc3448c4b0","Type":"ContainerStarted","Data":"0e2903e72b9b2d42ff4a0150e30f0bb0c4aa44b0fec566d81c30e6a55f3d5c35"} Mar 09 13:18:51 crc kubenswrapper[4723]: W0309 13:18:51.072635 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaa528e2_bcd7_43a8_bfea_a0911b3020c5.slice/crio-8abe0dab6a7f482a198575018a6b66071882a893761c80b9a06949d7ef573b27 WatchSource:0}: Error finding container 8abe0dab6a7f482a198575018a6b66071882a893761c80b9a06949d7ef573b27: Status 404 returned error can't find the container with id 8abe0dab6a7f482a198575018a6b66071882a893761c80b9a06949d7ef573b27 Mar 09 13:18:51 crc kubenswrapper[4723]: W0309 13:18:51.081250 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19f91c12_b482_46ab_a6e1_20164abe2ee4.slice/crio-a582220915ebe3312e61478c345cd6f7b9adf7abe770e979886329554ca913c7 WatchSource:0}: Error finding container a582220915ebe3312e61478c345cd6f7b9adf7abe770e979886329554ca913c7: Status 404 returned error can't find the container with id a582220915ebe3312e61478c345cd6f7b9adf7abe770e979886329554ca913c7 Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.198749 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.201475 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.216510 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.218734 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-6wwv2" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.219776 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.219897 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.220094 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.310002 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.310052 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.310121 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.310158 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czpn7\" (UniqueName: \"kubernetes.io/projected/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-kube-api-access-czpn7\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.310192 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-360e933e-6078-4c3a-b0cd-d945331c8397\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-360e933e-6078-4c3a-b0cd-d945331c8397\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.310225 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.310243 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.310269 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.413190 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.413305 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.413428 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.413480 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czpn7\" (UniqueName: \"kubernetes.io/projected/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-kube-api-access-czpn7\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.413524 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-360e933e-6078-4c3a-b0cd-d945331c8397\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-360e933e-6078-4c3a-b0cd-d945331c8397\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.413565 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.413589 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.413619 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.414350 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.415328 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.415342 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.416431 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.427361 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.428274 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.430334 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.430405 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-360e933e-6078-4c3a-b0cd-d945331c8397\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-360e933e-6078-4c3a-b0cd-d945331c8397\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8ee4263235854238d7ec3af01f06b335fe70056c163aa8c707ce9c73fc8810cc/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.435638 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czpn7\" (UniqueName: \"kubernetes.io/projected/f01dc50c-55d6-4f99-92f8-d3adfcf8d71b-kube-api-access-czpn7\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.510818 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"daa528e2-bcd7-43a8-bfea-a0911b3020c5","Type":"ContainerStarted","Data":"8abe0dab6a7f482a198575018a6b66071882a893761c80b9a06949d7ef573b27"} Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.513885 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19f91c12-b482-46ab-a6e1-20164abe2ee4","Type":"ContainerStarted","Data":"a582220915ebe3312e61478c345cd6f7b9adf7abe770e979886329554ca913c7"} Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.516212 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-360e933e-6078-4c3a-b0cd-d945331c8397\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-360e933e-6078-4c3a-b0cd-d945331c8397\") pod \"openstack-cell1-galera-0\" (UID: \"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.549400 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.560399 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.575265 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.575364 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.579469 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-bnd8n" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.579783 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.581603 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.616405 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01063ef-9ba6-4f2b-8298-46acf5a50e80-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.616445 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b01063ef-9ba6-4f2b-8298-46acf5a50e80-kolla-config\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.616497 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54gq8\" (UniqueName: \"kubernetes.io/projected/b01063ef-9ba6-4f2b-8298-46acf5a50e80-kube-api-access-54gq8\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.616558 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b01063ef-9ba6-4f2b-8298-46acf5a50e80-config-data\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.616578 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01063ef-9ba6-4f2b-8298-46acf5a50e80-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.634058 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.720377 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01063ef-9ba6-4f2b-8298-46acf5a50e80-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.720449 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b01063ef-9ba6-4f2b-8298-46acf5a50e80-kolla-config\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.720530 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54gq8\" (UniqueName: 
\"kubernetes.io/projected/b01063ef-9ba6-4f2b-8298-46acf5a50e80-kube-api-access-54gq8\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.720618 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b01063ef-9ba6-4f2b-8298-46acf5a50e80-config-data\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.720649 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01063ef-9ba6-4f2b-8298-46acf5a50e80-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.721801 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b01063ef-9ba6-4f2b-8298-46acf5a50e80-kolla-config\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.722436 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b01063ef-9ba6-4f2b-8298-46acf5a50e80-config-data\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.735756 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01063ef-9ba6-4f2b-8298-46acf5a50e80-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.737742 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54gq8\" (UniqueName: \"kubernetes.io/projected/b01063ef-9ba6-4f2b-8298-46acf5a50e80-kube-api-access-54gq8\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.750500 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01063ef-9ba6-4f2b-8298-46acf5a50e80-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b01063ef-9ba6-4f2b-8298-46acf5a50e80\") " pod="openstack/memcached-0" Mar 09 13:18:51 crc kubenswrapper[4723]: I0309 13:18:51.914166 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 09 13:18:52 crc kubenswrapper[4723]: I0309 13:18:52.538345 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17","Type":"ContainerStarted","Data":"229daa6173a7b9723113371e8324f72bb77864f1c72a0403ec28e56bbb09d68c"} Mar 09 13:18:54 crc kubenswrapper[4723]: I0309 13:18:54.249144 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:18:54 crc kubenswrapper[4723]: I0309 13:18:54.250928 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:18:54 crc kubenswrapper[4723]: I0309 13:18:54.253399 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-bc7kv" Mar 09 13:18:54 crc kubenswrapper[4723]: I0309 13:18:54.300388 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9wk7\" (UniqueName: \"kubernetes.io/projected/85a79034-ec88-4dfb-9714-0630d9637c3b-kube-api-access-w9wk7\") pod \"kube-state-metrics-0\" (UID: \"85a79034-ec88-4dfb-9714-0630d9637c3b\") " pod="openstack/kube-state-metrics-0" Mar 09 13:18:54 crc kubenswrapper[4723]: I0309 13:18:54.317174 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:18:54 crc kubenswrapper[4723]: I0309 13:18:54.404954 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9wk7\" (UniqueName: \"kubernetes.io/projected/85a79034-ec88-4dfb-9714-0630d9637c3b-kube-api-access-w9wk7\") pod \"kube-state-metrics-0\" (UID: \"85a79034-ec88-4dfb-9714-0630d9637c3b\") " pod="openstack/kube-state-metrics-0" Mar 09 13:18:54 crc kubenswrapper[4723]: I0309 13:18:54.431763 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9wk7\" (UniqueName: \"kubernetes.io/projected/85a79034-ec88-4dfb-9714-0630d9637c3b-kube-api-access-w9wk7\") pod \"kube-state-metrics-0\" (UID: \"85a79034-ec88-4dfb-9714-0630d9637c3b\") " pod="openstack/kube-state-metrics-0" Mar 09 13:18:54 crc kubenswrapper[4723]: I0309 13:18:54.585712 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.272840 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7"] Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.276268 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.280233 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.281059 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-tt7mr" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.289771 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7"] Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.329480 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f19d74e0-826f-47c6-80bc-d82478a56657-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-fwcs7\" (UID: \"f19d74e0-826f-47c6-80bc-d82478a56657\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.329559 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2jq2\" (UniqueName: \"kubernetes.io/projected/f19d74e0-826f-47c6-80bc-d82478a56657-kube-api-access-l2jq2\") pod \"observability-ui-dashboards-66cbf594b5-fwcs7\" (UID: \"f19d74e0-826f-47c6-80bc-d82478a56657\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.437851 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2jq2\" (UniqueName: \"kubernetes.io/projected/f19d74e0-826f-47c6-80bc-d82478a56657-kube-api-access-l2jq2\") pod \"observability-ui-dashboards-66cbf594b5-fwcs7\" (UID: \"f19d74e0-826f-47c6-80bc-d82478a56657\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.438032 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f19d74e0-826f-47c6-80bc-d82478a56657-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-fwcs7\" (UID: \"f19d74e0-826f-47c6-80bc-d82478a56657\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7" Mar 09 13:18:55 crc kubenswrapper[4723]: E0309 13:18:55.438151 4723 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Mar 09 13:18:55 crc kubenswrapper[4723]: E0309 13:18:55.438199 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f19d74e0-826f-47c6-80bc-d82478a56657-serving-cert podName:f19d74e0-826f-47c6-80bc-d82478a56657 nodeName:}" failed. No retries permitted until 2026-03-09 13:18:55.93818278 +0000 UTC m=+1209.952650320 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f19d74e0-826f-47c6-80bc-d82478a56657-serving-cert") pod "observability-ui-dashboards-66cbf594b5-fwcs7" (UID: "f19d74e0-826f-47c6-80bc-d82478a56657") : secret "observability-ui-dashboards" not found Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.483736 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2jq2\" (UniqueName: \"kubernetes.io/projected/f19d74e0-826f-47c6-80bc-d82478a56657-kube-api-access-l2jq2\") pod \"observability-ui-dashboards-66cbf594b5-fwcs7\" (UID: \"f19d74e0-826f-47c6-80bc-d82478a56657\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.656433 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-85b7499c-sqsr9"] Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.657690 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.665717 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.668731 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.673754 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.673947 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.674050 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.674297 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-bkfhs" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.680278 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.680572 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.681046 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.692914 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85b7499c-sqsr9"] Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.701835 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.703846 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.741995 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6qcs\" (UniqueName: \"kubernetes.io/projected/731ffd33-861f-45a8-a54a-5a18dcca5ae6-kube-api-access-d6qcs\") pod \"console-85b7499c-sqsr9\" 
(UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.742045 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/731ffd33-861f-45a8-a54a-5a18dcca5ae6-trusted-ca-bundle\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.742089 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/731ffd33-861f-45a8-a54a-5a18dcca5ae6-oauth-serving-cert\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.742183 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/731ffd33-861f-45a8-a54a-5a18dcca5ae6-console-oauth-config\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.742308 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/731ffd33-861f-45a8-a54a-5a18dcca5ae6-service-ca\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.742343 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/731ffd33-861f-45a8-a54a-5a18dcca5ae6-console-config\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.742410 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/731ffd33-861f-45a8-a54a-5a18dcca5ae6-console-serving-cert\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.844069 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-config\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.844316 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.844350 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/731ffd33-861f-45a8-a54a-5a18dcca5ae6-service-ca\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.844371 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/731ffd33-861f-45a8-a54a-5a18dcca5ae6-console-config\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.844395 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.844435 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.844456 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/731ffd33-861f-45a8-a54a-5a18dcca5ae6-console-serving-cert\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.844494 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.844515 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6qcs\" (UniqueName: \"kubernetes.io/projected/731ffd33-861f-45a8-a54a-5a18dcca5ae6-kube-api-access-d6qcs\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.844539 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.844793 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/731ffd33-861f-45a8-a54a-5a18dcca5ae6-trusted-ca-bundle\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc 
kubenswrapper[4723]: I0309 13:18:55.845019 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.845104 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.845153 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/731ffd33-861f-45a8-a54a-5a18dcca5ae6-oauth-serving-cert\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.845416 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/731ffd33-861f-45a8-a54a-5a18dcca5ae6-console-config\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.845803 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/731ffd33-861f-45a8-a54a-5a18dcca5ae6-service-ca\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.846072 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/731ffd33-861f-45a8-a54a-5a18dcca5ae6-oauth-serving-cert\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.846322 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/731ffd33-861f-45a8-a54a-5a18dcca5ae6-trusted-ca-bundle\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.846364 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/731ffd33-861f-45a8-a54a-5a18dcca5ae6-console-oauth-config\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.846397 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcj5h\" (UniqueName: \"kubernetes.io/projected/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-kube-api-access-rcj5h\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 
13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.846421 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.849083 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/731ffd33-861f-45a8-a54a-5a18dcca5ae6-console-serving-cert\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.850304 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/731ffd33-861f-45a8-a54a-5a18dcca5ae6-console-oauth-config\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.875742 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6qcs\" (UniqueName: \"kubernetes.io/projected/731ffd33-861f-45a8-a54a-5a18dcca5ae6-kube-api-access-d6qcs\") pod \"console-85b7499c-sqsr9\" (UID: \"731ffd33-861f-45a8-a54a-5a18dcca5ae6\") " pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.948256 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-config\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.948305 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.948890 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.948952 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.949016 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: 
\"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.949042 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.949062 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.949093 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.949138 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f19d74e0-826f-47c6-80bc-d82478a56657-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-fwcs7\" (UID: \"f19d74e0-826f-47c6-80bc-d82478a56657\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.949174 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcj5h\" (UniqueName: \"kubernetes.io/projected/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-kube-api-access-rcj5h\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.949189 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.949214 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.950270 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.950343 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.970136 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.973609 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.976429 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-config\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.977359 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f19d74e0-826f-47c6-80bc-d82478a56657-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-fwcs7\" (UID: \"f19d74e0-826f-47c6-80bc-d82478a56657\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.984278 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.984323 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7a5bc2ca863004c00c102bc266d43328c7878cd25689db409330e8972fedad87/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.989716 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85b7499c-sqsr9" Mar 09 13:18:55 crc kubenswrapper[4723]: I0309 13:18:55.989755 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:56 crc kubenswrapper[4723]: I0309 13:18:56.013675 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:56 crc kubenswrapper[4723]: I0309 13:18:56.027508 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcj5h\" (UniqueName: \"kubernetes.io/projected/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-kube-api-access-rcj5h\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:56 crc kubenswrapper[4723]: I0309 13:18:56.064939 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") pod \"prometheus-metric-storage-0\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:56 crc kubenswrapper[4723]: I0309 13:18:56.203438 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7" Mar 09 13:18:56 crc kubenswrapper[4723]: I0309 13:18:56.332956 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.112922 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.247500 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.249244 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.263302 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.263458 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.263551 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.263658 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.263756 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-tvzbn" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.275492 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.388161 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a66d0873-8ac9-4410-92d9-aaf8efdaa527-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.388211 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a66d0873-8ac9-4410-92d9-aaf8efdaa527-config\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.388239 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a66d0873-8ac9-4410-92d9-aaf8efdaa527-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.388269 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsnbr\" (UniqueName: \"kubernetes.io/projected/a66d0873-8ac9-4410-92d9-aaf8efdaa527-kube-api-access-tsnbr\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.388335 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a66d0873-8ac9-4410-92d9-aaf8efdaa527-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.388370 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a66d0873-8ac9-4410-92d9-aaf8efdaa527-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.388398 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"pvc-ae51fea5-d2c7-415a-adcf-46c98bd34a45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae51fea5-d2c7-415a-adcf-46c98bd34a45\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.388422 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a66d0873-8ac9-4410-92d9-aaf8efdaa527-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.490563 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a66d0873-8ac9-4410-92d9-aaf8efdaa527-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.490616 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a66d0873-8ac9-4410-92d9-aaf8efdaa527-config\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.490648 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a66d0873-8ac9-4410-92d9-aaf8efdaa527-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.490684 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsnbr\" (UniqueName: \"kubernetes.io/projected/a66d0873-8ac9-4410-92d9-aaf8efdaa527-kube-api-access-tsnbr\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.490761 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a66d0873-8ac9-4410-92d9-aaf8efdaa527-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.490807 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a66d0873-8ac9-4410-92d9-aaf8efdaa527-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.490840 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ae51fea5-d2c7-415a-adcf-46c98bd34a45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae51fea5-d2c7-415a-adcf-46c98bd34a45\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.490889 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a66d0873-8ac9-4410-92d9-aaf8efdaa527-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: 
\"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.491397 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a66d0873-8ac9-4410-92d9-aaf8efdaa527-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.493382 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a66d0873-8ac9-4410-92d9-aaf8efdaa527-config\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.494826 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a66d0873-8ac9-4410-92d9-aaf8efdaa527-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.498690 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a66d0873-8ac9-4410-92d9-aaf8efdaa527-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.498690 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a66d0873-8ac9-4410-92d9-aaf8efdaa527-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.501926 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.501983 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ae51fea5-d2c7-415a-adcf-46c98bd34a45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae51fea5-d2c7-415a-adcf-46c98bd34a45\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c01b42074a28c7455d7db2453cee869567d3600312453f1023f1fec8b564b07e/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.510642 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a66d0873-8ac9-4410-92d9-aaf8efdaa527-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.514390 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsnbr\" (UniqueName: \"kubernetes.io/projected/a66d0873-8ac9-4410-92d9-aaf8efdaa527-kube-api-access-tsnbr\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.562205 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5n52p"] Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.563936 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.570025 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.570233 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.570370 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-5q6tq" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.588330 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-65x7z"] Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.590984 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.606989 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5n52p"] Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.612880 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ae51fea5-d2c7-415a-adcf-46c98bd34a45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae51fea5-d2c7-415a-adcf-46c98bd34a45\") pod \"ovsdbserver-nb-0\" (UID: \"a66d0873-8ac9-4410-92d9-aaf8efdaa527\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.656960 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-65x7z"] Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.694397 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea8d3865-305b-4ab6-833c-f8b227b6bae4-var-run\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.694506 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea8d3865-305b-4ab6-833c-f8b227b6bae4-combined-ca-bundle\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.694551 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-var-run\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.694596 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-var-lib\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.694621 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea8d3865-305b-4ab6-833c-f8b227b6bae4-var-log-ovn\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.694646 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-scripts\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.698200 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-etc-ovs\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.698423 4723 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q2q5\" (UniqueName: \"kubernetes.io/projected/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-kube-api-access-5q2q5\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.698506 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea8d3865-305b-4ab6-833c-f8b227b6bae4-scripts\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.699013 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea8d3865-305b-4ab6-833c-f8b227b6bae4-var-run-ovn\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.699074 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9zvq\" (UniqueName: \"kubernetes.io/projected/ea8d3865-305b-4ab6-833c-f8b227b6bae4-kube-api-access-z9zvq\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.699159 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea8d3865-305b-4ab6-833c-f8b227b6bae4-ovn-controller-tls-certs\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.699245 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-var-log\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.805743 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea8d3865-305b-4ab6-833c-f8b227b6bae4-var-log-ovn\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.805785 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-scripts\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.805828 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-etc-ovs\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.805847 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-5q2q5\" (UniqueName: \"kubernetes.io/projected/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-kube-api-access-5q2q5\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.805888 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea8d3865-305b-4ab6-833c-f8b227b6bae4-scripts\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.805913 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea8d3865-305b-4ab6-833c-f8b227b6bae4-var-run-ovn\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.805934 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9zvq\" (UniqueName: \"kubernetes.io/projected/ea8d3865-305b-4ab6-833c-f8b227b6bae4-kube-api-access-z9zvq\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.805965 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea8d3865-305b-4ab6-833c-f8b227b6bae4-ovn-controller-tls-certs\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.805997 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-var-log\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.806039 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea8d3865-305b-4ab6-833c-f8b227b6bae4-var-run\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.806075 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea8d3865-305b-4ab6-833c-f8b227b6bae4-combined-ca-bundle\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.806097 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-var-run\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.806119 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-var-lib\") pod \"ovn-controller-ovs-65x7z\" (UID: 
\"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.806628 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-var-lib\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.806750 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea8d3865-305b-4ab6-833c-f8b227b6bae4-var-run-ovn\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.806872 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea8d3865-305b-4ab6-833c-f8b227b6bae4-var-log-ovn\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.807562 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-etc-ovs\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.808106 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-var-log\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.810321 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-scripts\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.814128 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-var-run\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.814128 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea8d3865-305b-4ab6-833c-f8b227b6bae4-var-run\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.814990 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea8d3865-305b-4ab6-833c-f8b227b6bae4-scripts\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.817371 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ea8d3865-305b-4ab6-833c-f8b227b6bae4-ovn-controller-tls-certs\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.819598 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea8d3865-305b-4ab6-833c-f8b227b6bae4-combined-ca-bundle\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.830291 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q2q5\" (UniqueName: \"kubernetes.io/projected/afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa-kube-api-access-5q2q5\") pod \"ovn-controller-ovs-65x7z\" (UID: \"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa\") " pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.834713 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9zvq\" (UniqueName: \"kubernetes.io/projected/ea8d3865-305b-4ab6-833c-f8b227b6bae4-kube-api-access-z9zvq\") pod \"ovn-controller-5n52p\" (UID: \"ea8d3865-305b-4ab6-833c-f8b227b6bae4\") " pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.899441 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.929480 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5n52p" Mar 09 13:18:57 crc kubenswrapper[4723]: I0309 13:18:57.969937 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:19:00 crc kubenswrapper[4723]: I0309 13:19:00.218061 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 13:19:00 crc kubenswrapper[4723]: I0309 13:19:00.939403 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 13:19:00 crc kubenswrapper[4723]: I0309 13:19:00.943655 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:00 crc kubenswrapper[4723]: I0309 13:19:00.946134 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 09 13:19:00 crc kubenswrapper[4723]: I0309 13:19:00.946278 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 09 13:19:00 crc kubenswrapper[4723]: I0309 13:19:00.948391 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 09 13:19:00 crc kubenswrapper[4723]: I0309 13:19:00.948393 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-vsk2b" Mar 09 13:19:00 crc kubenswrapper[4723]: I0309 13:19:00.949543 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.082767 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.082815 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.082838 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.082854 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.082926 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-config\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.082973 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl5c5\" (UniqueName: \"kubernetes.io/projected/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-kube-api-access-fl5c5\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.083016 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.083043 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1e643f7e-2fe9-4cac-9c54-9d30c3dbca04\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e643f7e-2fe9-4cac-9c54-9d30c3dbca04\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.184920 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.185003 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1e643f7e-2fe9-4cac-9c54-9d30c3dbca04\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e643f7e-2fe9-4cac-9c54-9d30c3dbca04\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.185106 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.185158 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.185178 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.185193 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.185271 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-config\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.185349 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl5c5\" (UniqueName: \"kubernetes.io/projected/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-kube-api-access-fl5c5\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.186192 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-config\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.186243 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.187067 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.191407 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.191773 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.196040 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
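
The MountVolume.MountDevice entries in this log record global mount paths of the form /var/lib/kubelet/plugins/kubernetes.io/csi/<driver>/<64-hex-digest>/globalmount. The 64-character digest is consistent with a SHA-256, and recent kubelets derive this directory from a hash of the CSI volume handle (the token after the ^ in the volume UniqueName) so the path does not depend on the PV name. A small sketch under that assumption; the helper name is illustrative and the exact hash input is inferred from the observed layout, not taken from kubelet source:

    package main

    import (
        "crypto/sha256"
        "fmt"
        "path/filepath"
    )

    // deviceMountPath reconstructs the globalmount directory layout seen in
    // the log, assuming the digest component is the hex-encoded SHA-256 of
    // the CSI volume handle.
    func deviceMountPath(kubeletRoot, driver, volumeHandle string) string {
        digest := sha256.Sum256([]byte(volumeHandle))
        return filepath.Join(kubeletRoot, "plugins", "kubernetes.io", "csi",
            driver, fmt.Sprintf("%x", digest), "globalmount")
    }

    func main() {
        // Volume handle taken from the surrounding ovsdbserver-sb-0 entries.
        fmt.Println(deviceMountPath("/var/lib/kubelet",
            "kubevirt.io.hostpath-provisioner",
            "pvc-1e643f7e-2fe9-4cac-9c54-9d30c3dbca04"))
    }
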
Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.196063 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1e643f7e-2fe9-4cac-9c54-9d30c3dbca04\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e643f7e-2fe9-4cac-9c54-9d30c3dbca04\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/be0741ed34ebf910dabff7fb90621360ec97473ce8935227273602a9e15cb41a/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.202158 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl5c5\" (UniqueName: \"kubernetes.io/projected/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-kube-api-access-fl5c5\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.203899 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da5bb20-bbc3-4c40-9b5c-f1cb12074c23-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.235710 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1e643f7e-2fe9-4cac-9c54-9d30c3dbca04\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e643f7e-2fe9-4cac-9c54-9d30c3dbca04\") pod \"ovsdbserver-sb-0\" (UID: \"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.271440 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 13:19:01 crc kubenswrapper[4723]: I0309 13:19:01.684783 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b","Type":"ContainerStarted","Data":"0d3070d6c67fd0b6c01564da73982acb5c6d9ceaa6b537cf5c9284d3c663edaa"} Mar 09 13:19:04 crc kubenswrapper[4723]: I0309 13:19:04.046209 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 13:19:14 crc kubenswrapper[4723]: E0309 13:19:14.187673 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Mar 09 13:19:14 crc kubenswrapper[4723]: E0309 13:19:14.189150 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hjrxl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:19:14 crc kubenswrapper[4723]: E0309 13:19:14.189315 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 09 13:19:14 crc kubenswrapper[4723]: E0309 13:19:14.189487 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r4n48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(19f91c12-b482-46ab-a6e1-20164abe2ee4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 13:19:14 crc kubenswrapper[4723]: E0309 13:19:14.190633 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="19f91c12-b482-46ab-a6e1-20164abe2ee4"
Mar 09 13:19:14 crc kubenswrapper[4723]: E0309 13:19:14.190656 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17"
Mar 09 13:19:14 crc kubenswrapper[4723]: W0309 13:19:14.212754 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb01063ef_9ba6_4f2b_8298_46acf5a50e80.slice/crio-241283b712b23c60550ed1616b634bfc0ea70415f7015ce04695c70b978079fb WatchSource:0}: Error finding container 241283b712b23c60550ed1616b634bfc0ea70415f7015ce04695c70b978079fb: Status 404 returned error can't find the container with id 241283b712b23c60550ed1616b634bfc0ea70415f7015ce04695c70b978079fb
Mar 09 13:19:14 crc kubenswrapper[4723]: I0309 13:19:14.667285 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 09 13:19:14 crc kubenswrapper[4723]: I0309 13:19:14.810111 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b01063ef-9ba6-4f2b-8298-46acf5a50e80","Type":"ContainerStarted","Data":"241283b712b23c60550ed1616b634bfc0ea70415f7015ce04695c70b978079fb"}
Mar 09 13:19:14 crc kubenswrapper[4723]: E0309 13:19:14.812748 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17"
Mar 09 13:19:14 crc kubenswrapper[4723]: E0309 13:19:14.812752 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="19f91c12-b482-46ab-a6e1-20164abe2ee4"
Mar 09 13:19:15 crc kubenswrapper[4723]: I0309 13:19:15.064970 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 09 13:19:15 crc kubenswrapper[4723]: W0309 13:19:15.150142 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85a79034_ec88_4dfb_9714_0630d9637c3b.slice/crio-b9625e652dc29aa4132b44f35c3022ce612b2a17e125dce85970f3a343c7fdff WatchSource:0}: Error finding container b9625e652dc29aa4132b44f35c3022ce612b2a17e125dce85970f3a343c7fdff: Status 404 returned error can't find the container with id b9625e652dc29aa4132b44f35c3022ce612b2a17e125dce85970f3a343c7fdff
Mar 09 13:19:15 crc kubenswrapper[4723]: E0309 13:19:15.164591 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Mar 09 13:19:15 crc kubenswrapper[4723]: E0309 13:19:15.164731 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9l9dk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-gg872_openstack(43506b49-01ab-4ea9-bcd6-4ce950b7815f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 13:19:15 crc kubenswrapper[4723]: E0309 13:19:15.166117 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-gg872" podUID="43506b49-01ab-4ea9-bcd6-4ce950b7815f"
Mar 09 13:19:15 crc kubenswrapper[4723]: E0309 13:19:15.221145 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Mar 09 13:19:15 crc kubenswrapper[4723]: E0309 13:19:15.221317 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrtvp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-gwv9x_openstack(55420bf1-44a4-4313-8570-cf2d1f9784ba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 13:19:15 crc kubenswrapper[4723]: E0309 13:19:15.222698 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-gwv9x" podUID="55420bf1-44a4-4313-8570-cf2d1f9784ba"
Mar 09 13:19:15 crc kubenswrapper[4723]: E0309 13:19:15.241376 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Mar 09 13:19:15 crc kubenswrapper[4723]: E0309 13:19:15.241418 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Mar 09 13:19:15 crc kubenswrapper[4723]: E0309 13:19:15.241527 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vt46l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-fclpq_openstack(4ab9a1db-b005-4830-8421-6caf5a04048b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 13:19:15 crc kubenswrapper[4723]: E0309 13:19:15.241586 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9nmj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-sf8lq_openstack(85ff66cb-dbf2-4501-bb5d-7f1142896dc3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 13:19:15 crc kubenswrapper[4723]: E0309 13:19:15.243142 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq" podUID="4ab9a1db-b005-4830-8421-6caf5a04048b"
Mar 09 13:19:15 crc kubenswrapper[4723]: E0309 13:19:15.243181 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" podUID="85ff66cb-dbf2-4501-bb5d-7f1142896dc3"
Mar 09 13:19:15 crc kubenswrapper[4723]: I0309 13:19:15.742995 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 09 13:19:15 crc kubenswrapper[4723]: W0309 13:19:15.750937 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc72a0aaf_0e9e_4327_91d3_bc7c68542eab.slice/crio-90917e42d98f28638867d5e7f3af9f6bf01c5f77d8a526999ff64c9454528467 WatchSource:0}: Error finding container 90917e42d98f28638867d5e7f3af9f6bf01c5f77d8a526999ff64c9454528467: Status 404 returned error can't find the container with id 90917e42d98f28638867d5e7f3af9f6bf01c5f77d8a526999ff64c9454528467
Mar 09 13:19:15 crc kubenswrapper[4723]: I0309 13:19:15.819296 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"85a79034-ec88-4dfb-9714-0630d9637c3b","Type":"ContainerStarted","Data":"b9625e652dc29aa4132b44f35c3022ce612b2a17e125dce85970f3a343c7fdff"}
Mar 09 13:19:15 crc kubenswrapper[4723]: I0309 13:19:15.820579 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b","Type":"ContainerStarted","Data":"a6b308940c8d43200f48b7b328753a9c089b19e63546ad8193970c0d815df90d"}
Mar 09 13:19:15 crc kubenswrapper[4723]: I0309 13:19:15.822031 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c72a0aaf-0e9e-4327-91d3-bc7c68542eab","Type":"ContainerStarted","Data":"90917e42d98f28638867d5e7f3af9f6bf01c5f77d8a526999ff64c9454528467"}
Mar 09 13:19:15 crc kubenswrapper[4723]: I0309 13:19:15.824310 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a66d0873-8ac9-4410-92d9-aaf8efdaa527","Type":"ContainerStarted","Data":"951d97e796342b59a1eb5594fd74100d1a4370969ede1dc5cdaaf0849a52c131"}
Mar 09 13:19:15 crc kubenswrapper[4723]: E0309 13:19:15.825988 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" podUID="85ff66cb-dbf2-4501-bb5d-7f1142896dc3"
Mar 09 13:19:15 crc kubenswrapper[4723]: E0309 13:19:15.826380 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-gg872" podUID="43506b49-01ab-4ea9-bcd6-4ce950b7815f"
Mar 09 13:19:16 crc kubenswrapper[4723]: I0309 13:19:16.137594 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85b7499c-sqsr9"]
Mar 09 13:19:16 crc kubenswrapper[4723]: I0309 13:19:16.152029 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5n52p"]
Mar 09 13:19:16 crc kubenswrapper[4723]: I0309 13:19:16.351558 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-65x7z"]
Mar 09 13:19:16 crc kubenswrapper[4723]: W0309 13:19:16.624199 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea8d3865_305b_4ab6_833c_f8b227b6bae4.slice/crio-d409b0c78b6764470bdbc26df2715f5a918df3b2100f884b8dc2f73e70d3358d WatchSource:0}: Error finding container d409b0c78b6764470bdbc26df2715f5a918df3b2100f884b8dc2f73e70d3358d: Status 404 returned error can't find the container with id d409b0c78b6764470bdbc26df2715f5a918df3b2100f884b8dc2f73e70d3358d
Mar 09 13:19:16 crc kubenswrapper[4723]: I0309 13:19:16.649751 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7"]
Mar 09 13:19:16 crc kubenswrapper[4723]: I0309 13:19:16.763648 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 09 13:19:16 crc kubenswrapper[4723]: I0309 13:19:16.834335 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5n52p" event={"ID":"ea8d3865-305b-4ab6-833c-f8b227b6bae4","Type":"ContainerStarted","Data":"d409b0c78b6764470bdbc26df2715f5a918df3b2100f884b8dc2f73e70d3358d"}
Mar 09 13:19:17 crc kubenswrapper[4723]: W0309 13:19:17.262160 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd41bfc_ea4f_4ac5_aa58_896d8fd76cfa.slice/crio-07dcd72da66f72922df623e6a037d0d6e45bfc3b7f22bb618617047188c3a0a0 WatchSource:0}: Error finding container 07dcd72da66f72922df623e6a037d0d6e45bfc3b7f22bb618617047188c3a0a0: Status 404 returned error can't find the container with id 07dcd72da66f72922df623e6a037d0d6e45bfc3b7f22bb618617047188c3a0a0
Mar 09 13:19:17 crc kubenswrapper[4723]: I0309 13:19:17.842393 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"54210e7b-b34d-411d-93e1-e8cc3448c4b0","Type":"ContainerStarted","Data":"42687d97a92f3eec8eb044239bd85c0cb19dc311299573a1084789dac3e84d1d"}
Mar 09 13:19:17 crc kubenswrapper[4723]: I0309 13:19:17.843397 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85b7499c-sqsr9" event={"ID":"731ffd33-861f-45a8-a54a-5a18dcca5ae6","Type":"ContainerStarted","Data":"1479e0f404d0d4c8b24e4a7885c4d41355e2685c65eca026333234c7f4f9e03a"}
Mar 09 13:19:17 crc kubenswrapper[4723]: I0309 13:19:17.844577 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb","Type":"ContainerStarted","Data":"f600bd95ed92947bbe218dbf141750e32f7e39e37c298df8afa64d12dc276f50"}
Mar 09 13:19:17 crc kubenswrapper[4723]: I0309 13:19:17.846269 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"daa528e2-bcd7-43a8-bfea-a0911b3020c5","Type":"ContainerStarted","Data":"8ef261746c675013eb62f34eb677fb0207d09a06464e1471d9a77690a6583b55"}
Mar 09 13:19:17 crc kubenswrapper[4723]: I0309 13:19:17.851773 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-65x7z" event={"ID":"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa","Type":"ContainerStarted","Data":"07dcd72da66f72922df623e6a037d0d6e45bfc3b7f22bb618617047188c3a0a0"}
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.231738 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-gwv9x"
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.240301 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq"
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.390637 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrtvp\" (UniqueName: \"kubernetes.io/projected/55420bf1-44a4-4313-8570-cf2d1f9784ba-kube-api-access-qrtvp\") pod \"55420bf1-44a4-4313-8570-cf2d1f9784ba\" (UID: \"55420bf1-44a4-4313-8570-cf2d1f9784ba\") "
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.391068 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab9a1db-b005-4830-8421-6caf5a04048b-config\") pod \"4ab9a1db-b005-4830-8421-6caf5a04048b\" (UID: \"4ab9a1db-b005-4830-8421-6caf5a04048b\") "
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.391191 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55420bf1-44a4-4313-8570-cf2d1f9784ba-config\") pod \"55420bf1-44a4-4313-8570-cf2d1f9784ba\" (UID: \"55420bf1-44a4-4313-8570-cf2d1f9784ba\") "
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.391236 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ab9a1db-b005-4830-8421-6caf5a04048b-dns-svc\") pod \"4ab9a1db-b005-4830-8421-6caf5a04048b\" (UID: \"4ab9a1db-b005-4830-8421-6caf5a04048b\") "
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.391285 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt46l\" (UniqueName: \"kubernetes.io/projected/4ab9a1db-b005-4830-8421-6caf5a04048b-kube-api-access-vt46l\") pod \"4ab9a1db-b005-4830-8421-6caf5a04048b\" (UID: \"4ab9a1db-b005-4830-8421-6caf5a04048b\") "
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.391544 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ab9a1db-b005-4830-8421-6caf5a04048b-config" (OuterVolumeSpecName: "config") pod "4ab9a1db-b005-4830-8421-6caf5a04048b" (UID: "4ab9a1db-b005-4830-8421-6caf5a04048b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.391795 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ab9a1db-b005-4830-8421-6caf5a04048b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4ab9a1db-b005-4830-8421-6caf5a04048b" (UID: "4ab9a1db-b005-4830-8421-6caf5a04048b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.391833 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab9a1db-b005-4830-8421-6caf5a04048b-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.392117 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55420bf1-44a4-4313-8570-cf2d1f9784ba-config" (OuterVolumeSpecName: "config") pod "55420bf1-44a4-4313-8570-cf2d1f9784ba" (UID: "55420bf1-44a4-4313-8570-cf2d1f9784ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.404206 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55420bf1-44a4-4313-8570-cf2d1f9784ba-kube-api-access-qrtvp" (OuterVolumeSpecName: "kube-api-access-qrtvp") pod "55420bf1-44a4-4313-8570-cf2d1f9784ba" (UID: "55420bf1-44a4-4313-8570-cf2d1f9784ba"). InnerVolumeSpecName "kube-api-access-qrtvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.408738 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ab9a1db-b005-4830-8421-6caf5a04048b-kube-api-access-vt46l" (OuterVolumeSpecName: "kube-api-access-vt46l") pod "4ab9a1db-b005-4830-8421-6caf5a04048b" (UID: "4ab9a1db-b005-4830-8421-6caf5a04048b"). InnerVolumeSpecName "kube-api-access-vt46l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.498023 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrtvp\" (UniqueName: \"kubernetes.io/projected/55420bf1-44a4-4313-8570-cf2d1f9784ba-kube-api-access-qrtvp\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.498067 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55420bf1-44a4-4313-8570-cf2d1f9784ba-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.498083 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ab9a1db-b005-4830-8421-6caf5a04048b-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.498094 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt46l\" (UniqueName: \"kubernetes.io/projected/4ab9a1db-b005-4830-8421-6caf5a04048b-kube-api-access-vt46l\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.867187 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-gwv9x" event={"ID":"55420bf1-44a4-4313-8570-cf2d1f9784ba","Type":"ContainerDied","Data":"e54e5e14366e01fe98d15117dcebec88b14dd7d429e9fdc75f6bcb5686e61a93"}
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.867280 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-gwv9x"
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.870200 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq"
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.870193 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fclpq" event={"ID":"4ab9a1db-b005-4830-8421-6caf5a04048b","Type":"ContainerDied","Data":"e86dfc291134ec8211b01e91fede2a62846b33ef5303cb4dee7ea541e66520f8"}
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.871445 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7" event={"ID":"f19d74e0-826f-47c6-80bc-d82478a56657","Type":"ContainerStarted","Data":"b3896ff3c9d82bc6340ce5c7d2c0ecd0ceebfd227005ad3d964c97c4a4c1a05d"}
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.872342 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23","Type":"ContainerStarted","Data":"70cbce0210a3285c76108faec8e2b8510f54a875cba9c83bd860a8646609e103"}
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.945838 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fclpq"]
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.958707 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fclpq"]
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.977218 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gwv9x"]
Mar 09 13:19:18 crc kubenswrapper[4723]: I0309 13:19:18.985925 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gwv9x"]
Mar 09 13:19:19 crc kubenswrapper[4723]: I0309 13:19:19.885665 4723 generic.go:334] "Generic (PLEG): container finished" podID="f01dc50c-55d6-4f99-92f8-d3adfcf8d71b" containerID="a6b308940c8d43200f48b7b328753a9c089b19e63546ad8193970c0d815df90d" exitCode=0
Mar 09 13:19:19 crc kubenswrapper[4723]: I0309 13:19:19.885754 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b","Type":"ContainerDied","Data":"a6b308940c8d43200f48b7b328753a9c089b19e63546ad8193970c0d815df90d"}
Mar 09 13:19:20 crc kubenswrapper[4723]: E0309 13:19:20.359271 4723 log.go:32] "ImageFsInfo from image service failed" err="rpc error: code = Unknown desc = get image fs info unable to get usage for /var/lib/containers/storage/overlay-images: get disk usage for path /var/lib/containers/storage/overlay-images: lstat /var/lib/containers/storage/overlay-images/.tmp-images.json2388983486: no such file or directory"
Mar 09 13:19:20 crc kubenswrapper[4723]: E0309 13:19:20.359325 4723 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get imageFs stats: missing image stats: nil"
Mar 09 13:19:20 crc kubenswrapper[4723]: I0309 13:19:20.894590 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ab9a1db-b005-4830-8421-6caf5a04048b" path="/var/lib/kubelet/pods/4ab9a1db-b005-4830-8421-6caf5a04048b/volumes"
Mar 09 13:19:20 crc kubenswrapper[4723]: I0309 13:19:20.895562 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55420bf1-44a4-4313-8570-cf2d1f9784ba" path="/var/lib/kubelet/pods/55420bf1-44a4-4313-8570-cf2d1f9784ba/volumes"
Mar 09 13:19:22 crc kubenswrapper[4723]: I0309 13:19:22.912994 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85b7499c-sqsr9" event={"ID":"731ffd33-861f-45a8-a54a-5a18dcca5ae6","Type":"ContainerStarted","Data":"a53df7508cd3c82609916b5c91a51441c46fa671d1ce420dbfefb5e587a4fad8"}
Mar 09 13:19:22 crc kubenswrapper[4723]: I0309 13:19:22.938252 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85b7499c-sqsr9" podStartSLOduration=27.938236059 podStartE2EDuration="27.938236059s" podCreationTimestamp="2026-03-09 13:18:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:19:22.934634053 +0000 UTC m=+1236.949101593" watchObservedRunningTime="2026-03-09 13:19:22.938236059 +0000 UTC m=+1236.952703599"
Mar 09 13:19:23 crc kubenswrapper[4723]: I0309 13:19:23.922697 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7" event={"ID":"f19d74e0-826f-47c6-80bc-d82478a56657","Type":"ContainerStarted","Data":"5cadacb0349b09f468357d674f0fecd8e0c68007d6f981918cba310a50d9bf31"}
Mar 09 13:19:23 crc kubenswrapper[4723]: I0309 13:19:23.924888 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23","Type":"ContainerStarted","Data":"1b37072ebff72ba3c75573a2ed7ce1c3986aa3d79fd03c31c3afcb0e4630452c"}
Mar 09 13:19:23 crc kubenswrapper[4723]: I0309 13:19:23.926950 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5n52p" event={"ID":"ea8d3865-305b-4ab6-833c-f8b227b6bae4","Type":"ContainerStarted","Data":"8fa27b05466c412208d786cf783da14422f9daa5af250201d736cccfabe4d60d"}
Mar 09 13:19:23 crc kubenswrapper[4723]: I0309 13:19:23.927173 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-5n52p"
Mar 09 13:19:23 crc kubenswrapper[4723]: I0309 13:19:23.931236 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a66d0873-8ac9-4410-92d9-aaf8efdaa527","Type":"ContainerStarted","Data":"ce4a1a36a558202a760d683665d461c4143e392f454689c0eee85192abc9c19e"}
Mar 09 13:19:23 crc kubenswrapper[4723]: I0309 13:19:23.933828 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b01063ef-9ba6-4f2b-8298-46acf5a50e80","Type":"ContainerStarted","Data":"8966a8549c324e025cca6e0556162aa3a764091b92234acbc19af0b5a6472671"}
Mar 09 13:19:23 crc kubenswrapper[4723]: I0309 13:19:23.933982 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 09 13:19:23 crc kubenswrapper[4723]: I0309 13:19:23.935302 4723 generic.go:334] "Generic (PLEG): container finished" podID="afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa" containerID="4c4177c7edcdc8f2cefe01ad4e991e9e6bacae97cab602237cb051c305f4c205" exitCode=0
Mar 09 13:19:23 crc kubenswrapper[4723]: I0309 13:19:23.935372 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-65x7z" event={"ID":"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa","Type":"ContainerDied","Data":"4c4177c7edcdc8f2cefe01ad4e991e9e6bacae97cab602237cb051c305f4c205"}
Mar 09 13:19:23 crc kubenswrapper[4723]: I0309 13:19:23.942624 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"85a79034-ec88-4dfb-9714-0630d9637c3b","Type":"ContainerStarted","Data":"741ecb501e21c591eab9cbe41217a7c4b863f0bfc7046701176cee2c13ed487e"}
Mar 09 13:19:23 crc kubenswrapper[4723]: I0309 13:19:23.943220 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 09 13:19:23 crc kubenswrapper[4723]: I0309 13:19:23.950158 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-fwcs7" podStartSLOduration=24.639871648 podStartE2EDuration="28.950137143s" podCreationTimestamp="2026-03-09 13:18:55 +0000 UTC" firstStartedPulling="2026-03-09 13:19:18.138278414 +0000 UTC m=+1232.152745954" lastFinishedPulling="2026-03-09 13:19:22.448543909 +0000 UTC m=+1236.463011449" observedRunningTime="2026-03-09 13:19:23.945227502 +0000 UTC m=+1237.959695062" watchObservedRunningTime="2026-03-09 13:19:23.950137143 +0000 UTC m=+1237.964604683"
Mar 09 13:19:23 crc kubenswrapper[4723]: I0309 13:19:23.965994 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f01dc50c-55d6-4f99-92f8-d3adfcf8d71b","Type":"ContainerStarted","Data":"3fe26bad9b16dcbced32e65577499814226c7b87e253d4a51e58c006fcbe365f"}
Mar 09 13:19:24 crc kubenswrapper[4723]: I0309 13:19:24.066935 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=22.697079147 podStartE2EDuration="30.066911213s" podCreationTimestamp="2026-03-09 13:18:54 +0000 UTC" firstStartedPulling="2026-03-09 13:19:15.153427973 +0000 UTC m=+1229.167895513" lastFinishedPulling="2026-03-09 13:19:22.523260039 +0000 UTC m=+1236.537727579" observedRunningTime="2026-03-09 13:19:24.057316167 +0000 UTC m=+1238.071783717" watchObservedRunningTime="2026-03-09 13:19:24.066911213 +0000 UTC m=+1238.081378753"
Mar 09 13:19:24 crc kubenswrapper[4723]: I0309 13:19:24.068508 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=26.384854282 podStartE2EDuration="33.068499905s" podCreationTimestamp="2026-03-09 13:18:51 +0000 UTC" firstStartedPulling="2026-03-09 13:19:14.244161231 +0000 UTC m=+1228.258628771" lastFinishedPulling="2026-03-09 13:19:20.927806854 +0000 UTC m=+1234.942274394" observedRunningTime="2026-03-09 13:19:24.012773131 +0000 UTC m=+1238.027240681" watchObservedRunningTime="2026-03-09 13:19:24.068499905 +0000 UTC m=+1238.082967445"
Mar 09 13:19:24 crc kubenswrapper[4723]: I0309 13:19:24.105368 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-5n52p" podStartSLOduration=21.310886359 podStartE2EDuration="27.105353146s" podCreationTimestamp="2026-03-09 13:18:57 +0000 UTC" firstStartedPulling="2026-03-09 13:19:16.63296437 +0000 UTC m=+1230.647431910" lastFinishedPulling="2026-03-09 13:19:22.427431157 +0000 UTC m=+1236.441898697" observedRunningTime="2026-03-09 13:19:24.075756278 +0000 UTC m=+1238.090223818" watchObservedRunningTime="2026-03-09 13:19:24.105353146 +0000 UTC m=+1238.119820686"
Mar 09 13:19:24 crc kubenswrapper[4723]: I0309 13:19:24.109355 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.66778645 podStartE2EDuration="34.109338993s" podCreationTimestamp="2026-03-09 13:18:50 +0000 UTC" firstStartedPulling="2026-03-09 13:19:00.838513372 +0000 UTC m=+1214.852980912" lastFinishedPulling="2026-03-09 13:19:15.280065905 +0000 UTC m=+1229.294533455" observedRunningTime="2026-03-09 13:19:24.103265671 +0000 UTC m=+1238.117733211" watchObservedRunningTime="2026-03-09 13:19:24.109338993 +0000 UTC m=+1238.123806533"
Mar 09 13:19:24 crc kubenswrapper[4723]: I0309 13:19:24.976505 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-65x7z" event={"ID":"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa","Type":"ContainerStarted","Data":"367414352dfe92c67a24ffbce5e092392c9c22b9d6e71c48045fcfc0e5b6972f"}
Mar 09 13:19:24 crc kubenswrapper[4723]: I0309 13:19:24.976988 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-65x7z" event={"ID":"afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa","Type":"ContainerStarted","Data":"1756a9686adf57b8aa22f6f63b6c2ec34252434ae8ce4b715c6fea85214b3f77"}
Mar 09 13:19:25 crc kubenswrapper[4723]: I0309 13:19:25.005904 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-65x7z" podStartSLOduration=22.844263841 podStartE2EDuration="28.005883226s" podCreationTimestamp="2026-03-09 13:18:57 +0000 UTC" firstStartedPulling="2026-03-09 13:19:17.265446612 +0000 UTC m=+1231.279914152" lastFinishedPulling="2026-03-09 13:19:22.427065967 +0000 UTC m=+1236.441533537" observedRunningTime="2026-03-09 13:19:24.997123253 +0000 UTC m=+1239.011590803" watchObservedRunningTime="2026-03-09 13:19:25.005883226 +0000 UTC m=+1239.020350766"
Mar 09 13:19:25 crc kubenswrapper[4723]: I0309 13:19:25.989794 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-85b7499c-sqsr9"
Mar 09 13:19:25 crc kubenswrapper[4723]: I0309 13:19:25.990058 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85b7499c-sqsr9"
Mar 09 13:19:25 crc kubenswrapper[4723]: I0309 13:19:25.992371 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c72a0aaf-0e9e-4327-91d3-bc7c68542eab","Type":"ContainerStarted","Data":"07575b77da38dde101b5863d2a83706ebb79127082bebd05c9e54b540f0cd082"}
Mar 09 13:19:25 crc kubenswrapper[4723]: I0309 13:19:25.992393 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c72a0aaf-0e9e-4327-91d3-bc7c68542eab" containerName="init-config-reloader" containerID="cri-o://07575b77da38dde101b5863d2a83706ebb79127082bebd05c9e54b540f0cd082" gracePeriod=600
Mar 09 13:19:25 crc kubenswrapper[4723]: I0309 13:19:25.992697 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-65x7z"
Mar 09 13:19:25 crc kubenswrapper[4723]: I0309 13:19:25.993637 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-65x7z"
Mar 09 13:19:25 crc kubenswrapper[4723]: I0309 13:19:25.994903 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85b7499c-sqsr9"
Mar 09 13:19:27 crc kubenswrapper[4723]: I0309 13:19:27.003540 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85b7499c-sqsr9"
Mar 09 13:19:27 crc kubenswrapper[4723]: I0309 13:19:27.100313 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d748498-cmr7m"]
Mar 09 13:19:28 crc kubenswrapper[4723]: I0309 13:19:28.010919 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a66d0873-8ac9-4410-92d9-aaf8efdaa527","Type":"ContainerStarted","Data":"c7abb59c7a4f9790b65a8c58debf1bea6468c9799a775035fdef518e834674c1"}
Mar 09 13:19:28 crc kubenswrapper[4723]: I0309 13:19:28.012983 4723 generic.go:334] "Generic (PLEG): container finished" podID="85ff66cb-dbf2-4501-bb5d-7f1142896dc3" containerID="8a1453ffa6d54142b5917e60c23baf06f0691b780e65ba722d03c0df4217fd77" exitCode=0
Mar 09 13:19:28 crc kubenswrapper[4723]: I0309 13:19:28.013274 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" event={"ID":"85ff66cb-dbf2-4501-bb5d-7f1142896dc3","Type":"ContainerDied","Data":"8a1453ffa6d54142b5917e60c23baf06f0691b780e65ba722d03c0df4217fd77"}
Mar 09 13:19:28 crc kubenswrapper[4723]: I0309 13:19:28.018406 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3da5bb20-bbc3-4c40-9b5c-f1cb12074c23","Type":"ContainerStarted","Data":"36def294b58a6877fd202d98f6ff4be65f87978c2bca5f18e10b88cbe9a69e44"}
Mar 09 13:19:28 crc kubenswrapper[4723]: I0309 13:19:28.036202 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=20.273609763 podStartE2EDuration="32.036179799s" podCreationTimestamp="2026-03-09 13:18:56 +0000 UTC" firstStartedPulling="2026-03-09 13:19:15.280735923 +0000 UTC m=+1229.295203463" lastFinishedPulling="2026-03-09 13:19:27.043305959 +0000 UTC m=+1241.057773499" observedRunningTime="2026-03-09 13:19:28.03211229 +0000 UTC m=+1242.046579830" watchObservedRunningTime="2026-03-09 13:19:28.036179799 +0000 UTC m=+1242.050647339"
Mar 09 13:19:28 crc kubenswrapper[4723]: I0309 13:19:28.062961 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=20.151717541 podStartE2EDuration="29.062940471s" podCreationTimestamp="2026-03-09 13:18:59 +0000 UTC" firstStartedPulling="2026-03-09 13:19:18.13887853 +0000 UTC m=+1232.153346070" lastFinishedPulling="2026-03-09 13:19:27.05010146 +0000 UTC m=+1241.064569000" observedRunningTime="2026-03-09 13:19:28.06213699 +0000 UTC m=+1242.076604540" watchObservedRunningTime="2026-03-09 13:19:28.062940471 +0000 UTC m=+1242.077408011"
Mar 09 13:19:28 crc kubenswrapper[4723]: I0309 13:19:28.272178 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 09 13:19:28 crc kubenswrapper[4723]: I0309 13:19:28.316536 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.038442 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" event={"ID":"85ff66cb-dbf2-4501-bb5d-7f1142896dc3","Type":"ContainerStarted","Data":"2a236cb50fae8bfc47e5fe3401512cbb2342fcc29e8497254621a36fdde8287f"}
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.039230 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.039433 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.063423 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" podStartSLOduration=3.041570939 podStartE2EDuration="42.063403603s" podCreationTimestamp="2026-03-09 13:18:47 +0000 UTC" firstStartedPulling="2026-03-09 13:18:48.292390649 +0000 UTC m=+1202.306858189" lastFinishedPulling="2026-03-09 13:19:27.314223313 +0000 UTC m=+1241.328690853" observedRunningTime="2026-03-09 13:19:29.063088714 +0000 UTC m=+1243.077556254" watchObservedRunningTime="2026-03-09 13:19:29.063403603 +0000 UTC m=+1243.077871143"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.118820 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.391778 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg872"]
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.426114 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-2k7l8"]
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.427621 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.430758 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.441352 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-2k7l8"]
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.535958 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-vxlcg"]
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.537296 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.540095 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.551746 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vxlcg"]
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.578038 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-2k7l8\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.578171 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-config\") pod \"dnsmasq-dns-7f896c8c65-2k7l8\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.578235 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-2k7l8\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.578262 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xzkh\" (UniqueName: \"kubernetes.io/projected/e02a0c7b-7017-43f5-a43e-300bf157cd37-kube-api-access-4xzkh\") pod \"dnsmasq-dns-7f896c8c65-2k7l8\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.681790 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-config\") pod \"dnsmasq-dns-7f896c8c65-2k7l8\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.681873 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-ovn-rundir\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.681934 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-combined-ca-bundle\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.681974 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-2k7l8\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.682007 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xzkh\" (UniqueName: \"kubernetes.io/projected/e02a0c7b-7017-43f5-a43e-300bf157cd37-kube-api-access-4xzkh\") pod \"dnsmasq-dns-7f896c8c65-2k7l8\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.682079 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-config\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.682119 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-2k7l8\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.682144 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-ovs-rundir\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.682180 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm794\" (UniqueName: \"kubernetes.io/projected/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-kube-api-access-dm794\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.682233 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.683044 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-config\") pod \"dnsmasq-dns-7f896c8c65-2k7l8\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.683716 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-2k7l8\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.684236 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-2k7l8\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.706399 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xzkh\" (UniqueName: \"kubernetes.io/projected/e02a0c7b-7017-43f5-a43e-300bf157cd37-kube-api-access-4xzkh\") pod \"dnsmasq-dns-7f896c8c65-2k7l8\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.780313 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-sf8lq"]
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.784362 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-config\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.784421 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-ovs-rundir\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.784459 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm794\" (UniqueName: \"kubernetes.io/projected/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-kube-api-access-dm794\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.784516 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.784640 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-ovn-rundir\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.784708 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-combined-ca-bundle\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.784804 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-ovn-rundir\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.785133 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-config\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.785138 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-ovs-rundir\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.792359 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.792366 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.809536 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-combined-ca-bundle\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.818347 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm794\" (UniqueName: \"kubernetes.io/projected/ae48b5ff-caaa-4d4c-81c5-201afe2220ef-kube-api-access-dm794\") pod \"ovn-controller-metrics-vxlcg\" (UID: \"ae48b5ff-caaa-4d4c-81c5-201afe2220ef\") " pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.832305 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-vmb2f"]
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.852583 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.868728 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 09 13:19:29 crc kubenswrapper[4723]: I0309 13:19:29.877259 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-vxlcg"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.037645 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.038576 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7zpg\" (UniqueName: \"kubernetes.io/projected/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-kube-api-access-r7zpg\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.038698 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.038932 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.040997 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-config\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.046244 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-vmb2f"]
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.069536 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19f91c12-b482-46ab-a6e1-20164abe2ee4","Type":"ContainerStarted","Data":"1619049fc971a5f761213226bb2e8e6badaa3b8e5bc14d02cccc57f4fa2faf21"}
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.073143 4723 generic.go:334] "Generic (PLEG): container finished" podID="43506b49-01ab-4ea9-bcd6-4ce950b7815f" containerID="6c0568b4b9eab8cfbbfeac71e2c9588bdde5336c8f6bab4ce6234c98ed627bd3" exitCode=0
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.073438 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gg872" event={"ID":"43506b49-01ab-4ea9-bcd6-4ce950b7815f","Type":"ContainerDied","Data":"6c0568b4b9eab8cfbbfeac71e2c9588bdde5336c8f6bab4ce6234c98ed627bd3"}
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.143630 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.143748 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.143791 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-config\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.144799 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-config\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.144911 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.144992 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7zpg\" (UniqueName: \"kubernetes.io/projected/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-kube-api-access-r7zpg\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.152006 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.152554 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.156247 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.177257 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7zpg\" (UniqueName: \"kubernetes.io/projected/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-kube-api-access-r7zpg\") pod \"dnsmasq-dns-86db49b7ff-vmb2f\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") " pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.334361 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.478749 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-2k7l8"]
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.822711 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vxlcg"]
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.906237 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.940593 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 09 13:19:30 crc kubenswrapper[4723]: I0309 13:19:30.993386 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gg872"
Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.073132 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43506b49-01ab-4ea9-bcd6-4ce950b7815f-dns-svc\") pod \"43506b49-01ab-4ea9-bcd6-4ce950b7815f\" (UID: \"43506b49-01ab-4ea9-bcd6-4ce950b7815f\") "
Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.073472 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l9dk\" (UniqueName: \"kubernetes.io/projected/43506b49-01ab-4ea9-bcd6-4ce950b7815f-kube-api-access-9l9dk\") pod \"43506b49-01ab-4ea9-bcd6-4ce950b7815f\" (UID: \"43506b49-01ab-4ea9-bcd6-4ce950b7815f\") "
Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.073670 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43506b49-01ab-4ea9-bcd6-4ce950b7815f-config\") pod \"43506b49-01ab-4ea9-bcd6-4ce950b7815f\" (UID: \"43506b49-01ab-4ea9-bcd6-4ce950b7815f\") "
Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.092229 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43506b49-01ab-4ea9-bcd6-4ce950b7815f-kube-api-access-9l9dk" (OuterVolumeSpecName: "kube-api-access-9l9dk") pod "43506b49-01ab-4ea9-bcd6-4ce950b7815f" (UID: "43506b49-01ab-4ea9-bcd6-4ce950b7815f"). InnerVolumeSpecName "kube-api-access-9l9dk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.098131 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17","Type":"ContainerStarted","Data":"48d1a939153780cd96b37f85bf410269a1272fb3d3d155e6ce069c4ad4d3cd95"}
Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.104434 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43506b49-01ab-4ea9-bcd6-4ce950b7815f-config" (OuterVolumeSpecName: "config") pod "43506b49-01ab-4ea9-bcd6-4ce950b7815f" (UID: "43506b49-01ab-4ea9-bcd6-4ce950b7815f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.104752 4723 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gg872" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.104950 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gg872" event={"ID":"43506b49-01ab-4ea9-bcd6-4ce950b7815f","Type":"ContainerDied","Data":"3c7efa8a971d3ff09c3bb0908930a621d4d600f2661cb7100bd5ae11095a24a5"} Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.104991 4723 scope.go:117] "RemoveContainer" containerID="6c0568b4b9eab8cfbbfeac71e2c9588bdde5336c8f6bab4ce6234c98ed627bd3" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.107371 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43506b49-01ab-4ea9-bcd6-4ce950b7815f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "43506b49-01ab-4ea9-bcd6-4ce950b7815f" (UID: "43506b49-01ab-4ea9-bcd6-4ce950b7815f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.117786 4723 generic.go:334] "Generic (PLEG): container finished" podID="e02a0c7b-7017-43f5-a43e-300bf157cd37" containerID="8bdf8e2c56fdbc2f38f4e61046b7273b9601691fd104c700f981345420a2b2a8" exitCode=0 Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.117843 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8" event={"ID":"e02a0c7b-7017-43f5-a43e-300bf157cd37","Type":"ContainerDied","Data":"8bdf8e2c56fdbc2f38f4e61046b7273b9601691fd104c700f981345420a2b2a8"} Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.117907 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8" event={"ID":"e02a0c7b-7017-43f5-a43e-300bf157cd37","Type":"ContainerStarted","Data":"695ab2da5298877bcb0c9544c311972b271d10812fe6ec617579b9638976dc9b"} Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.128048 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vxlcg" event={"ID":"ae48b5ff-caaa-4d4c-81c5-201afe2220ef","Type":"ContainerStarted","Data":"603930389d620a5b7f56cc2ce4938df712b4c56eec4f58439abc093b0c81c567"} Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.128079 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" podUID="85ff66cb-dbf2-4501-bb5d-7f1142896dc3" containerName="dnsmasq-dns" containerID="cri-o://2a236cb50fae8bfc47e5fe3401512cbb2342fcc29e8497254621a36fdde8287f" gracePeriod=10 Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.129431 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.176444 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43506b49-01ab-4ea9-bcd6-4ce950b7815f-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.176469 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43506b49-01ab-4ea9-bcd6-4ce950b7815f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.176479 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l9dk\" (UniqueName: \"kubernetes.io/projected/43506b49-01ab-4ea9-bcd6-4ce950b7815f-kube-api-access-9l9dk\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.181252 
4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-vmb2f"] Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.203161 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 09 13:19:31 crc kubenswrapper[4723]: W0309 13:19:31.221164 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d07d123_4437_46d2_b1f8_d4e0b495e0fa.slice/crio-821423caf45cfb68ea6aa64738ff5f33f209cac63e167418f4f29100207dd1e7 WatchSource:0}: Error finding container 821423caf45cfb68ea6aa64738ff5f33f209cac63e167418f4f29100207dd1e7: Status 404 returned error can't find the container with id 821423caf45cfb68ea6aa64738ff5f33f209cac63e167418f4f29100207dd1e7 Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.466068 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 09 13:19:31 crc kubenswrapper[4723]: E0309 13:19:31.466774 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43506b49-01ab-4ea9-bcd6-4ce950b7815f" containerName="init" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.466795 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="43506b49-01ab-4ea9-bcd6-4ce950b7815f" containerName="init" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.467027 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="43506b49-01ab-4ea9-bcd6-4ce950b7815f" containerName="init" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.468061 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.472737 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.473024 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-pzvqv" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.473127 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.473209 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.516564 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/80aecab8-a10c-48aa-9cba-a35bd822cc09-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.516692 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/80aecab8-a10c-48aa-9cba-a35bd822cc09-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.516722 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80aecab8-a10c-48aa-9cba-a35bd822cc09-config\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.516771 4723 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7zdp\" (UniqueName: \"kubernetes.io/projected/80aecab8-a10c-48aa-9cba-a35bd822cc09-kube-api-access-s7zdp\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.516805 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80aecab8-a10c-48aa-9cba-a35bd822cc09-scripts\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.516828 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80aecab8-a10c-48aa-9cba-a35bd822cc09-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.516938 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/80aecab8-a10c-48aa-9cba-a35bd822cc09-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.519077 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.551931 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.551971 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.570278 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg872"] Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.596572 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg872"] Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.651406 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/80aecab8-a10c-48aa-9cba-a35bd822cc09-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.651497 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/80aecab8-a10c-48aa-9cba-a35bd822cc09-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.651561 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/80aecab8-a10c-48aa-9cba-a35bd822cc09-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.651582 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/80aecab8-a10c-48aa-9cba-a35bd822cc09-config\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.651614 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7zdp\" (UniqueName: \"kubernetes.io/projected/80aecab8-a10c-48aa-9cba-a35bd822cc09-kube-api-access-s7zdp\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.651646 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80aecab8-a10c-48aa-9cba-a35bd822cc09-scripts\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.651670 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80aecab8-a10c-48aa-9cba-a35bd822cc09-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.654281 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/80aecab8-a10c-48aa-9cba-a35bd822cc09-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.657749 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80aecab8-a10c-48aa-9cba-a35bd822cc09-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.658374 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80aecab8-a10c-48aa-9cba-a35bd822cc09-config\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.659340 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/80aecab8-a10c-48aa-9cba-a35bd822cc09-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.663810 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/80aecab8-a10c-48aa-9cba-a35bd822cc09-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.675117 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80aecab8-a10c-48aa-9cba-a35bd822cc09-scripts\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.683645 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7zdp\" (UniqueName: 
\"kubernetes.io/projected/80aecab8-a10c-48aa-9cba-a35bd822cc09-kube-api-access-s7zdp\") pod \"ovn-northd-0\" (UID: \"80aecab8-a10c-48aa-9cba-a35bd822cc09\") " pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.694915 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.781817 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.830124 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.857974 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-dns-svc\") pod \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\" (UID: \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\") " Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.858232 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nmj8\" (UniqueName: \"kubernetes.io/projected/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-kube-api-access-9nmj8\") pod \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\" (UID: \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\") " Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.858296 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-config\") pod \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\" (UID: \"85ff66cb-dbf2-4501-bb5d-7f1142896dc3\") " Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.864481 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-kube-api-access-9nmj8" (OuterVolumeSpecName: "kube-api-access-9nmj8") pod "85ff66cb-dbf2-4501-bb5d-7f1142896dc3" (UID: "85ff66cb-dbf2-4501-bb5d-7f1142896dc3"). InnerVolumeSpecName "kube-api-access-9nmj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.921091 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.957578 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "85ff66cb-dbf2-4501-bb5d-7f1142896dc3" (UID: "85ff66cb-dbf2-4501-bb5d-7f1142896dc3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.960634 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nmj8\" (UniqueName: \"kubernetes.io/projected/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-kube-api-access-9nmj8\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.960658 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:31 crc kubenswrapper[4723]: I0309 13:19:31.968073 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-config" (OuterVolumeSpecName: "config") pod "85ff66cb-dbf2-4501-bb5d-7f1142896dc3" (UID: "85ff66cb-dbf2-4501-bb5d-7f1142896dc3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.040941 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.063944 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ff66cb-dbf2-4501-bb5d-7f1142896dc3-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.139992 4723 generic.go:334] "Generic (PLEG): container finished" podID="85ff66cb-dbf2-4501-bb5d-7f1142896dc3" containerID="2a236cb50fae8bfc47e5fe3401512cbb2342fcc29e8497254621a36fdde8287f" exitCode=0 Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.140059 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" event={"ID":"85ff66cb-dbf2-4501-bb5d-7f1142896dc3","Type":"ContainerDied","Data":"2a236cb50fae8bfc47e5fe3401512cbb2342fcc29e8497254621a36fdde8287f"} Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.140092 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" event={"ID":"85ff66cb-dbf2-4501-bb5d-7f1142896dc3","Type":"ContainerDied","Data":"7644429a190da8943e24b0bc770e8f08f4a94fbddf495a022389669472679608"} Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.140112 4723 scope.go:117] "RemoveContainer" containerID="2a236cb50fae8bfc47e5fe3401512cbb2342fcc29e8497254621a36fdde8287f" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.140254 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-sf8lq" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.147634 4723 generic.go:334] "Generic (PLEG): container finished" podID="c72a0aaf-0e9e-4327-91d3-bc7c68542eab" containerID="07575b77da38dde101b5863d2a83706ebb79127082bebd05c9e54b540f0cd082" exitCode=0 Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.147747 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.148149 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c72a0aaf-0e9e-4327-91d3-bc7c68542eab","Type":"ContainerDied","Data":"07575b77da38dde101b5863d2a83706ebb79127082bebd05c9e54b540f0cd082"} Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.148213 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c72a0aaf-0e9e-4327-91d3-bc7c68542eab","Type":"ContainerDied","Data":"90917e42d98f28638867d5e7f3af9f6bf01c5f77d8a526999ff64c9454528467"} Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.150331 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vxlcg" event={"ID":"ae48b5ff-caaa-4d4c-81c5-201afe2220ef","Type":"ContainerStarted","Data":"44a285fa382efe0f0e2c6ca3b0703736e1de257fcd5bf097eb6e8e01b7249136"} Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.156411 4723 generic.go:334] "Generic (PLEG): container finished" podID="0d07d123-4437-46d2-b1f8-d4e0b495e0fa" containerID="55a1bea2226982c9fc007254ecd0bafcac3b14d4b114155b0ecf7dc7062af31b" exitCode=0 Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.156454 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f" event={"ID":"0d07d123-4437-46d2-b1f8-d4e0b495e0fa","Type":"ContainerDied","Data":"55a1bea2226982c9fc007254ecd0bafcac3b14d4b114155b0ecf7dc7062af31b"} Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.156736 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f" event={"ID":"0d07d123-4437-46d2-b1f8-d4e0b495e0fa","Type":"ContainerStarted","Data":"821423caf45cfb68ea6aa64738ff5f33f209cac63e167418f4f29100207dd1e7"} Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.161099 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8" event={"ID":"e02a0c7b-7017-43f5-a43e-300bf157cd37","Type":"ContainerStarted","Data":"c2075174a9cd19a29e40fa0c0885414e2511911e74a51b19dd10ec38d93573c6"} Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.162059 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.166128 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-config-out\") pod \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.167168 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") pod \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.167223 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-thanos-prometheus-http-client-file\") pod \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 
13:19:32.167299 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-web-config\") pod \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.167333 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-0\") pod \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.167356 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcj5h\" (UniqueName: \"kubernetes.io/projected/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-kube-api-access-rcj5h\") pod \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.167460 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-tls-assets\") pod \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.167548 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-1\") pod \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.167617 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-2\") pod \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.167648 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-config\") pod \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\" (UID: \"c72a0aaf-0e9e-4327-91d3-bc7c68542eab\") " Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.169281 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "c72a0aaf-0e9e-4327-91d3-bc7c68542eab" (UID: "c72a0aaf-0e9e-4327-91d3-bc7c68542eab"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.172546 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "c72a0aaf-0e9e-4327-91d3-bc7c68542eab" (UID: "c72a0aaf-0e9e-4327-91d3-bc7c68542eab"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.173958 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "c72a0aaf-0e9e-4327-91d3-bc7c68542eab" (UID: "c72a0aaf-0e9e-4327-91d3-bc7c68542eab"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.185169 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c72a0aaf-0e9e-4327-91d3-bc7c68542eab" (UID: "c72a0aaf-0e9e-4327-91d3-bc7c68542eab"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.186323 4723 scope.go:117] "RemoveContainer" containerID="8a1453ffa6d54142b5917e60c23baf06f0691b780e65ba722d03c0df4217fd77" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.186553 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c72a0aaf-0e9e-4327-91d3-bc7c68542eab" (UID: "c72a0aaf-0e9e-4327-91d3-bc7c68542eab"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.186623 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-config" (OuterVolumeSpecName: "config") pod "c72a0aaf-0e9e-4327-91d3-bc7c68542eab" (UID: "c72a0aaf-0e9e-4327-91d3-bc7c68542eab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.200554 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-kube-api-access-rcj5h" (OuterVolumeSpecName: "kube-api-access-rcj5h") pod "c72a0aaf-0e9e-4327-91d3-bc7c68542eab" (UID: "c72a0aaf-0e9e-4327-91d3-bc7c68542eab"). InnerVolumeSpecName "kube-api-access-rcj5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.200650 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-config-out" (OuterVolumeSpecName: "config-out") pod "c72a0aaf-0e9e-4327-91d3-bc7c68542eab" (UID: "c72a0aaf-0e9e-4327-91d3-bc7c68542eab"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.200662 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-web-config" (OuterVolumeSpecName: "web-config") pod "c72a0aaf-0e9e-4327-91d3-bc7c68542eab" (UID: "c72a0aaf-0e9e-4327-91d3-bc7c68542eab"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.234057 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-vxlcg" podStartSLOduration=3.234037902 podStartE2EDuration="3.234037902s" podCreationTimestamp="2026-03-09 13:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:19:32.182400997 +0000 UTC m=+1246.196868547" watchObservedRunningTime="2026-03-09 13:19:32.234037902 +0000 UTC m=+1246.248505442" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.243818 4723 scope.go:117] "RemoveContainer" containerID="2a236cb50fae8bfc47e5fe3401512cbb2342fcc29e8497254621a36fdde8287f" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.249104 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "c72a0aaf-0e9e-4327-91d3-bc7c68542eab" (UID: "c72a0aaf-0e9e-4327-91d3-bc7c68542eab"). InnerVolumeSpecName "pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:19:32 crc kubenswrapper[4723]: E0309 13:19:32.254586 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a236cb50fae8bfc47e5fe3401512cbb2342fcc29e8497254621a36fdde8287f\": container with ID starting with 2a236cb50fae8bfc47e5fe3401512cbb2342fcc29e8497254621a36fdde8287f not found: ID does not exist" containerID="2a236cb50fae8bfc47e5fe3401512cbb2342fcc29e8497254621a36fdde8287f" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.254624 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a236cb50fae8bfc47e5fe3401512cbb2342fcc29e8497254621a36fdde8287f"} err="failed to get container status \"2a236cb50fae8bfc47e5fe3401512cbb2342fcc29e8497254621a36fdde8287f\": rpc error: code = NotFound desc = could not find container \"2a236cb50fae8bfc47e5fe3401512cbb2342fcc29e8497254621a36fdde8287f\": container with ID starting with 2a236cb50fae8bfc47e5fe3401512cbb2342fcc29e8497254621a36fdde8287f not found: ID does not exist" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.254646 4723 scope.go:117] "RemoveContainer" containerID="8a1453ffa6d54142b5917e60c23baf06f0691b780e65ba722d03c0df4217fd77" Mar 09 13:19:32 crc kubenswrapper[4723]: E0309 13:19:32.257013 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a1453ffa6d54142b5917e60c23baf06f0691b780e65ba722d03c0df4217fd77\": container with ID starting with 8a1453ffa6d54142b5917e60c23baf06f0691b780e65ba722d03c0df4217fd77 not found: ID does not exist" containerID="8a1453ffa6d54142b5917e60c23baf06f0691b780e65ba722d03c0df4217fd77" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.257058 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a1453ffa6d54142b5917e60c23baf06f0691b780e65ba722d03c0df4217fd77"} err="failed to get container status \"8a1453ffa6d54142b5917e60c23baf06f0691b780e65ba722d03c0df4217fd77\": rpc error: code = NotFound desc = could not find container \"8a1453ffa6d54142b5917e60c23baf06f0691b780e65ba722d03c0df4217fd77\": container with ID starting with 
8a1453ffa6d54142b5917e60c23baf06f0691b780e65ba722d03c0df4217fd77 not found: ID does not exist" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.257087 4723 scope.go:117] "RemoveContainer" containerID="07575b77da38dde101b5863d2a83706ebb79127082bebd05c9e54b540f0cd082" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.261266 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8" podStartSLOduration=3.261242966 podStartE2EDuration="3.261242966s" podCreationTimestamp="2026-03-09 13:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:19:32.215850697 +0000 UTC m=+1246.230318237" watchObservedRunningTime="2026-03-09 13:19:32.261242966 +0000 UTC m=+1246.275710506" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.270401 4723 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.270425 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcj5h\" (UniqueName: \"kubernetes.io/projected/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-kube-api-access-rcj5h\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.270445 4723 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.270455 4723 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.270465 4723 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.270475 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.270483 4723 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-config-out\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.270505 4723 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") on node \"crc\" " Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.270517 4723 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.270526 4723 reconciler_common.go:293] 
"Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c72a0aaf-0e9e-4327-91d3-bc7c68542eab-web-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.291197 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-sf8lq"] Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.295982 4723 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.296254 4723 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4") on node "crc" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.302280 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-sf8lq"] Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.307114 4723 scope.go:117] "RemoveContainer" containerID="07575b77da38dde101b5863d2a83706ebb79127082bebd05c9e54b540f0cd082" Mar 09 13:19:32 crc kubenswrapper[4723]: E0309 13:19:32.308557 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07575b77da38dde101b5863d2a83706ebb79127082bebd05c9e54b540f0cd082\": container with ID starting with 07575b77da38dde101b5863d2a83706ebb79127082bebd05c9e54b540f0cd082 not found: ID does not exist" containerID="07575b77da38dde101b5863d2a83706ebb79127082bebd05c9e54b540f0cd082" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.308605 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07575b77da38dde101b5863d2a83706ebb79127082bebd05c9e54b540f0cd082"} err="failed to get container status \"07575b77da38dde101b5863d2a83706ebb79127082bebd05c9e54b540f0cd082\": rpc error: code = NotFound desc = could not find container \"07575b77da38dde101b5863d2a83706ebb79127082bebd05c9e54b540f0cd082\": container with ID starting with 07575b77da38dde101b5863d2a83706ebb79127082bebd05c9e54b540f0cd082 not found: ID does not exist" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.372222 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.374877 4723 reconciler_common.go:293] "Volume detached for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.384177 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 13:19:32 crc kubenswrapper[4723]: W0309 13:19:32.401325 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80aecab8_a10c_48aa_9cba_a35bd822cc09.slice/crio-7fd513c9aea3f28320beb5aae171ed342c2c66a77be95ffd0baab0ba2a0fd8f5 WatchSource:0}: Error finding container 7fd513c9aea3f28320beb5aae171ed342c2c66a77be95ffd0baab0ba2a0fd8f5: Status 404 returned error can't find the container with id 7fd513c9aea3f28320beb5aae171ed342c2c66a77be95ffd0baab0ba2a0fd8f5 Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.511599 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/prometheus-metric-storage-0"] Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.518699 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.534810 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 13:19:32 crc kubenswrapper[4723]: E0309 13:19:32.535425 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ff66cb-dbf2-4501-bb5d-7f1142896dc3" containerName="dnsmasq-dns" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.535455 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ff66cb-dbf2-4501-bb5d-7f1142896dc3" containerName="dnsmasq-dns" Mar 09 13:19:32 crc kubenswrapper[4723]: E0309 13:19:32.535471 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ff66cb-dbf2-4501-bb5d-7f1142896dc3" containerName="init" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.535477 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ff66cb-dbf2-4501-bb5d-7f1142896dc3" containerName="init" Mar 09 13:19:32 crc kubenswrapper[4723]: E0309 13:19:32.535487 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72a0aaf-0e9e-4327-91d3-bc7c68542eab" containerName="init-config-reloader" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.535494 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72a0aaf-0e9e-4327-91d3-bc7c68542eab" containerName="init-config-reloader" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.535679 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72a0aaf-0e9e-4327-91d3-bc7c68542eab" containerName="init-config-reloader" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.535692 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="85ff66cb-dbf2-4501-bb5d-7f1142896dc3" containerName="dnsmasq-dns" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.537431 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.546678 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.546688 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.547579 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.547997 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.548355 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.548989 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.548983 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-bkfhs" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.550529 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.553395 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.681570 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4a548a9c-33c8-4a35-a559-7290357170c1-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.681645 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwx7t\" (UniqueName: \"kubernetes.io/projected/4a548a9c-33c8-4a35-a559-7290357170c1-kube-api-access-lwx7t\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.681774 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-config\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.681943 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.682016 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.682302 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4a548a9c-33c8-4a35-a559-7290357170c1-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.682339 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.682386 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.682481 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.682538 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.784729 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4a548a9c-33c8-4a35-a559-7290357170c1-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.784788 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.784818 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-2\") pod 
\"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.784849 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.784893 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.784974 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4a548a9c-33c8-4a35-a559-7290357170c1-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.785011 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwx7t\" (UniqueName: \"kubernetes.io/projected/4a548a9c-33c8-4a35-a559-7290357170c1-kube-api-access-lwx7t\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.785040 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-config\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.785079 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.785110 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.785936 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.785945 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.786686 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.790594 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.790659 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.790908 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4a548a9c-33c8-4a35-a559-7290357170c1-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.790769 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-config\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.793570 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.793606 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7a5bc2ca863004c00c102bc266d43328c7878cd25689db409330e8972fedad87/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.803651 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwx7t\" (UniqueName: \"kubernetes.io/projected/4a548a9c-33c8-4a35-a559-7290357170c1-kube-api-access-lwx7t\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.810192 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4a548a9c-33c8-4a35-a559-7290357170c1-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.835254 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") pod \"prometheus-metric-storage-0\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.861014 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.925657 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43506b49-01ab-4ea9-bcd6-4ce950b7815f" path="/var/lib/kubelet/pods/43506b49-01ab-4ea9-bcd6-4ce950b7815f/volumes" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.926528 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85ff66cb-dbf2-4501-bb5d-7f1142896dc3" path="/var/lib/kubelet/pods/85ff66cb-dbf2-4501-bb5d-7f1142896dc3/volumes" Mar 09 13:19:32 crc kubenswrapper[4723]: I0309 13:19:32.927462 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72a0aaf-0e9e-4327-91d3-bc7c68542eab" path="/var/lib/kubelet/pods/c72a0aaf-0e9e-4327-91d3-bc7c68542eab/volumes" Mar 09 13:19:33 crc kubenswrapper[4723]: I0309 13:19:33.177004 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"80aecab8-a10c-48aa-9cba-a35bd822cc09","Type":"ContainerStarted","Data":"7fd513c9aea3f28320beb5aae171ed342c2c66a77be95ffd0baab0ba2a0fd8f5"} Mar 09 13:19:33 crc kubenswrapper[4723]: I0309 13:19:33.196871 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f" event={"ID":"0d07d123-4437-46d2-b1f8-d4e0b495e0fa","Type":"ContainerStarted","Data":"5f6698a68a7be5d219a30d632317697d1134c01481f7f107b78b760c1757948f"} Mar 09 13:19:33 crc kubenswrapper[4723]: I0309 13:19:33.197928 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f" Mar 09 13:19:33 crc kubenswrapper[4723]: I0309 13:19:33.232904 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f" podStartSLOduration=4.23288836 podStartE2EDuration="4.23288836s" podCreationTimestamp="2026-03-09 13:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:19:33.229437748 +0000 UTC m=+1247.243905288" watchObservedRunningTime="2026-03-09 13:19:33.23288836 +0000 UTC m=+1247.247355890" Mar 09 13:19:33 crc kubenswrapper[4723]: I0309 13:19:33.599068 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.210980 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a548a9c-33c8-4a35-a559-7290357170c1","Type":"ContainerStarted","Data":"8b1742da4b1a6feb25bafe009951e7d4d7966a9d5403f08e561f67cea93ae41d"} Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.213715 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"80aecab8-a10c-48aa-9cba-a35bd822cc09","Type":"ContainerStarted","Data":"600310a00c11a72596656c8e4a6e12ff7896f1eb29f927116eb6901ffdca24c5"} Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.213745 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"80aecab8-a10c-48aa-9cba-a35bd822cc09","Type":"ContainerStarted","Data":"d14acc304dba7040f6acc76547a21a1ff5c4f77f8b008e7d85ecada103896546"} Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.214614 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.237445 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-northd-0" podStartSLOduration=1.867105008 podStartE2EDuration="3.237429888s" podCreationTimestamp="2026-03-09 13:19:31 +0000 UTC" firstStartedPulling="2026-03-09 13:19:32.412250797 +0000 UTC m=+1246.426718337" lastFinishedPulling="2026-03-09 13:19:33.782575677 +0000 UTC m=+1247.797043217" observedRunningTime="2026-03-09 13:19:34.233285938 +0000 UTC m=+1248.247753478" watchObservedRunningTime="2026-03-09 13:19:34.237429888 +0000 UTC m=+1248.251897428" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.529097 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-2k7l8"] Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.569595 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-2kn5v"] Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.571257 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.584396 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2kn5v"] Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.591339 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.627231 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2kn5v\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.627396 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-dns-svc\") pod \"dnsmasq-dns-698758b865-2kn5v\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.627474 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-config\") pod \"dnsmasq-dns-698758b865-2kn5v\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.627510 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmcl7\" (UniqueName: \"kubernetes.io/projected/33ef3c2f-07b0-431c-96b0-b588061d9ce9-kube-api-access-qmcl7\") pod \"dnsmasq-dns-698758b865-2kn5v\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.627531 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2kn5v\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.745135 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-dns-svc\") pod \"dnsmasq-dns-698758b865-2kn5v\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.745217 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-config\") pod \"dnsmasq-dns-698758b865-2kn5v\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.745257 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmcl7\" (UniqueName: \"kubernetes.io/projected/33ef3c2f-07b0-431c-96b0-b588061d9ce9-kube-api-access-qmcl7\") pod \"dnsmasq-dns-698758b865-2kn5v\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.745273 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2kn5v\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.745301 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2kn5v\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.745996 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2kn5v\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.746036 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-dns-svc\") pod \"dnsmasq-dns-698758b865-2kn5v\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.746493 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-config\") pod \"dnsmasq-dns-698758b865-2kn5v\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.746942 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2kn5v\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.788343 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmcl7\" (UniqueName: \"kubernetes.io/projected/33ef3c2f-07b0-431c-96b0-b588061d9ce9-kube-api-access-qmcl7\") pod \"dnsmasq-dns-698758b865-2kn5v\" 
(UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:34 crc kubenswrapper[4723]: I0309 13:19:34.899626 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.227050 4723 generic.go:334] "Generic (PLEG): container finished" podID="5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17" containerID="48d1a939153780cd96b37f85bf410269a1272fb3d3d155e6ce069c4ad4d3cd95" exitCode=0 Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.227186 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17","Type":"ContainerDied","Data":"48d1a939153780cd96b37f85bf410269a1272fb3d3d155e6ce069c4ad4d3cd95"} Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.227614 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8" podUID="e02a0c7b-7017-43f5-a43e-300bf157cd37" containerName="dnsmasq-dns" containerID="cri-o://c2075174a9cd19a29e40fa0c0885414e2511911e74a51b19dd10ec38d93573c6" gracePeriod=10 Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.418463 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2kn5v"] Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.753301 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.821767 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.831434 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.831676 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.832220 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.832514 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-96kmk" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.843052 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.885020 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d924133b-d3c9-4b71-bbf4-a894a618e6c4-lock\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.885293 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.885373 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d924133b-d3c9-4b71-bbf4-a894a618e6c4-cache\") pod \"swift-storage-0\" (UID: 
\"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.885500 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwqgw\" (UniqueName: \"kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-kube-api-access-vwqgw\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.885577 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-57728be0-a2a0-459b-b63a-e8d35913dbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57728be0-a2a0-459b-b63a-e8d35913dbf9\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.885655 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d924133b-d3c9-4b71-bbf4-a894a618e6c4-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.987686 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwqgw\" (UniqueName: \"kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-kube-api-access-vwqgw\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.988040 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-57728be0-a2a0-459b-b63a-e8d35913dbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57728be0-a2a0-459b-b63a-e8d35913dbf9\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.988183 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d924133b-d3c9-4b71-bbf4-a894a618e6c4-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.988588 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d924133b-d3c9-4b71-bbf4-a894a618e6c4-lock\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.988690 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.988749 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d924133b-d3c9-4b71-bbf4-a894a618e6c4-cache\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: E0309 13:19:35.989082 4723 
projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 09 13:19:35 crc kubenswrapper[4723]: E0309 13:19:35.989113 4723 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 09 13:19:35 crc kubenswrapper[4723]: E0309 13:19:35.989150 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift podName:d924133b-d3c9-4b71-bbf4-a894a618e6c4 nodeName:}" failed. No retries permitted until 2026-03-09 13:19:36.489136343 +0000 UTC m=+1250.503603883 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift") pod "swift-storage-0" (UID: "d924133b-d3c9-4b71-bbf4-a894a618e6c4") : configmap "swift-ring-files" not found Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.989632 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d924133b-d3c9-4b71-bbf4-a894a618e6c4-cache\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.989767 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d924133b-d3c9-4b71-bbf4-a894a618e6c4-lock\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.990643 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
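The etc-swift failure just above is the first of a series. The projected volume for swift-storage-0 references the swift-ring-files ConfigMap, which does not exist yet (presumably it is published only after the swift-ring-rebalance-v4zmc job, added a few seconds further down, has built the rings), so every MountVolume.SetUp attempt fails and nestedpendingoperations.go re-queues it with a doubled delay: the durationBeforeRetry values in this log run 500ms, 1s, 2s, 4s, 8s. A sketch of that pacing, assuming plain doubling from 500ms with an illustrative cap (the constants are not taken from kubelet):

    // Reproduces the retry spacing of the repeated "No retries permitted
    // until ..." errors for the etc-swift volume in this log.
    package main

    import (
        "fmt"
        "time"
    )

    const (
        initialBackoff = 500 * time.Millisecond // first durationBeforeRetry seen in the log
        maxBackoff     = 2 * time.Minute        // illustrative cap, not kubelet's constant
    )

    // nextBackoff doubles the previous delay, starting at initialBackoff.
    func nextBackoff(prev time.Duration) time.Duration {
        if prev == 0 {
            return initialBackoff
        }
        if next := 2 * prev; next < maxBackoff {
            return next
        }
        return maxBackoff
    }

    func main() {
        var d time.Duration
        for attempt := 1; attempt <= 5; attempt++ {
            d = nextBackoff(d)
            // Prints 500ms, 1s, 2s, 4s, 8s -- matching the log's retry series.
            fmt.Printf("attempt %d failed: no retries permitted for %v\n", attempt, d)
        }
    }

Once the ConfigMap is created, the next scheduled retry can succeed and the pod proceeds to start.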
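A few entries below, the startup record for openstack-galera-0 reports podStartSLOduration=-9223371988.589725 alongside lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC". The magnitude points at Go's zero time.Time: time.Time.Sub is documented to saturate at math.MinInt64 nanoseconds when the result cannot fit in a Duration, and subsequent Duration arithmetic wraps around int64. A plausible reconstruction, using an assumed formula for illustration rather than pod_startup_latency_tracker.go's actual code:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        created := time.Date(2026, 3, 9, 13, 18, 48, 0, time.UTC)
        observed := created.Add(48265050640) // plus the logged 48.26505064s E2E duration
        var lastFinishedPulling time.Time    // zero value: 0001-01-01 00:00:00 +0000 UTC

        e2e := observed.Sub(created)              // 48.26505064s
        pull := lastFinishedPulling.Sub(observed) // underflows, saturates at math.MinInt64 ns
        slo := e2e - pull                         // int64 wrap-around past MaxInt64

        fmt.Println(slo.Seconds()) // about -9.223371988589725e+09, the value in the log
    }

The wrapped figure matches the logged value, so the negative SLO duration is an arithmetic artifact of a never-set pull timestamp, not a real measurement.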
Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.990667 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-57728be0-a2a0-459b-b63a-e8d35913dbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57728be0-a2a0-459b-b63a-e8d35913dbf9\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a883c1485dd591e2d53834fb8cf8183d0b307fcf8b93da767526b9aa5b47b73e/globalmount\"" pod="openstack/swift-storage-0" Mar 09 13:19:35 crc kubenswrapper[4723]: I0309 13:19:35.993212 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d924133b-d3c9-4b71-bbf4-a894a618e6c4-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.094580 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwqgw\" (UniqueName: \"kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-kube-api-access-vwqgw\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.126976 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-57728be0-a2a0-459b-b63a-e8d35913dbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57728be0-a2a0-459b-b63a-e8d35913dbf9\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.237667 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17","Type":"ContainerStarted","Data":"ab2fbb8f1c9652233a9ef65bfa17a386d216ea38a67831da48d799997c4ddebd"} Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.240166 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2kn5v" event={"ID":"33ef3c2f-07b0-431c-96b0-b588061d9ce9","Type":"ContainerStarted","Data":"832c6ccd40dfa821f0bfd7b7e072c78b48a07a5ffe3103a4644b17ae7e111d12"} Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.242365 4723 generic.go:334] "Generic (PLEG): container finished" podID="e02a0c7b-7017-43f5-a43e-300bf157cd37" containerID="c2075174a9cd19a29e40fa0c0885414e2511911e74a51b19dd10ec38d93573c6" exitCode=0 Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.242418 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8" event={"ID":"e02a0c7b-7017-43f5-a43e-300bf157cd37","Type":"ContainerDied","Data":"c2075174a9cd19a29e40fa0c0885414e2511911e74a51b19dd10ec38d93573c6"} Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.242586 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8" event={"ID":"e02a0c7b-7017-43f5-a43e-300bf157cd37","Type":"ContainerDied","Data":"695ab2da5298877bcb0c9544c311972b271d10812fe6ec617579b9638976dc9b"} Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.242614 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="695ab2da5298877bcb0c9544c311972b271d10812fe6ec617579b9638976dc9b" Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.265068 4723 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371988.589725 podStartE2EDuration="48.26505064s" podCreationTimestamp="2026-03-09 13:18:48 +0000 UTC" firstStartedPulling="2026-03-09 13:18:51.684691891 +0000 UTC m=+1205.699159431" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:19:36.256627516 +0000 UTC m=+1250.271095056" watchObservedRunningTime="2026-03-09 13:19:36.26505064 +0000 UTC m=+1250.279518180" Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.494876 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8" Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.498261 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:36 crc kubenswrapper[4723]: E0309 13:19:36.498574 4723 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 09 13:19:36 crc kubenswrapper[4723]: E0309 13:19:36.498594 4723 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 09 13:19:36 crc kubenswrapper[4723]: E0309 13:19:36.498635 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift podName:d924133b-d3c9-4b71-bbf4-a894a618e6c4 nodeName:}" failed. No retries permitted until 2026-03-09 13:19:37.49862 +0000 UTC m=+1251.513087540 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift") pod "swift-storage-0" (UID: "d924133b-d3c9-4b71-bbf4-a894a618e6c4") : configmap "swift-ring-files" not found Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.599268 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-ovsdbserver-sb\") pod \"e02a0c7b-7017-43f5-a43e-300bf157cd37\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.599331 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-config\") pod \"e02a0c7b-7017-43f5-a43e-300bf157cd37\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.599474 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-dns-svc\") pod \"e02a0c7b-7017-43f5-a43e-300bf157cd37\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.599561 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xzkh\" (UniqueName: \"kubernetes.io/projected/e02a0c7b-7017-43f5-a43e-300bf157cd37-kube-api-access-4xzkh\") pod \"e02a0c7b-7017-43f5-a43e-300bf157cd37\" (UID: \"e02a0c7b-7017-43f5-a43e-300bf157cd37\") " Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.760815 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e02a0c7b-7017-43f5-a43e-300bf157cd37" (UID: "e02a0c7b-7017-43f5-a43e-300bf157cd37"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.794061 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e02a0c7b-7017-43f5-a43e-300bf157cd37-kube-api-access-4xzkh" (OuterVolumeSpecName: "kube-api-access-4xzkh") pod "e02a0c7b-7017-43f5-a43e-300bf157cd37" (UID: "e02a0c7b-7017-43f5-a43e-300bf157cd37"). InnerVolumeSpecName "kube-api-access-4xzkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.812536 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xzkh\" (UniqueName: \"kubernetes.io/projected/e02a0c7b-7017-43f5-a43e-300bf157cd37-kube-api-access-4xzkh\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.812588 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.848988 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e02a0c7b-7017-43f5-a43e-300bf157cd37" (UID: "e02a0c7b-7017-43f5-a43e-300bf157cd37"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.860095 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-config" (OuterVolumeSpecName: "config") pod "e02a0c7b-7017-43f5-a43e-300bf157cd37" (UID: "e02a0c7b-7017-43f5-a43e-300bf157cd37"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.914528 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:36 crc kubenswrapper[4723]: I0309 13:19:36.914572 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e02a0c7b-7017-43f5-a43e-300bf157cd37-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:37 crc kubenswrapper[4723]: I0309 13:19:37.251564 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-2k7l8" Mar 09 13:19:37 crc kubenswrapper[4723]: I0309 13:19:37.272931 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-2k7l8"] Mar 09 13:19:37 crc kubenswrapper[4723]: I0309 13:19:37.280673 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-2k7l8"] Mar 09 13:19:37 crc kubenswrapper[4723]: I0309 13:19:37.529090 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:37 crc kubenswrapper[4723]: E0309 13:19:37.529301 4723 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 09 13:19:37 crc kubenswrapper[4723]: E0309 13:19:37.529321 4723 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 09 13:19:37 crc kubenswrapper[4723]: E0309 13:19:37.529376 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift podName:d924133b-d3c9-4b71-bbf4-a894a618e6c4 nodeName:}" failed. No retries permitted until 2026-03-09 13:19:39.529357487 +0000 UTC m=+1253.543825027 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift") pod "swift-storage-0" (UID: "d924133b-d3c9-4b71-bbf4-a894a618e6c4") : configmap "swift-ring-files" not found Mar 09 13:19:38 crc kubenswrapper[4723]: I0309 13:19:38.895044 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e02a0c7b-7017-43f5-a43e-300bf157cd37" path="/var/lib/kubelet/pods/e02a0c7b-7017-43f5-a43e-300bf157cd37/volumes" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.571940 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:19:39 crc kubenswrapper[4723]: E0309 13:19:39.572146 4723 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 09 13:19:39 crc kubenswrapper[4723]: E0309 13:19:39.572172 4723 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 09 13:19:39 crc kubenswrapper[4723]: E0309 13:19:39.572232 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift podName:d924133b-d3c9-4b71-bbf4-a894a618e6c4 nodeName:}" failed. No retries permitted until 2026-03-09 13:19:43.572213564 +0000 UTC m=+1257.586681104 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift") pod "swift-storage-0" (UID: "d924133b-d3c9-4b71-bbf4-a894a618e6c4") : configmap "swift-ring-files" not found Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.739316 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-v4zmc"] Mar 09 13:19:39 crc kubenswrapper[4723]: E0309 13:19:39.739982 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02a0c7b-7017-43f5-a43e-300bf157cd37" containerName="dnsmasq-dns" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.740003 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02a0c7b-7017-43f5-a43e-300bf157cd37" containerName="dnsmasq-dns" Mar 09 13:19:39 crc kubenswrapper[4723]: E0309 13:19:39.740026 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02a0c7b-7017-43f5-a43e-300bf157cd37" containerName="init" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.740034 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02a0c7b-7017-43f5-a43e-300bf157cd37" containerName="init" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.740300 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="e02a0c7b-7017-43f5-a43e-300bf157cd37" containerName="dnsmasq-dns" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.741123 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.743053 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.743323 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.743472 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.758829 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-v4zmc"] Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.776490 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/10b17a94-81eb-4e72-bd49-97f590e26aec-etc-swift\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.776690 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzwcr\" (UniqueName: \"kubernetes.io/projected/10b17a94-81eb-4e72-bd49-97f590e26aec-kube-api-access-kzwcr\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.776747 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/10b17a94-81eb-4e72-bd49-97f590e26aec-ring-data-devices\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.776821 4723 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-swiftconf\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.776877 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10b17a94-81eb-4e72-bd49-97f590e26aec-scripts\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.776907 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-combined-ca-bundle\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.776946 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-dispersionconf\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.878270 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzwcr\" (UniqueName: \"kubernetes.io/projected/10b17a94-81eb-4e72-bd49-97f590e26aec-kube-api-access-kzwcr\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.878580 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/10b17a94-81eb-4e72-bd49-97f590e26aec-ring-data-devices\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.878714 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-swiftconf\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.879559 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/10b17a94-81eb-4e72-bd49-97f590e26aec-ring-data-devices\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.879561 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10b17a94-81eb-4e72-bd49-97f590e26aec-scripts\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.879087 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/10b17a94-81eb-4e72-bd49-97f590e26aec-scripts\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.879822 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-combined-ca-bundle\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.879980 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-dispersionconf\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.880112 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/10b17a94-81eb-4e72-bd49-97f590e26aec-etc-swift\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.880583 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/10b17a94-81eb-4e72-bd49-97f590e26aec-etc-swift\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.884830 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-combined-ca-bundle\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.885169 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-dispersionconf\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.892084 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-swiftconf\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:39 crc kubenswrapper[4723]: I0309 13:19:39.901789 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzwcr\" (UniqueName: \"kubernetes.io/projected/10b17a94-81eb-4e72-bd49-97f590e26aec-kube-api-access-kzwcr\") pod \"swift-ring-rebalance-v4zmc\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.065818 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-v4zmc"
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.147874 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.147929 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.271216 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ntm9d"]
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.272767 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ntm9d"
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.275381 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.286330 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ntm9d"]
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.287767 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a548a9c-33c8-4a35-a559-7290357170c1","Type":"ContainerStarted","Data":"9ec89d749e487ccf881f60ee3e3329f37cb8dee40b34411d0ffb3b10fd496acf"}
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.300964 4723 generic.go:334] "Generic (PLEG): container finished" podID="33ef3c2f-07b0-431c-96b0-b588061d9ce9" containerID="8356b72a18db57f66396db270ee1ac88e50cfbd0bd7e036a7db66fa6b79fd270" exitCode=0
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.301020 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2kn5v" event={"ID":"33ef3c2f-07b0-431c-96b0-b588061d9ce9","Type":"ContainerDied","Data":"8356b72a18db57f66396db270ee1ac88e50cfbd0bd7e036a7db66fa6b79fd270"}
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.338090 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.394925 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af2747f-855e-4f65-aebb-40cfb961b775-operator-scripts\") pod \"root-account-create-update-ntm9d\" (UID: \"3af2747f-855e-4f65-aebb-40cfb961b775\") " pod="openstack/root-account-create-update-ntm9d"
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.395658 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tkmt\" (UniqueName: \"kubernetes.io/projected/3af2747f-855e-4f65-aebb-40cfb961b775-kube-api-access-8tkmt\") pod \"root-account-create-update-ntm9d\" (UID: \"3af2747f-855e-4f65-aebb-40cfb961b775\") " pod="openstack/root-account-create-update-ntm9d"
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.499076 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af2747f-855e-4f65-aebb-40cfb961b775-operator-scripts\") pod \"root-account-create-update-ntm9d\" (UID: \"3af2747f-855e-4f65-aebb-40cfb961b775\") " pod="openstack/root-account-create-update-ntm9d"
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.500308 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tkmt\" (UniqueName: \"kubernetes.io/projected/3af2747f-855e-4f65-aebb-40cfb961b775-kube-api-access-8tkmt\") pod \"root-account-create-update-ntm9d\" (UID: \"3af2747f-855e-4f65-aebb-40cfb961b775\") " pod="openstack/root-account-create-update-ntm9d"
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.501029 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af2747f-855e-4f65-aebb-40cfb961b775-operator-scripts\") pod \"root-account-create-update-ntm9d\" (UID: \"3af2747f-855e-4f65-aebb-40cfb961b775\") " pod="openstack/root-account-create-update-ntm9d"
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.520808 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tkmt\" (UniqueName: \"kubernetes.io/projected/3af2747f-855e-4f65-aebb-40cfb961b775-kube-api-access-8tkmt\") pod \"root-account-create-update-ntm9d\" (UID: \"3af2747f-855e-4f65-aebb-40cfb961b775\") " pod="openstack/root-account-create-update-ntm9d"
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.610794 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ntm9d"
Mar 09 13:19:40 crc kubenswrapper[4723]: I0309 13:19:40.675538 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-v4zmc"]
Mar 09 13:19:41 crc kubenswrapper[4723]: I0309 13:19:41.066773 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ntm9d"]
Mar 09 13:19:41 crc kubenswrapper[4723]: I0309 13:19:41.323496 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ntm9d" event={"ID":"3af2747f-855e-4f65-aebb-40cfb961b775","Type":"ContainerStarted","Data":"da6b2137bd9781c109f3e16b89ef5f82274b5c5c675a63a1ce457a03997ee6e0"}
Mar 09 13:19:41 crc kubenswrapper[4723]: I0309 13:19:41.323575 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ntm9d" event={"ID":"3af2747f-855e-4f65-aebb-40cfb961b775","Type":"ContainerStarted","Data":"44f61da5d9f2147078a7e7c5d69cbe24e7c5e82e1051c422011e556a3150a206"}
Mar 09 13:19:41 crc kubenswrapper[4723]: I0309 13:19:41.327693 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2kn5v" event={"ID":"33ef3c2f-07b0-431c-96b0-b588061d9ce9","Type":"ContainerStarted","Data":"d92d0d56a25623f0f68ce1e22eaaea9b59c6682ffe250280b7f5ab6820b0d811"}
Mar 09 13:19:41 crc kubenswrapper[4723]: I0309 13:19:41.327840 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-2kn5v"
Mar 09 13:19:41 crc kubenswrapper[4723]: I0309 13:19:41.330635 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-v4zmc" event={"ID":"10b17a94-81eb-4e72-bd49-97f590e26aec","Type":"ContainerStarted","Data":"fad36a4a20181ba90ae025c17c1383d367688fb1c601d8220e18257773c28513"}
Mar 09 13:19:41 crc kubenswrapper[4723]: I0309 13:19:41.348837 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-ntm9d" podStartSLOduration=1.348816002 podStartE2EDuration="1.348816002s" podCreationTimestamp="2026-03-09 13:19:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:19:41.34349739 +0000 UTC m=+1255.357964930" watchObservedRunningTime="2026-03-09 13:19:41.348816002 +0000 UTC m=+1255.363283552"
Mar 09 13:19:41 crc kubenswrapper[4723]: I0309 13:19:41.370589 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-2kn5v" podStartSLOduration=7.370568721 podStartE2EDuration="7.370568721s" podCreationTimestamp="2026-03-09 13:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:19:41.361403667 +0000 UTC m=+1255.375871217" watchObservedRunningTime="2026-03-09 13:19:41.370568721 +0000 UTC m=+1255.385036261"
Mar 09 13:19:42 crc kubenswrapper[4723]: I0309 13:19:42.353220 4723 generic.go:334] "Generic (PLEG): container finished" podID="3af2747f-855e-4f65-aebb-40cfb961b775" containerID="da6b2137bd9781c109f3e16b89ef5f82274b5c5c675a63a1ce457a03997ee6e0" exitCode=0
Mar 09 13:19:42 crc kubenswrapper[4723]: I0309 13:19:42.353286 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ntm9d" event={"ID":"3af2747f-855e-4f65-aebb-40cfb961b775","Type":"ContainerDied","Data":"da6b2137bd9781c109f3e16b89ef5f82274b5c5c675a63a1ce457a03997ee6e0"}
Mar 09 13:19:43 crc kubenswrapper[4723]: I0309 13:19:43.227890 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 09 13:19:43 crc kubenswrapper[4723]: I0309 13:19:43.312225 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 09 13:19:43 crc kubenswrapper[4723]: I0309 13:19:43.493600 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d2a6-account-create-update-gnb49"]
Mar 09 13:19:43 crc kubenswrapper[4723]: I0309 13:19:43.531183 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d2a6-account-create-update-gnb49"]
Mar 09 13:19:43 crc kubenswrapper[4723]: I0309 13:19:43.532054 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d2a6-account-create-update-gnb49"
Mar 09 13:19:43 crc kubenswrapper[4723]: I0309 13:19:43.537072 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 09 13:19:43 crc kubenswrapper[4723]: I0309 13:19:43.596304 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0"
Mar 09 13:19:43 crc kubenswrapper[4723]: E0309 13:19:43.596463 4723 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 13:19:43 crc kubenswrapper[4723]: E0309 13:19:43.596483 4723 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 09 13:19:43 crc kubenswrapper[4723]: E0309 13:19:43.596530 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift podName:d924133b-d3c9-4b71-bbf4-a894a618e6c4 nodeName:}" failed. No retries permitted until 2026-03-09 13:19:51.596514824 +0000 UTC m=+1265.610982364 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift") pod "swift-storage-0" (UID: "d924133b-d3c9-4b71-bbf4-a894a618e6c4") : configmap "swift-ring-files" not found
Mar 09 13:19:43 crc kubenswrapper[4723]: I0309 13:19:43.701101 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0958c374-1159-4924-bff6-d627956944c9-operator-scripts\") pod \"placement-d2a6-account-create-update-gnb49\" (UID: \"0958c374-1159-4924-bff6-d627956944c9\") " pod="openstack/placement-d2a6-account-create-update-gnb49"
Mar 09 13:19:43 crc kubenswrapper[4723]: I0309 13:19:43.701679 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plvv9\" (UniqueName: \"kubernetes.io/projected/0958c374-1159-4924-bff6-d627956944c9-kube-api-access-plvv9\") pod \"placement-d2a6-account-create-update-gnb49\" (UID: \"0958c374-1159-4924-bff6-d627956944c9\") " pod="openstack/placement-d2a6-account-create-update-gnb49"
Mar 09 13:19:43 crc kubenswrapper[4723]: I0309 13:19:43.804653 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0958c374-1159-4924-bff6-d627956944c9-operator-scripts\") pod \"placement-d2a6-account-create-update-gnb49\" (UID: \"0958c374-1159-4924-bff6-d627956944c9\") " pod="openstack/placement-d2a6-account-create-update-gnb49"
Mar 09 13:19:43 crc kubenswrapper[4723]: I0309 13:19:43.804766 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plvv9\" (UniqueName: \"kubernetes.io/projected/0958c374-1159-4924-bff6-d627956944c9-kube-api-access-plvv9\") pod \"placement-d2a6-account-create-update-gnb49\" (UID: \"0958c374-1159-4924-bff6-d627956944c9\") " pod="openstack/placement-d2a6-account-create-update-gnb49"
Mar 09 13:19:43 crc kubenswrapper[4723]: I0309 13:19:43.806900 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0958c374-1159-4924-bff6-d627956944c9-operator-scripts\") pod \"placement-d2a6-account-create-update-gnb49\" (UID: \"0958c374-1159-4924-bff6-d627956944c9\") " pod="openstack/placement-d2a6-account-create-update-gnb49"
Mar 09 13:19:43 crc kubenswrapper[4723]: I0309 13:19:43.829219 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plvv9\" (UniqueName: \"kubernetes.io/projected/0958c374-1159-4924-bff6-d627956944c9-kube-api-access-plvv9\") pod \"placement-d2a6-account-create-update-gnb49\" (UID: \"0958c374-1159-4924-bff6-d627956944c9\") " pod="openstack/placement-d2a6-account-create-update-gnb49"
Mar 09 13:19:43 crc kubenswrapper[4723]: I0309 13:19:43.870610 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d2a6-account-create-update-gnb49"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.635039 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-g7xtr"]
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.636234 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ntm9d"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.636772 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-g7xtr"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.654342 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-g7xtr"]
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.728703 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tkmt\" (UniqueName: \"kubernetes.io/projected/3af2747f-855e-4f65-aebb-40cfb961b775-kube-api-access-8tkmt\") pod \"3af2747f-855e-4f65-aebb-40cfb961b775\" (UID: \"3af2747f-855e-4f65-aebb-40cfb961b775\") "
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.728773 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af2747f-855e-4f65-aebb-40cfb961b775-operator-scripts\") pod \"3af2747f-855e-4f65-aebb-40cfb961b775\" (UID: \"3af2747f-855e-4f65-aebb-40cfb961b775\") "
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.729172 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd854222-9c09-403c-95c5-37763f40aac3-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-g7xtr\" (UID: \"cd854222-9c09-403c-95c5-37763f40aac3\") " pod="openstack/mysqld-exporter-openstack-db-create-g7xtr"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.729218 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrdx2\" (UniqueName: \"kubernetes.io/projected/cd854222-9c09-403c-95c5-37763f40aac3-kube-api-access-mrdx2\") pod \"mysqld-exporter-openstack-db-create-g7xtr\" (UID: \"cd854222-9c09-403c-95c5-37763f40aac3\") " pod="openstack/mysqld-exporter-openstack-db-create-g7xtr"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.729720 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3af2747f-855e-4f65-aebb-40cfb961b775-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3af2747f-855e-4f65-aebb-40cfb961b775" (UID: "3af2747f-855e-4f65-aebb-40cfb961b775"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.729952 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af2747f-855e-4f65-aebb-40cfb961b775-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.741402 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af2747f-855e-4f65-aebb-40cfb961b775-kube-api-access-8tkmt" (OuterVolumeSpecName: "kube-api-access-8tkmt") pod "3af2747f-855e-4f65-aebb-40cfb961b775" (UID: "3af2747f-855e-4f65-aebb-40cfb961b775"). InnerVolumeSpecName "kube-api-access-8tkmt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.745348 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"]
Mar 09 13:19:44 crc kubenswrapper[4723]: E0309 13:19:44.746114 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af2747f-855e-4f65-aebb-40cfb961b775" containerName="mariadb-account-create-update"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.746135 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af2747f-855e-4f65-aebb-40cfb961b775" containerName="mariadb-account-create-update"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.746372 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af2747f-855e-4f65-aebb-40cfb961b775" containerName="mariadb-account-create-update"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.747152 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.749467 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.771823 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"]
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.832402 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd854222-9c09-403c-95c5-37763f40aac3-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-g7xtr\" (UID: \"cd854222-9c09-403c-95c5-37763f40aac3\") " pod="openstack/mysqld-exporter-openstack-db-create-g7xtr"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.833098 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrdx2\" (UniqueName: \"kubernetes.io/projected/cd854222-9c09-403c-95c5-37763f40aac3-kube-api-access-mrdx2\") pod \"mysqld-exporter-openstack-db-create-g7xtr\" (UID: \"cd854222-9c09-403c-95c5-37763f40aac3\") " pod="openstack/mysqld-exporter-openstack-db-create-g7xtr"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.833907 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tkmt\" (UniqueName: \"kubernetes.io/projected/3af2747f-855e-4f65-aebb-40cfb961b775-kube-api-access-8tkmt\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.833109 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd854222-9c09-403c-95c5-37763f40aac3-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-g7xtr\" (UID: \"cd854222-9c09-403c-95c5-37763f40aac3\") " pod="openstack/mysqld-exporter-openstack-db-create-g7xtr"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.851284 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrdx2\" (UniqueName: \"kubernetes.io/projected/cd854222-9c09-403c-95c5-37763f40aac3-kube-api-access-mrdx2\") pod \"mysqld-exporter-openstack-db-create-g7xtr\" (UID: \"cd854222-9c09-403c-95c5-37763f40aac3\") " pod="openstack/mysqld-exporter-openstack-db-create-g7xtr"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.921464 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d2a6-account-create-update-gnb49"]
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.935181 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08304d40-0d1d-40e2-8188-8e6fd44434c2-operator-scripts\") pod \"mysqld-exporter-fa4f-account-create-update-5mxnk\" (UID: \"08304d40-0d1d-40e2-8188-8e6fd44434c2\") " pod="openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.935404 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mnr2\" (UniqueName: \"kubernetes.io/projected/08304d40-0d1d-40e2-8188-8e6fd44434c2-kube-api-access-7mnr2\") pod \"mysqld-exporter-fa4f-account-create-update-5mxnk\" (UID: \"08304d40-0d1d-40e2-8188-8e6fd44434c2\") " pod="openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"
Mar 09 13:19:44 crc kubenswrapper[4723]: I0309 13:19:44.978171 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-g7xtr"
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.038745 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08304d40-0d1d-40e2-8188-8e6fd44434c2-operator-scripts\") pod \"mysqld-exporter-fa4f-account-create-update-5mxnk\" (UID: \"08304d40-0d1d-40e2-8188-8e6fd44434c2\") " pod="openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.039282 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mnr2\" (UniqueName: \"kubernetes.io/projected/08304d40-0d1d-40e2-8188-8e6fd44434c2-kube-api-access-7mnr2\") pod \"mysqld-exporter-fa4f-account-create-update-5mxnk\" (UID: \"08304d40-0d1d-40e2-8188-8e6fd44434c2\") " pod="openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.039593 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08304d40-0d1d-40e2-8188-8e6fd44434c2-operator-scripts\") pod \"mysqld-exporter-fa4f-account-create-update-5mxnk\" (UID: \"08304d40-0d1d-40e2-8188-8e6fd44434c2\") " pod="openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.057156 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mnr2\" (UniqueName: \"kubernetes.io/projected/08304d40-0d1d-40e2-8188-8e6fd44434c2-kube-api-access-7mnr2\") pod \"mysqld-exporter-fa4f-account-create-update-5mxnk\" (UID: \"08304d40-0d1d-40e2-8188-8e6fd44434c2\") " pod="openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.074729 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.413274 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d2a6-account-create-update-gnb49" event={"ID":"0958c374-1159-4924-bff6-d627956944c9","Type":"ContainerStarted","Data":"954911e9e6881836f5fbfee400272cdadd39d73843c3fd7f68b4e696a43db9a5"}
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.413519 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d2a6-account-create-update-gnb49" event={"ID":"0958c374-1159-4924-bff6-d627956944c9","Type":"ContainerStarted","Data":"40df6b7150773790ec23406278e14663eccaccbd8f84416cd7a103b8a8797cbb"}
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.444466 4723 generic.go:334] "Generic (PLEG): container finished" podID="4a548a9c-33c8-4a35-a559-7290357170c1" containerID="9ec89d749e487ccf881f60ee3e3329f37cb8dee40b34411d0ffb3b10fd496acf" exitCode=0
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.444569 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a548a9c-33c8-4a35-a559-7290357170c1","Type":"ContainerDied","Data":"9ec89d749e487ccf881f60ee3e3329f37cb8dee40b34411d0ffb3b10fd496acf"}
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.471172 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d2a6-account-create-update-gnb49" podStartSLOduration=2.471155452 podStartE2EDuration="2.471155452s" podCreationTimestamp="2026-03-09 13:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:19:45.462307957 +0000 UTC m=+1259.476775497" watchObservedRunningTime="2026-03-09 13:19:45.471155452 +0000 UTC m=+1259.485623012"
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.473371 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-v4zmc" event={"ID":"10b17a94-81eb-4e72-bd49-97f590e26aec","Type":"ContainerStarted","Data":"4dff7cbb784365b6d7a33848c76f38626e7c710e86b81ba9ab3757b487ecd440"}
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.484773 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ntm9d" event={"ID":"3af2747f-855e-4f65-aebb-40cfb961b775","Type":"ContainerDied","Data":"44f61da5d9f2147078a7e7c5d69cbe24e7c5e82e1051c422011e556a3150a206"}
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.484814 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44f61da5d9f2147078a7e7c5d69cbe24e7c5e82e1051c422011e556a3150a206"
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.484907 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ntm9d"
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.583957 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-g7xtr"]
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.596840 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-v4zmc" podStartSLOduration=2.797005106 podStartE2EDuration="6.596816938s" podCreationTimestamp="2026-03-09 13:19:39 +0000 UTC" firstStartedPulling="2026-03-09 13:19:40.687272417 +0000 UTC m=+1254.701739957" lastFinishedPulling="2026-03-09 13:19:44.487084249 +0000 UTC m=+1258.501551789" observedRunningTime="2026-03-09 13:19:45.558160239 +0000 UTC m=+1259.572627779" watchObservedRunningTime="2026-03-09 13:19:45.596816938 +0000 UTC m=+1259.611284488"
Mar 09 13:19:45 crc kubenswrapper[4723]: I0309 13:19:45.694634 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"]
Mar 09 13:19:46 crc kubenswrapper[4723]: I0309 13:19:46.495638 4723 generic.go:334] "Generic (PLEG): container finished" podID="08304d40-0d1d-40e2-8188-8e6fd44434c2" containerID="ba88cce2dc24987461693475db35a4db20d16ace8abea540da2a702f0d29436a" exitCode=0
Mar 09 13:19:46 crc kubenswrapper[4723]: I0309 13:19:46.495907 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-fa4f-account-create-update-5mxnk" event={"ID":"08304d40-0d1d-40e2-8188-8e6fd44434c2","Type":"ContainerDied","Data":"ba88cce2dc24987461693475db35a4db20d16ace8abea540da2a702f0d29436a"}
Mar 09 13:19:46 crc kubenswrapper[4723]: I0309 13:19:46.495962 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-fa4f-account-create-update-5mxnk" event={"ID":"08304d40-0d1d-40e2-8188-8e6fd44434c2","Type":"ContainerStarted","Data":"e99bbc78d60ee135992d0a7bb84a3442f33480ce5d8b7de305717f3bdc7e9081"}
Mar 09 13:19:46 crc kubenswrapper[4723]: I0309 13:19:46.499990 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d2a6-account-create-update-gnb49" event={"ID":"0958c374-1159-4924-bff6-d627956944c9","Type":"ContainerDied","Data":"954911e9e6881836f5fbfee400272cdadd39d73843c3fd7f68b4e696a43db9a5"}
Mar 09 13:19:46 crc kubenswrapper[4723]: I0309 13:19:46.500262 4723 generic.go:334] "Generic (PLEG): container finished" podID="0958c374-1159-4924-bff6-d627956944c9" containerID="954911e9e6881836f5fbfee400272cdadd39d73843c3fd7f68b4e696a43db9a5" exitCode=0
Mar 09 13:19:46 crc kubenswrapper[4723]: I0309 13:19:46.502850 4723 generic.go:334] "Generic (PLEG): container finished" podID="cd854222-9c09-403c-95c5-37763f40aac3" containerID="f7876a0eb57218f6edf385df427799637d84618943937fa84e8d085606c95f9f" exitCode=0
Mar 09 13:19:46 crc kubenswrapper[4723]: I0309 13:19:46.503984 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-g7xtr" event={"ID":"cd854222-9c09-403c-95c5-37763f40aac3","Type":"ContainerDied","Data":"f7876a0eb57218f6edf385df427799637d84618943937fa84e8d085606c95f9f"}
Mar 09 13:19:46 crc kubenswrapper[4723]: I0309 13:19:46.504014 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-g7xtr" event={"ID":"cd854222-9c09-403c-95c5-37763f40aac3","Type":"ContainerStarted","Data":"6d8af820a6d8fa8e443ff6fb9410a8bd51b93138f014d1eed05b3466168db1ae"}
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.033308 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d2a6-account-create-update-gnb49"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.111226 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0958c374-1159-4924-bff6-d627956944c9-operator-scripts\") pod \"0958c374-1159-4924-bff6-d627956944c9\" (UID: \"0958c374-1159-4924-bff6-d627956944c9\") "
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.111482 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plvv9\" (UniqueName: \"kubernetes.io/projected/0958c374-1159-4924-bff6-d627956944c9-kube-api-access-plvv9\") pod \"0958c374-1159-4924-bff6-d627956944c9\" (UID: \"0958c374-1159-4924-bff6-d627956944c9\") "
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.114410 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0958c374-1159-4924-bff6-d627956944c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0958c374-1159-4924-bff6-d627956944c9" (UID: "0958c374-1159-4924-bff6-d627956944c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.124258 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0958c374-1159-4924-bff6-d627956944c9-kube-api-access-plvv9" (OuterVolumeSpecName: "kube-api-access-plvv9") pod "0958c374-1159-4924-bff6-d627956944c9" (UID: "0958c374-1159-4924-bff6-d627956944c9"). InnerVolumeSpecName "kube-api-access-plvv9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.199532 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.208753 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-g7xtr"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.225739 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plvv9\" (UniqueName: \"kubernetes.io/projected/0958c374-1159-4924-bff6-d627956944c9-kube-api-access-plvv9\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.225771 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0958c374-1159-4924-bff6-d627956944c9-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.326803 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd854222-9c09-403c-95c5-37763f40aac3-operator-scripts\") pod \"cd854222-9c09-403c-95c5-37763f40aac3\" (UID: \"cd854222-9c09-403c-95c5-37763f40aac3\") "
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.327034 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrdx2\" (UniqueName: \"kubernetes.io/projected/cd854222-9c09-403c-95c5-37763f40aac3-kube-api-access-mrdx2\") pod \"cd854222-9c09-403c-95c5-37763f40aac3\" (UID: \"cd854222-9c09-403c-95c5-37763f40aac3\") "
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.327378 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd854222-9c09-403c-95c5-37763f40aac3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd854222-9c09-403c-95c5-37763f40aac3" (UID: "cd854222-9c09-403c-95c5-37763f40aac3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.327427 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mnr2\" (UniqueName: \"kubernetes.io/projected/08304d40-0d1d-40e2-8188-8e6fd44434c2-kube-api-access-7mnr2\") pod \"08304d40-0d1d-40e2-8188-8e6fd44434c2\" (UID: \"08304d40-0d1d-40e2-8188-8e6fd44434c2\") "
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.327470 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08304d40-0d1d-40e2-8188-8e6fd44434c2-operator-scripts\") pod \"08304d40-0d1d-40e2-8188-8e6fd44434c2\" (UID: \"08304d40-0d1d-40e2-8188-8e6fd44434c2\") "
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.327987 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd854222-9c09-403c-95c5-37763f40aac3-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.328003 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08304d40-0d1d-40e2-8188-8e6fd44434c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "08304d40-0d1d-40e2-8188-8e6fd44434c2" (UID: "08304d40-0d1d-40e2-8188-8e6fd44434c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.330725 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08304d40-0d1d-40e2-8188-8e6fd44434c2-kube-api-access-7mnr2" (OuterVolumeSpecName: "kube-api-access-7mnr2") pod "08304d40-0d1d-40e2-8188-8e6fd44434c2" (UID: "08304d40-0d1d-40e2-8188-8e6fd44434c2"). InnerVolumeSpecName "kube-api-access-7mnr2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.331612 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd854222-9c09-403c-95c5-37763f40aac3-kube-api-access-mrdx2" (OuterVolumeSpecName: "kube-api-access-mrdx2") pod "cd854222-9c09-403c-95c5-37763f40aac3" (UID: "cd854222-9c09-403c-95c5-37763f40aac3"). InnerVolumeSpecName "kube-api-access-mrdx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.430151 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrdx2\" (UniqueName: \"kubernetes.io/projected/cd854222-9c09-403c-95c5-37763f40aac3-kube-api-access-mrdx2\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.430184 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mnr2\" (UniqueName: \"kubernetes.io/projected/08304d40-0d1d-40e2-8188-8e6fd44434c2-kube-api-access-7mnr2\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.430194 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08304d40-0d1d-40e2-8188-8e6fd44434c2-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.526842 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-fa4f-account-create-update-5mxnk" event={"ID":"08304d40-0d1d-40e2-8188-8e6fd44434c2","Type":"ContainerDied","Data":"e99bbc78d60ee135992d0a7bb84a3442f33480ce5d8b7de305717f3bdc7e9081"}
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.526927 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e99bbc78d60ee135992d0a7bb84a3442f33480ce5d8b7de305717f3bdc7e9081"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.526881 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.529660 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d2a6-account-create-update-gnb49" event={"ID":"0958c374-1159-4924-bff6-d627956944c9","Type":"ContainerDied","Data":"40df6b7150773790ec23406278e14663eccaccbd8f84416cd7a103b8a8797cbb"}
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.529706 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40df6b7150773790ec23406278e14663eccaccbd8f84416cd7a103b8a8797cbb"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.529733 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d2a6-account-create-update-gnb49"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.532142 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-g7xtr"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.532102 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-g7xtr" event={"ID":"cd854222-9c09-403c-95c5-37763f40aac3","Type":"ContainerDied","Data":"6d8af820a6d8fa8e443ff6fb9410a8bd51b93138f014d1eed05b3466168db1ae"}
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.532277 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d8af820a6d8fa8e443ff6fb9410a8bd51b93138f014d1eed05b3466168db1ae"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.727006 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ntm9d"]
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.738135 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ntm9d"]
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.866815 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-lmg7p"]
Mar 09 13:19:48 crc kubenswrapper[4723]: E0309 13:19:48.867215 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0958c374-1159-4924-bff6-d627956944c9" containerName="mariadb-account-create-update"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.867232 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="0958c374-1159-4924-bff6-d627956944c9" containerName="mariadb-account-create-update"
Mar 09 13:19:48 crc kubenswrapper[4723]: E0309 13:19:48.867245 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08304d40-0d1d-40e2-8188-8e6fd44434c2" containerName="mariadb-account-create-update"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.867253 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="08304d40-0d1d-40e2-8188-8e6fd44434c2" containerName="mariadb-account-create-update"
Mar 09 13:19:48 crc kubenswrapper[4723]: E0309 13:19:48.867288 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd854222-9c09-403c-95c5-37763f40aac3" containerName="mariadb-database-create"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.867294 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd854222-9c09-403c-95c5-37763f40aac3" containerName="mariadb-database-create"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.867462 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd854222-9c09-403c-95c5-37763f40aac3" containerName="mariadb-database-create"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.867495 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="08304d40-0d1d-40e2-8188-8e6fd44434c2" containerName="mariadb-account-create-update"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.867502 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="0958c374-1159-4924-bff6-d627956944c9" containerName="mariadb-account-create-update"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.884284 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lmg7p"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.887056 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.907525 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3af2747f-855e-4f65-aebb-40cfb961b775" path="/var/lib/kubelet/pods/3af2747f-855e-4f65-aebb-40cfb961b775/volumes"
Mar 09 13:19:48 crc kubenswrapper[4723]: I0309 13:19:48.908128 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lmg7p"]
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.043012 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7954e0c8-9292-438a-b73f-c91df5746c02-operator-scripts\") pod \"root-account-create-update-lmg7p\" (UID: \"7954e0c8-9292-438a-b73f-c91df5746c02\") " pod="openstack/root-account-create-update-lmg7p"
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.043092 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4bwc\" (UniqueName: \"kubernetes.io/projected/7954e0c8-9292-438a-b73f-c91df5746c02-kube-api-access-g4bwc\") pod \"root-account-create-update-lmg7p\" (UID: \"7954e0c8-9292-438a-b73f-c91df5746c02\") " pod="openstack/root-account-create-update-lmg7p"
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.145503 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7954e0c8-9292-438a-b73f-c91df5746c02-operator-scripts\") pod \"root-account-create-update-lmg7p\" (UID: \"7954e0c8-9292-438a-b73f-c91df5746c02\") " pod="openstack/root-account-create-update-lmg7p"
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.145560 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4bwc\" (UniqueName: \"kubernetes.io/projected/7954e0c8-9292-438a-b73f-c91df5746c02-kube-api-access-g4bwc\") pod \"root-account-create-update-lmg7p\" (UID: \"7954e0c8-9292-438a-b73f-c91df5746c02\") " pod="openstack/root-account-create-update-lmg7p"
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.146416 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7954e0c8-9292-438a-b73f-c91df5746c02-operator-scripts\") pod \"root-account-create-update-lmg7p\" (UID: \"7954e0c8-9292-438a-b73f-c91df5746c02\") " pod="openstack/root-account-create-update-lmg7p"
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.165817 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4bwc\" (UniqueName: \"kubernetes.io/projected/7954e0c8-9292-438a-b73f-c91df5746c02-kube-api-access-g4bwc\") pod \"root-account-create-update-lmg7p\" (UID: \"7954e0c8-9292-438a-b73f-c91df5746c02\") " pod="openstack/root-account-create-update-lmg7p"
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.420806 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lmg7p"
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.545224 4723 generic.go:334] "Generic (PLEG): container finished" podID="54210e7b-b34d-411d-93e1-e8cc3448c4b0" containerID="42687d97a92f3eec8eb044239bd85c0cb19dc311299573a1084789dac3e84d1d" exitCode=0
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.545467 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"54210e7b-b34d-411d-93e1-e8cc3448c4b0","Type":"ContainerDied","Data":"42687d97a92f3eec8eb044239bd85c0cb19dc311299573a1084789dac3e84d1d"}
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.550433 4723 generic.go:334] "Generic (PLEG): container finished" podID="4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" containerID="f600bd95ed92947bbe218dbf141750e32f7e39e37c298df8afa64d12dc276f50" exitCode=0
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.550509 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb","Type":"ContainerDied","Data":"f600bd95ed92947bbe218dbf141750e32f7e39e37c298df8afa64d12dc276f50"}
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.571459 4723 generic.go:334] "Generic (PLEG): container finished" podID="daa528e2-bcd7-43a8-bfea-a0911b3020c5" containerID="8ef261746c675013eb62f34eb677fb0207d09a06464e1471d9a77690a6583b55" exitCode=0
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.571504 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"daa528e2-bcd7-43a8-bfea-a0911b3020c5","Type":"ContainerDied","Data":"8ef261746c675013eb62f34eb677fb0207d09a06464e1471d9a77690a6583b55"}
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.901661 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-2kn5v"
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.990652 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-vzp24"]
Mar 09 13:19:49 crc kubenswrapper[4723]: I0309 13:19:49.994700 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-vzp24"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.014919 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-vzp24"]
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.046008 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-vmb2f"]
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.046238 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f" podUID="0d07d123-4437-46d2-b1f8-d4e0b495e0fa" containerName="dnsmasq-dns" containerID="cri-o://5f6698a68a7be5d219a30d632317697d1134c01481f7f107b78b760c1757948f" gracePeriod=10
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.071098 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dcf2e69-d684-4721-a6ad-bedf186a51a4-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-vzp24\" (UID: \"0dcf2e69-d684-4721-a6ad-bedf186a51a4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-vzp24"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.071264 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvr2t\" (UniqueName: \"kubernetes.io/projected/0dcf2e69-d684-4721-a6ad-bedf186a51a4-kube-api-access-zvr2t\") pod \"mysqld-exporter-openstack-cell1-db-create-vzp24\" (UID: \"0dcf2e69-d684-4721-a6ad-bedf186a51a4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-vzp24"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.088741 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lmg7p"]
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.173628 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dcf2e69-d684-4721-a6ad-bedf186a51a4-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-vzp24\" (UID: \"0dcf2e69-d684-4721-a6ad-bedf186a51a4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-vzp24"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.174010 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvr2t\" (UniqueName: \"kubernetes.io/projected/0dcf2e69-d684-4721-a6ad-bedf186a51a4-kube-api-access-zvr2t\") pod \"mysqld-exporter-openstack-cell1-db-create-vzp24\" (UID: \"0dcf2e69-d684-4721-a6ad-bedf186a51a4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-vzp24"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.174672 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-e639-account-create-update-n5ktm"]
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.174799 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dcf2e69-d684-4721-a6ad-bedf186a51a4-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-vzp24\" (UID: \"0dcf2e69-d684-4721-a6ad-bedf186a51a4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-vzp24"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.176680 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e639-account-create-update-n5ktm"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.179169 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.217415 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-e639-account-create-update-n5ktm"]
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.224696 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvr2t\" (UniqueName: \"kubernetes.io/projected/0dcf2e69-d684-4721-a6ad-bedf186a51a4-kube-api-access-zvr2t\") pod \"mysqld-exporter-openstack-cell1-db-create-vzp24\" (UID: \"0dcf2e69-d684-4721-a6ad-bedf186a51a4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-vzp24"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.278268 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvzvg\" (UniqueName: \"kubernetes.io/projected/842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92-kube-api-access-zvzvg\") pod \"mysqld-exporter-e639-account-create-update-n5ktm\" (UID: \"842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92\") " pod="openstack/mysqld-exporter-e639-account-create-update-n5ktm"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.278449 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92-operator-scripts\") pod \"mysqld-exporter-e639-account-create-update-n5ktm\" (UID: \"842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92\") " pod="openstack/mysqld-exporter-e639-account-create-update-n5ktm"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.335478 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f" podUID="0d07d123-4437-46d2-b1f8-d4e0b495e0fa" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: connect: connection refused"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.366016 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-vzp24"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.381342 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92-operator-scripts\") pod \"mysqld-exporter-e639-account-create-update-n5ktm\" (UID: \"842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92\") " pod="openstack/mysqld-exporter-e639-account-create-update-n5ktm"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.381519 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvzvg\" (UniqueName: \"kubernetes.io/projected/842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92-kube-api-access-zvzvg\") pod \"mysqld-exporter-e639-account-create-update-n5ktm\" (UID: \"842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92\") " pod="openstack/mysqld-exporter-e639-account-create-update-n5ktm"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.382330 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92-operator-scripts\") pod \"mysqld-exporter-e639-account-create-update-n5ktm\" (UID: \"842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92\") " pod="openstack/mysqld-exporter-e639-account-create-update-n5ktm"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.400115 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvzvg\" (UniqueName: \"kubernetes.io/projected/842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92-kube-api-access-zvzvg\") pod \"mysqld-exporter-e639-account-create-update-n5ktm\" (UID: \"842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92\") " pod="openstack/mysqld-exporter-e639-account-create-update-n5ktm"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.624750 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lmg7p" event={"ID":"7954e0c8-9292-438a-b73f-c91df5746c02","Type":"ContainerStarted","Data":"dc6c219316c1459058b740dc5cbe2a6864cc5d92da8d6dad97c573b1f932284f"}
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.628442 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb","Type":"ContainerStarted","Data":"949cfd774a5ac85ffdc5516fd1299f2c3fc1e7abdb1a2335f187c11475bef008"}
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.630104 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.640960 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"daa528e2-bcd7-43a8-bfea-a0911b3020c5","Type":"ContainerStarted","Data":"f12e964565b682de4c859455f62e9203db365e9420b552bf6ee54ba492d5bdee"}
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.642009 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.646157 4723 generic.go:334] "Generic (PLEG): container finished" podID="0d07d123-4437-46d2-b1f8-d4e0b495e0fa" containerID="5f6698a68a7be5d219a30d632317697d1134c01481f7f107b78b760c1757948f" exitCode=0
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.646232 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f" event={"ID":"0d07d123-4437-46d2-b1f8-d4e0b495e0fa","Type":"ContainerDied","Data":"5f6698a68a7be5d219a30d632317697d1134c01481f7f107b78b760c1757948f"}
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.651620 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"54210e7b-b34d-411d-93e1-e8cc3448c4b0","Type":"ContainerStarted","Data":"025b50af0a9f9816db745dfddd2c4f6971a2e8d89c088c14482796d31046a5b3"}
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.656203 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.660082 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e639-account-create-update-n5ktm"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.667441 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.68039746 podStartE2EDuration="1m3.66741538s" podCreationTimestamp="2026-03-09 13:18:47 +0000 UTC" firstStartedPulling="2026-03-09 13:18:49.38935135 +0000 UTC m=+1203.403818890" lastFinishedPulling="2026-03-09 13:19:15.37636927 +0000 UTC m=+1229.390836810" observedRunningTime="2026-03-09 13:19:50.660041093 +0000 UTC m=+1264.674508663" watchObservedRunningTime="2026-03-09 13:19:50.66741538 +0000 UTC m=+1264.681882920"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.711220 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=38.527601361 podStartE2EDuration="1m3.711197426s" podCreationTimestamp="2026-03-09 13:18:47 +0000 UTC" firstStartedPulling="2026-03-09 13:18:50.096811539 +0000 UTC m=+1204.111279079" lastFinishedPulling="2026-03-09 13:19:15.280407604 +0000 UTC m=+1229.294875144" observedRunningTime="2026-03-09 13:19:50.701647361 +0000 UTC m=+1264.716114901" watchObservedRunningTime="2026-03-09 13:19:50.711197426 +0000 UTC m=+1264.725664966"
Mar 09 13:19:50 crc kubenswrapper[4723]: I0309 13:19:50.747228 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=39.387767305 podStartE2EDuration="1m3.747203544s" podCreationTimestamp="2026-03-09 13:18:47 +0000 UTC" firstStartedPulling="2026-03-09 13:18:51.075411248 +0000 UTC m=+1205.089878788" lastFinishedPulling="2026-03-09 13:19:15.434847487 +0000 UTC m=+1229.449315027" observedRunningTime="2026-03-09 13:19:50.742442118 +0000 UTC m=+1264.756909678" watchObservedRunningTime="2026-03-09 13:19:50.747203544 +0000 UTC m=+1264.761671084"
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.055220 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-vzp24"]
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.455833 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-e639-account-create-update-n5ktm"]
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.467361 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:51 crc kubenswrapper[4723]: W0309 13:19:51.472993 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod842ad0c6_1dfa_4ea1_9d26_84c84b8d3f92.slice/crio-82dc65b151d56cdc4a26dae1de185a047afa5832a3f9abd70e1d7b12c899a55c WatchSource:0}: Error finding container 82dc65b151d56cdc4a26dae1de185a047afa5832a3f9abd70e1d7b12c899a55c: Status 404 returned error can't find the container with id 82dc65b151d56cdc4a26dae1de185a047afa5832a3f9abd70e1d7b12c899a55c
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.525842 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-dns-svc\") pod \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") "
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.526056 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-ovsdbserver-nb\") pod \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") "
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.526147 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-config\") pod \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") "
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.526182 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-ovsdbserver-sb\") pod \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") "
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.526229 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7zpg\" (UniqueName: \"kubernetes.io/projected/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-kube-api-access-r7zpg\") pod \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") "
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.539467 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-kube-api-access-r7zpg" (OuterVolumeSpecName: "kube-api-access-r7zpg") pod "0d07d123-4437-46d2-b1f8-d4e0b495e0fa" (UID: "0d07d123-4437-46d2-b1f8-d4e0b495e0fa"). InnerVolumeSpecName "kube-api-access-r7zpg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.630364 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0"
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.630775 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7zpg\" (UniqueName: \"kubernetes.io/projected/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-kube-api-access-r7zpg\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:51 crc kubenswrapper[4723]: E0309 13:19:51.630959 4723 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 13:19:51 crc kubenswrapper[4723]: E0309 13:19:51.631045 4723 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 09 13:19:51 crc kubenswrapper[4723]: E0309 13:19:51.631147 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift podName:d924133b-d3c9-4b71-bbf4-a894a618e6c4 nodeName:}" failed. No retries permitted until 2026-03-09 13:20:07.631132132 +0000 UTC m=+1281.645599672 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift") pod "swift-storage-0" (UID: "d924133b-d3c9-4b71-bbf4-a894a618e6c4") : configmap "swift-ring-files" not found
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.689215 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f" event={"ID":"0d07d123-4437-46d2-b1f8-d4e0b495e0fa","Type":"ContainerDied","Data":"821423caf45cfb68ea6aa64738ff5f33f209cac63e167418f4f29100207dd1e7"}
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.689263 4723 scope.go:117] "RemoveContainer" containerID="5f6698a68a7be5d219a30d632317697d1134c01481f7f107b78b760c1757948f"
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.689375 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-vmb2f"
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.702434 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-vzp24" event={"ID":"0dcf2e69-d684-4721-a6ad-bedf186a51a4","Type":"ContainerStarted","Data":"4bb71d5e997748e86927ad52be58593d0c3870c517c8c6589a44e40bfc4f6359"}
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.715143 4723 generic.go:334] "Generic (PLEG): container finished" podID="7954e0c8-9292-438a-b73f-c91df5746c02" containerID="8e11255da285ce3f57b0b4ccf342daccc1e3fb68582f73279f9605a2391b9e1d" exitCode=0
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.715307 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lmg7p" event={"ID":"7954e0c8-9292-438a-b73f-c91df5746c02","Type":"ContainerDied","Data":"8e11255da285ce3f57b0b4ccf342daccc1e3fb68582f73279f9605a2391b9e1d"}
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.745898 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d07d123-4437-46d2-b1f8-d4e0b495e0fa" (UID: "0d07d123-4437-46d2-b1f8-d4e0b495e0fa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.746961 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e639-account-create-update-n5ktm" event={"ID":"842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92","Type":"ContainerStarted","Data":"82dc65b151d56cdc4a26dae1de185a047afa5832a3f9abd70e1d7b12c899a55c"}
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.758333 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0d07d123-4437-46d2-b1f8-d4e0b495e0fa" (UID: "0d07d123-4437-46d2-b1f8-d4e0b495e0fa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.779717 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-ovsdbserver-nb\") pod \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\" (UID: \"0d07d123-4437-46d2-b1f8-d4e0b495e0fa\") "
Mar 09 13:19:51 crc kubenswrapper[4723]: W0309 13:19:51.780272 4723 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/0d07d123-4437-46d2-b1f8-d4e0b495e0fa/volumes/kubernetes.io~configmap/ovsdbserver-nb
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.780304 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0d07d123-4437-46d2-b1f8-d4e0b495e0fa" (UID: "0d07d123-4437-46d2-b1f8-d4e0b495e0fa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.788577 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.788651 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.804305 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-config" (OuterVolumeSpecName: "config") pod "0d07d123-4437-46d2-b1f8-d4e0b495e0fa" (UID: "0d07d123-4437-46d2-b1f8-d4e0b495e0fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.809266 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0d07d123-4437-46d2-b1f8-d4e0b495e0fa" (UID: "0d07d123-4437-46d2-b1f8-d4e0b495e0fa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.891607 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.891639 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d07d123-4437-46d2-b1f8-d4e0b495e0fa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 09 13:19:51 crc kubenswrapper[4723]: I0309 13:19:51.908994 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.059955 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-vmb2f"]
Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.067631 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-vmb2f"]
Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.118306 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-gnr4x"]
Mar 09 13:19:52 crc kubenswrapper[4723]: E0309 13:19:52.118791 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d07d123-4437-46d2-b1f8-d4e0b495e0fa" containerName="dnsmasq-dns"
Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.118816 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d07d123-4437-46d2-b1f8-d4e0b495e0fa" containerName="dnsmasq-dns"
Mar 09 13:19:52 crc kubenswrapper[4723]: E0309 13:19:52.118844 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d07d123-4437-46d2-b1f8-d4e0b495e0fa" containerName="init"
Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.118851 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d07d123-4437-46d2-b1f8-d4e0b495e0fa" containerName="init"
Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.119064 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d07d123-4437-46d2-b1f8-d4e0b495e0fa"
containerName="dnsmasq-dns" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.119980 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gnr4x" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.148986 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gnr4x"] Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.151682 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6d748498-cmr7m" podUID="2994750c-7f55-4708-856b-c9547e3f054c" containerName="console" containerID="cri-o://f56b0d79b1e915f7a130679fa2da30ddd4a330859fc19d4a51c5569a1d76ecec" gracePeriod=15 Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.198175 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5n6w\" (UniqueName: \"kubernetes.io/projected/556c51ac-2052-4f8c-9c5b-830aacc68de0-kube-api-access-s5n6w\") pod \"glance-db-create-gnr4x\" (UID: \"556c51ac-2052-4f8c-9c5b-830aacc68de0\") " pod="openstack/glance-db-create-gnr4x" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.198509 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/556c51ac-2052-4f8c-9c5b-830aacc68de0-operator-scripts\") pod \"glance-db-create-gnr4x\" (UID: \"556c51ac-2052-4f8c-9c5b-830aacc68de0\") " pod="openstack/glance-db-create-gnr4x" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.269996 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-0679-account-create-update-xwhf9"] Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.272014 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0679-account-create-update-xwhf9" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.282618 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.291501 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0679-account-create-update-xwhf9"] Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.305031 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/556c51ac-2052-4f8c-9c5b-830aacc68de0-operator-scripts\") pod \"glance-db-create-gnr4x\" (UID: \"556c51ac-2052-4f8c-9c5b-830aacc68de0\") " pod="openstack/glance-db-create-gnr4x" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.305199 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5n6w\" (UniqueName: \"kubernetes.io/projected/556c51ac-2052-4f8c-9c5b-830aacc68de0-kube-api-access-s5n6w\") pod \"glance-db-create-gnr4x\" (UID: \"556c51ac-2052-4f8c-9c5b-830aacc68de0\") " pod="openstack/glance-db-create-gnr4x" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.306248 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/556c51ac-2052-4f8c-9c5b-830aacc68de0-operator-scripts\") pod \"glance-db-create-gnr4x\" (UID: \"556c51ac-2052-4f8c-9c5b-830aacc68de0\") " pod="openstack/glance-db-create-gnr4x" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.336145 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5n6w\" (UniqueName: \"kubernetes.io/projected/556c51ac-2052-4f8c-9c5b-830aacc68de0-kube-api-access-s5n6w\") pod \"glance-db-create-gnr4x\" (UID: \"556c51ac-2052-4f8c-9c5b-830aacc68de0\") " pod="openstack/glance-db-create-gnr4x" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.407134 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nbql\" (UniqueName: \"kubernetes.io/projected/12ca1cc4-a262-42c5-ac51-eee86f7c9793-kube-api-access-8nbql\") pod \"glance-0679-account-create-update-xwhf9\" (UID: \"12ca1cc4-a262-42c5-ac51-eee86f7c9793\") " pod="openstack/glance-0679-account-create-update-xwhf9" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.407183 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12ca1cc4-a262-42c5-ac51-eee86f7c9793-operator-scripts\") pod \"glance-0679-account-create-update-xwhf9\" (UID: \"12ca1cc4-a262-42c5-ac51-eee86f7c9793\") " pod="openstack/glance-0679-account-create-update-xwhf9" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.435773 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-gnr4x" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.509118 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nbql\" (UniqueName: \"kubernetes.io/projected/12ca1cc4-a262-42c5-ac51-eee86f7c9793-kube-api-access-8nbql\") pod \"glance-0679-account-create-update-xwhf9\" (UID: \"12ca1cc4-a262-42c5-ac51-eee86f7c9793\") " pod="openstack/glance-0679-account-create-update-xwhf9" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.509182 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12ca1cc4-a262-42c5-ac51-eee86f7c9793-operator-scripts\") pod \"glance-0679-account-create-update-xwhf9\" (UID: \"12ca1cc4-a262-42c5-ac51-eee86f7c9793\") " pod="openstack/glance-0679-account-create-update-xwhf9" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.510200 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12ca1cc4-a262-42c5-ac51-eee86f7c9793-operator-scripts\") pod \"glance-0679-account-create-update-xwhf9\" (UID: \"12ca1cc4-a262-42c5-ac51-eee86f7c9793\") " pod="openstack/glance-0679-account-create-update-xwhf9" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.529452 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nbql\" (UniqueName: \"kubernetes.io/projected/12ca1cc4-a262-42c5-ac51-eee86f7c9793-kube-api-access-8nbql\") pod \"glance-0679-account-create-update-xwhf9\" (UID: \"12ca1cc4-a262-42c5-ac51-eee86f7c9793\") " pod="openstack/glance-0679-account-create-update-xwhf9" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.681735 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0679-account-create-update-xwhf9" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.731699 4723 generic.go:334] "Generic (PLEG): container finished" podID="842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92" containerID="9eee0a564c22de11770ba58a1d205cc2a1923922a221b3600eb9363e0c573ab4" exitCode=0 Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.731806 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e639-account-create-update-n5ktm" event={"ID":"842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92","Type":"ContainerDied","Data":"9eee0a564c22de11770ba58a1d205cc2a1923922a221b3600eb9363e0c573ab4"} Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.735312 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d748498-cmr7m_2994750c-7f55-4708-856b-c9547e3f054c/console/0.log" Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.735383 4723 generic.go:334] "Generic (PLEG): container finished" podID="2994750c-7f55-4708-856b-c9547e3f054c" containerID="f56b0d79b1e915f7a130679fa2da30ddd4a330859fc19d4a51c5569a1d76ecec" exitCode=2 Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.735483 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d748498-cmr7m" event={"ID":"2994750c-7f55-4708-856b-c9547e3f054c","Type":"ContainerDied","Data":"f56b0d79b1e915f7a130679fa2da30ddd4a330859fc19d4a51c5569a1d76ecec"} Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.737045 4723 generic.go:334] "Generic (PLEG): container finished" podID="0dcf2e69-d684-4721-a6ad-bedf186a51a4" containerID="a1d7da4a08cf9b2a26a73630ce1508694ccbe3a4f003f21658b56c18137ab4ef" exitCode=0 Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.737185 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-vzp24" event={"ID":"0dcf2e69-d684-4721-a6ad-bedf186a51a4","Type":"ContainerDied","Data":"a1d7da4a08cf9b2a26a73630ce1508694ccbe3a4f003f21658b56c18137ab4ef"} Mar 09 13:19:52 crc kubenswrapper[4723]: I0309 13:19:52.893617 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d07d123-4437-46d2-b1f8-d4e0b495e0fa" path="/var/lib/kubelet/pods/0d07d123-4437-46d2-b1f8-d4e0b495e0fa/volumes" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.100771 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-s8dhp"] Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.102396 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s8dhp" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.120709 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s8dhp"] Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.210391 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1379-account-create-update-tw2zb"] Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.211631 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1379-account-create-update-tw2zb" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.217207 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.224411 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cxtg\" (UniqueName: \"kubernetes.io/projected/ed454b22-e190-4bf0-8581-e71f2ce51324-kube-api-access-7cxtg\") pod \"keystone-db-create-s8dhp\" (UID: \"ed454b22-e190-4bf0-8581-e71f2ce51324\") " pod="openstack/keystone-db-create-s8dhp" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.224456 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed454b22-e190-4bf0-8581-e71f2ce51324-operator-scripts\") pod \"keystone-db-create-s8dhp\" (UID: \"ed454b22-e190-4bf0-8581-e71f2ce51324\") " pod="openstack/keystone-db-create-s8dhp" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.242282 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1379-account-create-update-tw2zb"] Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.321969 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-ghfww"] Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.323539 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ghfww" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.327283 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szpm7\" (UniqueName: \"kubernetes.io/projected/36dc8fce-575c-4fe1-b4df-2ae47014bce7-kube-api-access-szpm7\") pod \"keystone-1379-account-create-update-tw2zb\" (UID: \"36dc8fce-575c-4fe1-b4df-2ae47014bce7\") " pod="openstack/keystone-1379-account-create-update-tw2zb" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.327338 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cxtg\" (UniqueName: \"kubernetes.io/projected/ed454b22-e190-4bf0-8581-e71f2ce51324-kube-api-access-7cxtg\") pod \"keystone-db-create-s8dhp\" (UID: \"ed454b22-e190-4bf0-8581-e71f2ce51324\") " pod="openstack/keystone-db-create-s8dhp" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.327374 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed454b22-e190-4bf0-8581-e71f2ce51324-operator-scripts\") pod \"keystone-db-create-s8dhp\" (UID: \"ed454b22-e190-4bf0-8581-e71f2ce51324\") " pod="openstack/keystone-db-create-s8dhp" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.327424 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36dc8fce-575c-4fe1-b4df-2ae47014bce7-operator-scripts\") pod \"keystone-1379-account-create-update-tw2zb\" (UID: \"36dc8fce-575c-4fe1-b4df-2ae47014bce7\") " pod="openstack/keystone-1379-account-create-update-tw2zb" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.328420 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed454b22-e190-4bf0-8581-e71f2ce51324-operator-scripts\") pod \"keystone-db-create-s8dhp\" (UID: 
\"ed454b22-e190-4bf0-8581-e71f2ce51324\") " pod="openstack/keystone-db-create-s8dhp" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.332138 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ghfww"] Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.374625 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cxtg\" (UniqueName: \"kubernetes.io/projected/ed454b22-e190-4bf0-8581-e71f2ce51324-kube-api-access-7cxtg\") pod \"keystone-db-create-s8dhp\" (UID: \"ed454b22-e190-4bf0-8581-e71f2ce51324\") " pod="openstack/keystone-db-create-s8dhp" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.429391 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b743aece-4b06-44e2-9afa-fd075c0730d3-operator-scripts\") pod \"placement-db-create-ghfww\" (UID: \"b743aece-4b06-44e2-9afa-fd075c0730d3\") " pod="openstack/placement-db-create-ghfww" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.429648 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szpm7\" (UniqueName: \"kubernetes.io/projected/36dc8fce-575c-4fe1-b4df-2ae47014bce7-kube-api-access-szpm7\") pod \"keystone-1379-account-create-update-tw2zb\" (UID: \"36dc8fce-575c-4fe1-b4df-2ae47014bce7\") " pod="openstack/keystone-1379-account-create-update-tw2zb" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.429725 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bftr\" (UniqueName: \"kubernetes.io/projected/b743aece-4b06-44e2-9afa-fd075c0730d3-kube-api-access-5bftr\") pod \"placement-db-create-ghfww\" (UID: \"b743aece-4b06-44e2-9afa-fd075c0730d3\") " pod="openstack/placement-db-create-ghfww" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.429749 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36dc8fce-575c-4fe1-b4df-2ae47014bce7-operator-scripts\") pod \"keystone-1379-account-create-update-tw2zb\" (UID: \"36dc8fce-575c-4fe1-b4df-2ae47014bce7\") " pod="openstack/keystone-1379-account-create-update-tw2zb" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.430765 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36dc8fce-575c-4fe1-b4df-2ae47014bce7-operator-scripts\") pod \"keystone-1379-account-create-update-tw2zb\" (UID: \"36dc8fce-575c-4fe1-b4df-2ae47014bce7\") " pod="openstack/keystone-1379-account-create-update-tw2zb" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.448702 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szpm7\" (UniqueName: \"kubernetes.io/projected/36dc8fce-575c-4fe1-b4df-2ae47014bce7-kube-api-access-szpm7\") pod \"keystone-1379-account-create-update-tw2zb\" (UID: \"36dc8fce-575c-4fe1-b4df-2ae47014bce7\") " pod="openstack/keystone-1379-account-create-update-tw2zb" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.463981 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-s8dhp" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.531252 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b743aece-4b06-44e2-9afa-fd075c0730d3-operator-scripts\") pod \"placement-db-create-ghfww\" (UID: \"b743aece-4b06-44e2-9afa-fd075c0730d3\") " pod="openstack/placement-db-create-ghfww" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.531350 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bftr\" (UniqueName: \"kubernetes.io/projected/b743aece-4b06-44e2-9afa-fd075c0730d3-kube-api-access-5bftr\") pod \"placement-db-create-ghfww\" (UID: \"b743aece-4b06-44e2-9afa-fd075c0730d3\") " pod="openstack/placement-db-create-ghfww" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.531958 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b743aece-4b06-44e2-9afa-fd075c0730d3-operator-scripts\") pod \"placement-db-create-ghfww\" (UID: \"b743aece-4b06-44e2-9afa-fd075c0730d3\") " pod="openstack/placement-db-create-ghfww" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.534704 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1379-account-create-update-tw2zb" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.548816 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bftr\" (UniqueName: \"kubernetes.io/projected/b743aece-4b06-44e2-9afa-fd075c0730d3-kube-api-access-5bftr\") pod \"placement-db-create-ghfww\" (UID: \"b743aece-4b06-44e2-9afa-fd075c0730d3\") " pod="openstack/placement-db-create-ghfww" Mar 09 13:19:53 crc kubenswrapper[4723]: I0309 13:19:53.640800 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ghfww" Mar 09 13:19:54 crc kubenswrapper[4723]: I0309 13:19:54.755173 4723 generic.go:334] "Generic (PLEG): container finished" podID="10b17a94-81eb-4e72-bd49-97f590e26aec" containerID="4dff7cbb784365b6d7a33848c76f38626e7c710e86b81ba9ab3757b487ecd440" exitCode=0 Mar 09 13:19:54 crc kubenswrapper[4723]: I0309 13:19:54.755271 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-v4zmc" event={"ID":"10b17a94-81eb-4e72-bd49-97f590e26aec","Type":"ContainerDied","Data":"4dff7cbb784365b6d7a33848c76f38626e7c710e86b81ba9ab3757b487ecd440"} Mar 09 13:19:55 crc kubenswrapper[4723]: I0309 13:19:55.169648 4723 patch_prober.go:28] interesting pod/console-6d748498-cmr7m container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.71:8443/health\": dial tcp 10.217.0.71:8443: connect: connection refused" start-of-body= Mar 09 13:19:55 crc kubenswrapper[4723]: I0309 13:19:55.169706 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-6d748498-cmr7m" podUID="2994750c-7f55-4708-856b-c9547e3f054c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.71:8443/health\": dial tcp 10.217.0.71:8443: connect: connection refused" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.511074 4723 scope.go:117] "RemoveContainer" containerID="55a1bea2226982c9fc007254ecd0bafcac3b14d4b114155b0ecf7dc7062af31b" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.708520 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lmg7p" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.729986 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-vzp24" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.730335 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e639-account-create-update-n5ktm" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.796730 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lmg7p" event={"ID":"7954e0c8-9292-438a-b73f-c91df5746c02","Type":"ContainerDied","Data":"dc6c219316c1459058b740dc5cbe2a6864cc5d92da8d6dad97c573b1f932284f"} Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.796765 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc6c219316c1459058b740dc5cbe2a6864cc5d92da8d6dad97c573b1f932284f" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.796829 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lmg7p" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.801445 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92-operator-scripts\") pod \"842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92\" (UID: \"842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92\") " Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.801482 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvr2t\" (UniqueName: \"kubernetes.io/projected/0dcf2e69-d684-4721-a6ad-bedf186a51a4-kube-api-access-zvr2t\") pod \"0dcf2e69-d684-4721-a6ad-bedf186a51a4\" (UID: \"0dcf2e69-d684-4721-a6ad-bedf186a51a4\") " Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.801602 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4bwc\" (UniqueName: \"kubernetes.io/projected/7954e0c8-9292-438a-b73f-c91df5746c02-kube-api-access-g4bwc\") pod \"7954e0c8-9292-438a-b73f-c91df5746c02\" (UID: \"7954e0c8-9292-438a-b73f-c91df5746c02\") " Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.801649 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7954e0c8-9292-438a-b73f-c91df5746c02-operator-scripts\") pod \"7954e0c8-9292-438a-b73f-c91df5746c02\" (UID: \"7954e0c8-9292-438a-b73f-c91df5746c02\") " Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.801675 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvzvg\" (UniqueName: \"kubernetes.io/projected/842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92-kube-api-access-zvzvg\") pod \"842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92\" (UID: \"842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92\") " Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.801758 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dcf2e69-d684-4721-a6ad-bedf186a51a4-operator-scripts\") pod \"0dcf2e69-d684-4721-a6ad-bedf186a51a4\" (UID: \"0dcf2e69-d684-4721-a6ad-bedf186a51a4\") " Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.803224 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7954e0c8-9292-438a-b73f-c91df5746c02-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7954e0c8-9292-438a-b73f-c91df5746c02" (UID: "7954e0c8-9292-438a-b73f-c91df5746c02"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.803313 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92" (UID: "842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.804449 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dcf2e69-d684-4721-a6ad-bedf186a51a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0dcf2e69-d684-4721-a6ad-bedf186a51a4" (UID: "0dcf2e69-d684-4721-a6ad-bedf186a51a4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.806657 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92-kube-api-access-zvzvg" (OuterVolumeSpecName: "kube-api-access-zvzvg") pod "842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92" (UID: "842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92"). InnerVolumeSpecName "kube-api-access-zvzvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.809049 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e639-account-create-update-n5ktm" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.814957 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e639-account-create-update-n5ktm" event={"ID":"842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92","Type":"ContainerDied","Data":"82dc65b151d56cdc4a26dae1de185a047afa5832a3f9abd70e1d7b12c899a55c"} Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.815006 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82dc65b151d56cdc4a26dae1de185a047afa5832a3f9abd70e1d7b12c899a55c" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.816435 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dcf2e69-d684-4721-a6ad-bedf186a51a4-kube-api-access-zvr2t" (OuterVolumeSpecName: "kube-api-access-zvr2t") pod "0dcf2e69-d684-4721-a6ad-bedf186a51a4" (UID: "0dcf2e69-d684-4721-a6ad-bedf186a51a4"). InnerVolumeSpecName "kube-api-access-zvr2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.832217 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7954e0c8-9292-438a-b73f-c91df5746c02-kube-api-access-g4bwc" (OuterVolumeSpecName: "kube-api-access-g4bwc") pod "7954e0c8-9292-438a-b73f-c91df5746c02" (UID: "7954e0c8-9292-438a-b73f-c91df5746c02"). InnerVolumeSpecName "kube-api-access-g4bwc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.833339 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-v4zmc" event={"ID":"10b17a94-81eb-4e72-bd49-97f590e26aec","Type":"ContainerDied","Data":"fad36a4a20181ba90ae025c17c1383d367688fb1c601d8220e18257773c28513"} Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.833393 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fad36a4a20181ba90ae025c17c1383d367688fb1c601d8220e18257773c28513" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.835620 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.835700 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-vzp24" event={"ID":"0dcf2e69-d684-4721-a6ad-bedf186a51a4","Type":"ContainerDied","Data":"4bb71d5e997748e86927ad52be58593d0c3870c517c8c6589a44e40bfc4f6359"} Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.835733 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bb71d5e997748e86927ad52be58593d0c3870c517c8c6589a44e40bfc4f6359" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.835800 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-vzp24" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.903062 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzwcr\" (UniqueName: \"kubernetes.io/projected/10b17a94-81eb-4e72-bd49-97f590e26aec-kube-api-access-kzwcr\") pod \"10b17a94-81eb-4e72-bd49-97f590e26aec\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.903413 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-combined-ca-bundle\") pod \"10b17a94-81eb-4e72-bd49-97f590e26aec\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.903544 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10b17a94-81eb-4e72-bd49-97f590e26aec-scripts\") pod \"10b17a94-81eb-4e72-bd49-97f590e26aec\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.903600 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/10b17a94-81eb-4e72-bd49-97f590e26aec-ring-data-devices\") pod \"10b17a94-81eb-4e72-bd49-97f590e26aec\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.907006 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b17a94-81eb-4e72-bd49-97f590e26aec-kube-api-access-kzwcr" (OuterVolumeSpecName: "kube-api-access-kzwcr") pod "10b17a94-81eb-4e72-bd49-97f590e26aec" (UID: "10b17a94-81eb-4e72-bd49-97f590e26aec"). InnerVolumeSpecName "kube-api-access-kzwcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.908055 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/10b17a94-81eb-4e72-bd49-97f590e26aec-etc-swift\") pod \"10b17a94-81eb-4e72-bd49-97f590e26aec\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.908170 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-swiftconf\") pod \"10b17a94-81eb-4e72-bd49-97f590e26aec\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.908211 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-dispersionconf\") pod \"10b17a94-81eb-4e72-bd49-97f590e26aec\" (UID: \"10b17a94-81eb-4e72-bd49-97f590e26aec\") " Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.910539 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvr2t\" (UniqueName: \"kubernetes.io/projected/0dcf2e69-d684-4721-a6ad-bedf186a51a4-kube-api-access-zvr2t\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.910560 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4bwc\" (UniqueName: \"kubernetes.io/projected/7954e0c8-9292-438a-b73f-c91df5746c02-kube-api-access-g4bwc\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.910572 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7954e0c8-9292-438a-b73f-c91df5746c02-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.910584 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvzvg\" (UniqueName: \"kubernetes.io/projected/842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92-kube-api-access-zvzvg\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.910599 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dcf2e69-d684-4721-a6ad-bedf186a51a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.910610 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzwcr\" (UniqueName: \"kubernetes.io/projected/10b17a94-81eb-4e72-bd49-97f590e26aec-kube-api-access-kzwcr\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.910621 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.909111 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b17a94-81eb-4e72-bd49-97f590e26aec-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "10b17a94-81eb-4e72-bd49-97f590e26aec" (UID: "10b17a94-81eb-4e72-bd49-97f590e26aec"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.910676 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b17a94-81eb-4e72-bd49-97f590e26aec-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "10b17a94-81eb-4e72-bd49-97f590e26aec" (UID: "10b17a94-81eb-4e72-bd49-97f590e26aec"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.938520 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d748498-cmr7m_2994750c-7f55-4708-856b-c9547e3f054c/console/0.log" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.938625 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.938958 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "10b17a94-81eb-4e72-bd49-97f590e26aec" (UID: "10b17a94-81eb-4e72-bd49-97f590e26aec"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.986111 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b17a94-81eb-4e72-bd49-97f590e26aec-scripts" (OuterVolumeSpecName: "scripts") pod "10b17a94-81eb-4e72-bd49-97f590e26aec" (UID: "10b17a94-81eb-4e72-bd49-97f590e26aec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:56 crc kubenswrapper[4723]: I0309 13:19:56.986203 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "10b17a94-81eb-4e72-bd49-97f590e26aec" (UID: "10b17a94-81eb-4e72-bd49-97f590e26aec"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.015483 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10b17a94-81eb-4e72-bd49-97f590e26aec" (UID: "10b17a94-81eb-4e72-bd49-97f590e26aec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.016042 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdpnb\" (UniqueName: \"kubernetes.io/projected/2994750c-7f55-4708-856b-c9547e3f054c-kube-api-access-vdpnb\") pod \"2994750c-7f55-4708-856b-c9547e3f054c\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.016388 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2994750c-7f55-4708-856b-c9547e3f054c-console-oauth-config\") pod \"2994750c-7f55-4708-856b-c9547e3f054c\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.016455 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-oauth-serving-cert\") pod \"2994750c-7f55-4708-856b-c9547e3f054c\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.016480 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-service-ca\") pod \"2994750c-7f55-4708-856b-c9547e3f054c\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.016513 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2994750c-7f55-4708-856b-c9547e3f054c-console-serving-cert\") pod \"2994750c-7f55-4708-856b-c9547e3f054c\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.016621 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-console-config\") pod \"2994750c-7f55-4708-856b-c9547e3f054c\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.016652 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-trusted-ca-bundle\") pod \"2994750c-7f55-4708-856b-c9547e3f054c\" (UID: \"2994750c-7f55-4708-856b-c9547e3f054c\") " Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.018092 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2994750c-7f55-4708-856b-c9547e3f054c" (UID: "2994750c-7f55-4708-856b-c9547e3f054c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.018153 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2994750c-7f55-4708-856b-c9547e3f054c" (UID: "2994750c-7f55-4708-856b-c9547e3f054c"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.018638 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-service-ca" (OuterVolumeSpecName: "service-ca") pod "2994750c-7f55-4708-856b-c9547e3f054c" (UID: "2994750c-7f55-4708-856b-c9547e3f054c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.019324 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-console-config" (OuterVolumeSpecName: "console-config") pod "2994750c-7f55-4708-856b-c9547e3f054c" (UID: "2994750c-7f55-4708-856b-c9547e3f054c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.032261 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.032433 4723 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.032447 4723 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.032457 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10b17a94-81eb-4e72-bd49-97f590e26aec-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.032468 4723 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/10b17a94-81eb-4e72-bd49-97f590e26aec-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.032478 4723 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/10b17a94-81eb-4e72-bd49-97f590e26aec-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.032487 4723 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.032496 4723 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2994750c-7f55-4708-856b-c9547e3f054c-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.032505 4723 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.032514 4723 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/10b17a94-81eb-4e72-bd49-97f590e26aec-dispersionconf\") on 
node \"crc\" DevicePath \"\"" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.033519 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2994750c-7f55-4708-856b-c9547e3f054c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2994750c-7f55-4708-856b-c9547e3f054c" (UID: "2994750c-7f55-4708-856b-c9547e3f054c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.036473 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2994750c-7f55-4708-856b-c9547e3f054c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2994750c-7f55-4708-856b-c9547e3f054c" (UID: "2994750c-7f55-4708-856b-c9547e3f054c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.045113 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2994750c-7f55-4708-856b-c9547e3f054c-kube-api-access-vdpnb" (OuterVolumeSpecName: "kube-api-access-vdpnb") pod "2994750c-7f55-4708-856b-c9547e3f054c" (UID: "2994750c-7f55-4708-856b-c9547e3f054c"). InnerVolumeSpecName "kube-api-access-vdpnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.135829 4723 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2994750c-7f55-4708-856b-c9547e3f054c-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.135855 4723 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2994750c-7f55-4708-856b-c9547e3f054c-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.135870 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdpnb\" (UniqueName: \"kubernetes.io/projected/2994750c-7f55-4708-856b-c9547e3f054c-kube-api-access-vdpnb\") on node \"crc\" DevicePath \"\"" Mar 09 13:19:57 crc kubenswrapper[4723]: W0309 13:19:57.477932 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36dc8fce_575c_4fe1_b4df_2ae47014bce7.slice/crio-8a797b643a3c9574840c4921f7490aa6f8368af031d2fb453a8474e89fcca488 WatchSource:0}: Error finding container 8a797b643a3c9574840c4921f7490aa6f8368af031d2fb453a8474e89fcca488: Status 404 returned error can't find the container with id 8a797b643a3c9574840c4921f7490aa6f8368af031d2fb453a8474e89fcca488 Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.487464 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1379-account-create-update-tw2zb"] Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.654759 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ghfww"] Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.873924 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a548a9c-33c8-4a35-a559-7290357170c1","Type":"ContainerStarted","Data":"fec818ed303f7ff19de3d6732cdf76158698dff52fb322f95cc2d95ed5fc3dac"} Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.922032 4723 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-6d748498-cmr7m_2994750c-7f55-4708-856b-c9547e3f054c/console/0.log" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.922120 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d748498-cmr7m" event={"ID":"2994750c-7f55-4708-856b-c9547e3f054c","Type":"ContainerDied","Data":"640e041ceb96605808c9acad14b9abfb33243803ee73577767597291fa592caa"} Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.922162 4723 scope.go:117] "RemoveContainer" containerID="f56b0d79b1e915f7a130679fa2da30ddd4a330859fc19d4a51c5569a1d76ecec" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.922299 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d748498-cmr7m" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.941652 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1379-account-create-update-tw2zb" event={"ID":"36dc8fce-575c-4fe1-b4df-2ae47014bce7","Type":"ContainerStarted","Data":"8a797b643a3c9574840c4921f7490aa6f8368af031d2fb453a8474e89fcca488"} Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.953343 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-v4zmc" Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.953497 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ghfww" event={"ID":"b743aece-4b06-44e2-9afa-fd075c0730d3","Type":"ContainerStarted","Data":"ed8d21eaa542959788c5c5d739736ca3ec5e2fde3e0c1089e8ab42f0acc3b427"} Mar 09 13:19:57 crc kubenswrapper[4723]: I0309 13:19:57.985821 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0679-account-create-update-xwhf9"] Mar 09 13:19:58 crc kubenswrapper[4723]: W0309 13:19:58.006504 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12ca1cc4_a262_42c5_ac51_eee86f7c9793.slice/crio-9f196fea01a51bd5dd8d449eed39aa3871725a82b571cddb583f65083cce6815 WatchSource:0}: Error finding container 9f196fea01a51bd5dd8d449eed39aa3871725a82b571cddb583f65083cce6815: Status 404 returned error can't find the container with id 9f196fea01a51bd5dd8d449eed39aa3871725a82b571cddb583f65083cce6815 Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.021777 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s8dhp"] Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.049258 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d748498-cmr7m"] Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.065754 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6d748498-cmr7m"] Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.079232 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5n52p" podUID="ea8d3865-305b-4ab6-833c-f8b227b6bae4" containerName="ovn-controller" probeResult="failure" output=< Mar 09 13:19:58 crc kubenswrapper[4723]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 09 13:19:58 crc kubenswrapper[4723]: > Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.079296 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.085268 4723 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/glance-db-create-gnr4x"] Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.102167 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-65x7z" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.349060 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5n52p-config-9bl8p"] Mar 09 13:19:58 crc kubenswrapper[4723]: E0309 13:19:58.349543 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b17a94-81eb-4e72-bd49-97f590e26aec" containerName="swift-ring-rebalance" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.349560 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b17a94-81eb-4e72-bd49-97f590e26aec" containerName="swift-ring-rebalance" Mar 09 13:19:58 crc kubenswrapper[4723]: E0309 13:19:58.349585 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92" containerName="mariadb-account-create-update" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.349594 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92" containerName="mariadb-account-create-update" Mar 09 13:19:58 crc kubenswrapper[4723]: E0309 13:19:58.349608 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2994750c-7f55-4708-856b-c9547e3f054c" containerName="console" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.349617 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="2994750c-7f55-4708-856b-c9547e3f054c" containerName="console" Mar 09 13:19:58 crc kubenswrapper[4723]: E0309 13:19:58.349632 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dcf2e69-d684-4721-a6ad-bedf186a51a4" containerName="mariadb-database-create" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.349639 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dcf2e69-d684-4721-a6ad-bedf186a51a4" containerName="mariadb-database-create" Mar 09 13:19:58 crc kubenswrapper[4723]: E0309 13:19:58.349657 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7954e0c8-9292-438a-b73f-c91df5746c02" containerName="mariadb-account-create-update" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.349664 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="7954e0c8-9292-438a-b73f-c91df5746c02" containerName="mariadb-account-create-update" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.349924 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="2994750c-7f55-4708-856b-c9547e3f054c" containerName="console" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.349942 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dcf2e69-d684-4721-a6ad-bedf186a51a4" containerName="mariadb-database-create" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.349950 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="10b17a94-81eb-4e72-bd49-97f590e26aec" containerName="swift-ring-rebalance" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.349958 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92" containerName="mariadb-account-create-update" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.349973 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="7954e0c8-9292-438a-b73f-c91df5746c02" containerName="mariadb-account-create-update" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 
13:19:58.350837 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.353883 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.367457 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5n52p-config-9bl8p"] Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.418626 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-run\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.418703 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-run-ovn\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.418747 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-log-ovn\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.418809 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51d9b835-e695-4401-9833-ae97bd15bf48-scripts\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.418861 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hhtq\" (UniqueName: \"kubernetes.io/projected/51d9b835-e695-4401-9833-ae97bd15bf48-kube-api-access-8hhtq\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.418904 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/51d9b835-e695-4401-9833-ae97bd15bf48-additional-scripts\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.521066 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hhtq\" (UniqueName: \"kubernetes.io/projected/51d9b835-e695-4401-9833-ae97bd15bf48-kube-api-access-8hhtq\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.521329 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/51d9b835-e695-4401-9833-ae97bd15bf48-additional-scripts\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.521530 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-run\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.521615 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-run-ovn\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.521704 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-log-ovn\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.521786 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-run\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.521904 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51d9b835-e695-4401-9833-ae97bd15bf48-scripts\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.521962 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-log-ovn\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.521922 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-run-ovn\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.522146 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/51d9b835-e695-4401-9833-ae97bd15bf48-additional-scripts\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.523729 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/51d9b835-e695-4401-9833-ae97bd15bf48-scripts\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.597912 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hhtq\" (UniqueName: \"kubernetes.io/projected/51d9b835-e695-4401-9833-ae97bd15bf48-kube-api-access-8hhtq\") pod \"ovn-controller-5n52p-config-9bl8p\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.667968 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.913583 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2994750c-7f55-4708-856b-c9547e3f054c" path="/var/lib/kubelet/pods/2994750c-7f55-4708-856b-c9547e3f054c/volumes" Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.969481 4723 generic.go:334] "Generic (PLEG): container finished" podID="b743aece-4b06-44e2-9afa-fd075c0730d3" containerID="110391c5dd41b539dcca3e4c33253b3a26882afeeb29854053cbd10c6ca6da5f" exitCode=0 Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.970349 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ghfww" event={"ID":"b743aece-4b06-44e2-9afa-fd075c0730d3","Type":"ContainerDied","Data":"110391c5dd41b539dcca3e4c33253b3a26882afeeb29854053cbd10c6ca6da5f"} Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.979887 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s8dhp" event={"ID":"ed454b22-e190-4bf0-8581-e71f2ce51324","Type":"ContainerStarted","Data":"ad0138a0acdda25ff6c624c810b9988f27e44004cadc70a91a20c7ca3d8ecdf2"} Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.979921 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s8dhp" event={"ID":"ed454b22-e190-4bf0-8581-e71f2ce51324","Type":"ContainerStarted","Data":"bf0cbe3b0e2850fe2d39e2d1e618e87ebe994464025f6f5130f9f4b3f09fdb4a"} Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.983070 4723 generic.go:334] "Generic (PLEG): container finished" podID="36dc8fce-575c-4fe1-b4df-2ae47014bce7" containerID="5afa4fed2a1908b51a8f6e6c3656194ebad26dc4b01dc4b4321f66c1be88249f" exitCode=0 Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.983162 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1379-account-create-update-tw2zb" event={"ID":"36dc8fce-575c-4fe1-b4df-2ae47014bce7","Type":"ContainerDied","Data":"5afa4fed2a1908b51a8f6e6c3656194ebad26dc4b01dc4b4321f66c1be88249f"} Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.988593 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0679-account-create-update-xwhf9" event={"ID":"12ca1cc4-a262-42c5-ac51-eee86f7c9793","Type":"ContainerStarted","Data":"7ef8c316d63f49cc25ce811f7a5e1c1a41fbd8c2bccf44124a352a41c6e50dd5"} Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.988765 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0679-account-create-update-xwhf9" event={"ID":"12ca1cc4-a262-42c5-ac51-eee86f7c9793","Type":"ContainerStarted","Data":"9f196fea01a51bd5dd8d449eed39aa3871725a82b571cddb583f65083cce6815"} Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.991488 4723 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gnr4x" event={"ID":"556c51ac-2052-4f8c-9c5b-830aacc68de0","Type":"ContainerStarted","Data":"30bc40218aec01158c7e6a84024929bc585de0e02283d0f997f94e7b0c46879b"} Mar 09 13:19:58 crc kubenswrapper[4723]: I0309 13:19:58.991537 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gnr4x" event={"ID":"556c51ac-2052-4f8c-9c5b-830aacc68de0","Type":"ContainerStarted","Data":"b81b74203ce2172f99cfff4e48b5956b05c8d528ef92f1482ed3f98791418153"} Mar 09 13:19:59 crc kubenswrapper[4723]: I0309 13:19:59.031743 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-s8dhp" podStartSLOduration=6.031725677 podStartE2EDuration="6.031725677s" podCreationTimestamp="2026-03-09 13:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:19:59.030087773 +0000 UTC m=+1273.044555323" watchObservedRunningTime="2026-03-09 13:19:59.031725677 +0000 UTC m=+1273.046193217" Mar 09 13:19:59 crc kubenswrapper[4723]: I0309 13:19:59.084342 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-0679-account-create-update-xwhf9" podStartSLOduration=7.084327208 podStartE2EDuration="7.084327208s" podCreationTimestamp="2026-03-09 13:19:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:19:59.074183038 +0000 UTC m=+1273.088650578" watchObservedRunningTime="2026-03-09 13:19:59.084327208 +0000 UTC m=+1273.098794738" Mar 09 13:19:59 crc kubenswrapper[4723]: I0309 13:19:59.108038 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-gnr4x" podStartSLOduration=7.108018149 podStartE2EDuration="7.108018149s" podCreationTimestamp="2026-03-09 13:19:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:19:59.102231854 +0000 UTC m=+1273.116699394" watchObservedRunningTime="2026-03-09 13:19:59.108018149 +0000 UTC m=+1273.122485699" Mar 09 13:19:59 crc kubenswrapper[4723]: I0309 13:19:59.247430 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5n52p-config-9bl8p"] Mar 09 13:19:59 crc kubenswrapper[4723]: I0309 13:19:59.265797 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="54210e7b-b34d-411d-93e1-e8cc3448c4b0" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 09 13:19:59 crc kubenswrapper[4723]: I0309 13:19:59.309433 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="daa528e2-bcd7-43a8-bfea-a0911b3020c5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 09 13:19:59 crc kubenswrapper[4723]: W0309 13:19:59.406958 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51d9b835_e695_4401_9833_ae97bd15bf48.slice/crio-e278c02313b0b58260c6c95a5fa97654dfcaa9cf9b7e528268177d47a6617df7 WatchSource:0}: Error finding container e278c02313b0b58260c6c95a5fa97654dfcaa9cf9b7e528268177d47a6617df7: Status 404 returned error can't find the container with id 
e278c02313b0b58260c6c95a5fa97654dfcaa9cf9b7e528268177d47a6617df7 Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.004274 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a548a9c-33c8-4a35-a559-7290357170c1","Type":"ContainerStarted","Data":"93be7c2aa47e3f408bbcd685aae67aa4d15f16e5d60ce6cb28fa72a13e03a121"} Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.006784 4723 generic.go:334] "Generic (PLEG): container finished" podID="ed454b22-e190-4bf0-8581-e71f2ce51324" containerID="ad0138a0acdda25ff6c624c810b9988f27e44004cadc70a91a20c7ca3d8ecdf2" exitCode=0 Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.006847 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s8dhp" event={"ID":"ed454b22-e190-4bf0-8581-e71f2ce51324","Type":"ContainerDied","Data":"ad0138a0acdda25ff6c624c810b9988f27e44004cadc70a91a20c7ca3d8ecdf2"} Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.012168 4723 generic.go:334] "Generic (PLEG): container finished" podID="12ca1cc4-a262-42c5-ac51-eee86f7c9793" containerID="7ef8c316d63f49cc25ce811f7a5e1c1a41fbd8c2bccf44124a352a41c6e50dd5" exitCode=0 Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.012219 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0679-account-create-update-xwhf9" event={"ID":"12ca1cc4-a262-42c5-ac51-eee86f7c9793","Type":"ContainerDied","Data":"7ef8c316d63f49cc25ce811f7a5e1c1a41fbd8c2bccf44124a352a41c6e50dd5"} Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.015192 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5n52p-config-9bl8p" event={"ID":"51d9b835-e695-4401-9833-ae97bd15bf48","Type":"ContainerStarted","Data":"2a8b8e6a52c0f4d91f7a644a265b41484dbc581fd88186f79df9170db828f221"} Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.015295 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5n52p-config-9bl8p" event={"ID":"51d9b835-e695-4401-9833-ae97bd15bf48","Type":"ContainerStarted","Data":"e278c02313b0b58260c6c95a5fa97654dfcaa9cf9b7e528268177d47a6617df7"} Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.017260 4723 generic.go:334] "Generic (PLEG): container finished" podID="556c51ac-2052-4f8c-9c5b-830aacc68de0" containerID="30bc40218aec01158c7e6a84024929bc585de0e02283d0f997f94e7b0c46879b" exitCode=0 Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.017324 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gnr4x" event={"ID":"556c51ac-2052-4f8c-9c5b-830aacc68de0","Type":"ContainerDied","Data":"30bc40218aec01158c7e6a84024929bc585de0e02283d0f997f94e7b0c46879b"} Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.086623 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-5n52p-config-9bl8p" podStartSLOduration=2.086600206 podStartE2EDuration="2.086600206s" podCreationTimestamp="2026-03-09 13:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:00.058155688 +0000 UTC m=+1274.072623228" watchObservedRunningTime="2026-03-09 13:20:00.086600206 +0000 UTC m=+1274.101067746" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.162009 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551040-pqgfx"] Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.163550 4723 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551040-pqgfx" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.168940 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.169088 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.169088 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.177790 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551040-pqgfx"] Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.269606 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr4gf\" (UniqueName: \"kubernetes.io/projected/f8cfb1f4-2f08-4850-a398-679a25dacc26-kube-api-access-cr4gf\") pod \"auto-csr-approver-29551040-pqgfx\" (UID: \"f8cfb1f4-2f08-4850-a398-679a25dacc26\") " pod="openshift-infra/auto-csr-approver-29551040-pqgfx" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.344072 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.347539 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.355849 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.371589 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr4gf\" (UniqueName: \"kubernetes.io/projected/f8cfb1f4-2f08-4850-a398-679a25dacc26-kube-api-access-cr4gf\") pod \"auto-csr-approver-29551040-pqgfx\" (UID: \"f8cfb1f4-2f08-4850-a398-679a25dacc26\") " pod="openshift-infra/auto-csr-approver-29551040-pqgfx" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.384766 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.405836 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr4gf\" (UniqueName: \"kubernetes.io/projected/f8cfb1f4-2f08-4850-a398-679a25dacc26-kube-api-access-cr4gf\") pod \"auto-csr-approver-29551040-pqgfx\" (UID: \"f8cfb1f4-2f08-4850-a398-679a25dacc26\") " pod="openshift-infra/auto-csr-approver-29551040-pqgfx" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.460772 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-lmg7p"] Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.473553 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnrkg\" (UniqueName: \"kubernetes.io/projected/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-kube-api-access-lnrkg\") pod \"mysqld-exporter-0\" (UID: \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\") " pod="openstack/mysqld-exporter-0" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.473682 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\") " pod="openstack/mysqld-exporter-0" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.473756 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-config-data\") pod \"mysqld-exporter-0\" (UID: \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\") " pod="openstack/mysqld-exporter-0" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.477047 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-lmg7p"] Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.514384 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551040-pqgfx" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.575494 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\") " pod="openstack/mysqld-exporter-0" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.575566 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-config-data\") pod \"mysqld-exporter-0\" (UID: \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\") " pod="openstack/mysqld-exporter-0" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.575669 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnrkg\" (UniqueName: \"kubernetes.io/projected/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-kube-api-access-lnrkg\") pod \"mysqld-exporter-0\" (UID: \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\") " pod="openstack/mysqld-exporter-0" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.588957 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\") " pod="openstack/mysqld-exporter-0" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.590527 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-config-data\") pod \"mysqld-exporter-0\" (UID: \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\") " pod="openstack/mysqld-exporter-0" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.612017 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnrkg\" (UniqueName: \"kubernetes.io/projected/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-kube-api-access-lnrkg\") pod \"mysqld-exporter-0\" (UID: \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\") " pod="openstack/mysqld-exporter-0" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.689403 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.712344 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1379-account-create-update-tw2zb" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.780789 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szpm7\" (UniqueName: \"kubernetes.io/projected/36dc8fce-575c-4fe1-b4df-2ae47014bce7-kube-api-access-szpm7\") pod \"36dc8fce-575c-4fe1-b4df-2ae47014bce7\" (UID: \"36dc8fce-575c-4fe1-b4df-2ae47014bce7\") " Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.780863 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36dc8fce-575c-4fe1-b4df-2ae47014bce7-operator-scripts\") pod \"36dc8fce-575c-4fe1-b4df-2ae47014bce7\" (UID: \"36dc8fce-575c-4fe1-b4df-2ae47014bce7\") " Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.781999 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36dc8fce-575c-4fe1-b4df-2ae47014bce7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36dc8fce-575c-4fe1-b4df-2ae47014bce7" (UID: "36dc8fce-575c-4fe1-b4df-2ae47014bce7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.789770 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36dc8fce-575c-4fe1-b4df-2ae47014bce7-kube-api-access-szpm7" (OuterVolumeSpecName: "kube-api-access-szpm7") pod "36dc8fce-575c-4fe1-b4df-2ae47014bce7" (UID: "36dc8fce-575c-4fe1-b4df-2ae47014bce7"). InnerVolumeSpecName "kube-api-access-szpm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.863940 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ghfww" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.883704 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36dc8fce-575c-4fe1-b4df-2ae47014bce7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.883727 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szpm7\" (UniqueName: \"kubernetes.io/projected/36dc8fce-575c-4fe1-b4df-2ae47014bce7-kube-api-access-szpm7\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.917039 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7954e0c8-9292-438a-b73f-c91df5746c02" path="/var/lib/kubelet/pods/7954e0c8-9292-438a-b73f-c91df5746c02/volumes" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.985504 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bftr\" (UniqueName: \"kubernetes.io/projected/b743aece-4b06-44e2-9afa-fd075c0730d3-kube-api-access-5bftr\") pod \"b743aece-4b06-44e2-9afa-fd075c0730d3\" (UID: \"b743aece-4b06-44e2-9afa-fd075c0730d3\") " Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.985554 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b743aece-4b06-44e2-9afa-fd075c0730d3-operator-scripts\") pod \"b743aece-4b06-44e2-9afa-fd075c0730d3\" (UID: \"b743aece-4b06-44e2-9afa-fd075c0730d3\") " Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.990063 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b743aece-4b06-44e2-9afa-fd075c0730d3-kube-api-access-5bftr" (OuterVolumeSpecName: "kube-api-access-5bftr") pod "b743aece-4b06-44e2-9afa-fd075c0730d3" (UID: "b743aece-4b06-44e2-9afa-fd075c0730d3"). InnerVolumeSpecName "kube-api-access-5bftr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:00 crc kubenswrapper[4723]: I0309 13:20:00.993598 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b743aece-4b06-44e2-9afa-fd075c0730d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b743aece-4b06-44e2-9afa-fd075c0730d3" (UID: "b743aece-4b06-44e2-9afa-fd075c0730d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.029677 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1379-account-create-update-tw2zb" event={"ID":"36dc8fce-575c-4fe1-b4df-2ae47014bce7","Type":"ContainerDied","Data":"8a797b643a3c9574840c4921f7490aa6f8368af031d2fb453a8474e89fcca488"} Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.029715 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a797b643a3c9574840c4921f7490aa6f8368af031d2fb453a8474e89fcca488" Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.029761 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1379-account-create-update-tw2zb" Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.036374 4723 generic.go:334] "Generic (PLEG): container finished" podID="51d9b835-e695-4401-9833-ae97bd15bf48" containerID="2a8b8e6a52c0f4d91f7a644a265b41484dbc581fd88186f79df9170db828f221" exitCode=0 Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.036426 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5n52p-config-9bl8p" event={"ID":"51d9b835-e695-4401-9833-ae97bd15bf48","Type":"ContainerDied","Data":"2a8b8e6a52c0f4d91f7a644a265b41484dbc581fd88186f79df9170db828f221"} Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.055370 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ghfww" Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.056972 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ghfww" event={"ID":"b743aece-4b06-44e2-9afa-fd075c0730d3","Type":"ContainerDied","Data":"ed8d21eaa542959788c5c5d739736ca3ec5e2fde3e0c1089e8ab42f0acc3b427"} Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.057019 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed8d21eaa542959788c5c5d739736ca3ec5e2fde3e0c1089e8ab42f0acc3b427" Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.088368 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bftr\" (UniqueName: \"kubernetes.io/projected/b743aece-4b06-44e2-9afa-fd075c0730d3-kube-api-access-5bftr\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.088398 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b743aece-4b06-44e2-9afa-fd075c0730d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:01 crc kubenswrapper[4723]: E0309 13:20:01.190498 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36dc8fce_575c_4fe1_b4df_2ae47014bce7.slice\": RecentStats: unable to find data in memory cache]" Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.391443 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551040-pqgfx"] Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.400894 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.874282 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s8dhp" Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.878268 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gnr4x" Mar 09 13:20:01 crc kubenswrapper[4723]: I0309 13:20:01.898155 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0679-account-create-update-xwhf9" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.020405 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12ca1cc4-a262-42c5-ac51-eee86f7c9793-operator-scripts\") pod \"12ca1cc4-a262-42c5-ac51-eee86f7c9793\" (UID: \"12ca1cc4-a262-42c5-ac51-eee86f7c9793\") " Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.020612 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cxtg\" (UniqueName: \"kubernetes.io/projected/ed454b22-e190-4bf0-8581-e71f2ce51324-kube-api-access-7cxtg\") pod \"ed454b22-e190-4bf0-8581-e71f2ce51324\" (UID: \"ed454b22-e190-4bf0-8581-e71f2ce51324\") " Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.020755 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nbql\" (UniqueName: \"kubernetes.io/projected/12ca1cc4-a262-42c5-ac51-eee86f7c9793-kube-api-access-8nbql\") pod \"12ca1cc4-a262-42c5-ac51-eee86f7c9793\" (UID: \"12ca1cc4-a262-42c5-ac51-eee86f7c9793\") " Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.020779 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed454b22-e190-4bf0-8581-e71f2ce51324-operator-scripts\") pod \"ed454b22-e190-4bf0-8581-e71f2ce51324\" (UID: \"ed454b22-e190-4bf0-8581-e71f2ce51324\") " Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.020803 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/556c51ac-2052-4f8c-9c5b-830aacc68de0-operator-scripts\") pod \"556c51ac-2052-4f8c-9c5b-830aacc68de0\" (UID: \"556c51ac-2052-4f8c-9c5b-830aacc68de0\") " Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.020846 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5n6w\" (UniqueName: \"kubernetes.io/projected/556c51ac-2052-4f8c-9c5b-830aacc68de0-kube-api-access-s5n6w\") pod \"556c51ac-2052-4f8c-9c5b-830aacc68de0\" (UID: \"556c51ac-2052-4f8c-9c5b-830aacc68de0\") " Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.021480 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/556c51ac-2052-4f8c-9c5b-830aacc68de0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "556c51ac-2052-4f8c-9c5b-830aacc68de0" (UID: "556c51ac-2052-4f8c-9c5b-830aacc68de0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.021552 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12ca1cc4-a262-42c5-ac51-eee86f7c9793-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12ca1cc4-a262-42c5-ac51-eee86f7c9793" (UID: "12ca1cc4-a262-42c5-ac51-eee86f7c9793"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.022093 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed454b22-e190-4bf0-8581-e71f2ce51324-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed454b22-e190-4bf0-8581-e71f2ce51324" (UID: "ed454b22-e190-4bf0-8581-e71f2ce51324"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.022758 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12ca1cc4-a262-42c5-ac51-eee86f7c9793-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.022783 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed454b22-e190-4bf0-8581-e71f2ce51324-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.022794 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/556c51ac-2052-4f8c-9c5b-830aacc68de0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.029230 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/556c51ac-2052-4f8c-9c5b-830aacc68de0-kube-api-access-s5n6w" (OuterVolumeSpecName: "kube-api-access-s5n6w") pod "556c51ac-2052-4f8c-9c5b-830aacc68de0" (UID: "556c51ac-2052-4f8c-9c5b-830aacc68de0"). InnerVolumeSpecName "kube-api-access-s5n6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.032217 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ca1cc4-a262-42c5-ac51-eee86f7c9793-kube-api-access-8nbql" (OuterVolumeSpecName: "kube-api-access-8nbql") pod "12ca1cc4-a262-42c5-ac51-eee86f7c9793" (UID: "12ca1cc4-a262-42c5-ac51-eee86f7c9793"). InnerVolumeSpecName "kube-api-access-8nbql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.032257 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed454b22-e190-4bf0-8581-e71f2ce51324-kube-api-access-7cxtg" (OuterVolumeSpecName: "kube-api-access-7cxtg") pod "ed454b22-e190-4bf0-8581-e71f2ce51324" (UID: "ed454b22-e190-4bf0-8581-e71f2ce51324"). InnerVolumeSpecName "kube-api-access-7cxtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.067701 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s8dhp" event={"ID":"ed454b22-e190-4bf0-8581-e71f2ce51324","Type":"ContainerDied","Data":"bf0cbe3b0e2850fe2d39e2d1e618e87ebe994464025f6f5130f9f4b3f09fdb4a"} Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.067737 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf0cbe3b0e2850fe2d39e2d1e618e87ebe994464025f6f5130f9f4b3f09fdb4a" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.067740 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s8dhp" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.069818 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9","Type":"ContainerStarted","Data":"251f1342a8675c3b964480b4e6450a82d9648bbc987af5ab0ff0fcf24f0e6189"} Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.071503 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0679-account-create-update-xwhf9" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.071666 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0679-account-create-update-xwhf9" event={"ID":"12ca1cc4-a262-42c5-ac51-eee86f7c9793","Type":"ContainerDied","Data":"9f196fea01a51bd5dd8d449eed39aa3871725a82b571cddb583f65083cce6815"} Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.071788 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f196fea01a51bd5dd8d449eed39aa3871725a82b571cddb583f65083cce6815" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.092609 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gnr4x" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.093716 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gnr4x" event={"ID":"556c51ac-2052-4f8c-9c5b-830aacc68de0","Type":"ContainerDied","Data":"b81b74203ce2172f99cfff4e48b5956b05c8d528ef92f1482ed3f98791418153"} Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.093751 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b81b74203ce2172f99cfff4e48b5956b05c8d528ef92f1482ed3f98791418153" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.114507 4723 generic.go:334] "Generic (PLEG): container finished" podID="19f91c12-b482-46ab-a6e1-20164abe2ee4" containerID="1619049fc971a5f761213226bb2e8e6badaa3b8e5bc14d02cccc57f4fa2faf21" exitCode=0 Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.114601 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19f91c12-b482-46ab-a6e1-20164abe2ee4","Type":"ContainerDied","Data":"1619049fc971a5f761213226bb2e8e6badaa3b8e5bc14d02cccc57f4fa2faf21"} Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.121422 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551040-pqgfx" event={"ID":"f8cfb1f4-2f08-4850-a398-679a25dacc26","Type":"ContainerStarted","Data":"bc4d7c3389c92019b417f552be89e608721026f6655edb118280eeb732e64a21"} Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.125637 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nbql\" (UniqueName: \"kubernetes.io/projected/12ca1cc4-a262-42c5-ac51-eee86f7c9793-kube-api-access-8nbql\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.125684 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5n6w\" (UniqueName: \"kubernetes.io/projected/556c51ac-2052-4f8c-9c5b-830aacc68de0-kube-api-access-s5n6w\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.125696 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cxtg\" (UniqueName: \"kubernetes.io/projected/ed454b22-e190-4bf0-8581-e71f2ce51324-kube-api-access-7cxtg\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:02 crc kubenswrapper[4723]: I0309 13:20:02.984168 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-5n52p" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.019757 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.054448 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-log-ovn\") pod \"51d9b835-e695-4401-9833-ae97bd15bf48\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.054538 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-run-ovn\") pod \"51d9b835-e695-4401-9833-ae97bd15bf48\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.054742 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-run\") pod \"51d9b835-e695-4401-9833-ae97bd15bf48\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.054801 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hhtq\" (UniqueName: \"kubernetes.io/projected/51d9b835-e695-4401-9833-ae97bd15bf48-kube-api-access-8hhtq\") pod \"51d9b835-e695-4401-9833-ae97bd15bf48\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.054852 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/51d9b835-e695-4401-9833-ae97bd15bf48-additional-scripts\") pod \"51d9b835-e695-4401-9833-ae97bd15bf48\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.054981 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51d9b835-e695-4401-9833-ae97bd15bf48-scripts\") pod \"51d9b835-e695-4401-9833-ae97bd15bf48\" (UID: \"51d9b835-e695-4401-9833-ae97bd15bf48\") " Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.056299 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "51d9b835-e695-4401-9833-ae97bd15bf48" (UID: "51d9b835-e695-4401-9833-ae97bd15bf48"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.056322 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "51d9b835-e695-4401-9833-ae97bd15bf48" (UID: "51d9b835-e695-4401-9833-ae97bd15bf48"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.056336 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-run" (OuterVolumeSpecName: "var-run") pod "51d9b835-e695-4401-9833-ae97bd15bf48" (UID: "51d9b835-e695-4401-9833-ae97bd15bf48"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.057926 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d9b835-e695-4401-9833-ae97bd15bf48-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "51d9b835-e695-4401-9833-ae97bd15bf48" (UID: "51d9b835-e695-4401-9833-ae97bd15bf48"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.058681 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d9b835-e695-4401-9833-ae97bd15bf48-scripts" (OuterVolumeSpecName: "scripts") pod "51d9b835-e695-4401-9833-ae97bd15bf48" (UID: "51d9b835-e695-4401-9833-ae97bd15bf48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.092304 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d9b835-e695-4401-9833-ae97bd15bf48-kube-api-access-8hhtq" (OuterVolumeSpecName: "kube-api-access-8hhtq") pod "51d9b835-e695-4401-9833-ae97bd15bf48" (UID: "51d9b835-e695-4401-9833-ae97bd15bf48"). InnerVolumeSpecName "kube-api-access-8hhtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.141309 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5n52p-config-9bl8p" event={"ID":"51d9b835-e695-4401-9833-ae97bd15bf48","Type":"ContainerDied","Data":"e278c02313b0b58260c6c95a5fa97654dfcaa9cf9b7e528268177d47a6617df7"} Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.141361 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e278c02313b0b58260c6c95a5fa97654dfcaa9cf9b7e528268177d47a6617df7" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.141378 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5n52p-config-9bl8p" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.166970 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hhtq\" (UniqueName: \"kubernetes.io/projected/51d9b835-e695-4401-9833-ae97bd15bf48-kube-api-access-8hhtq\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.167035 4723 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/51d9b835-e695-4401-9833-ae97bd15bf48-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.167051 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51d9b835-e695-4401-9833-ae97bd15bf48-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.167062 4723 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.167075 4723 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.167089 4723 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/51d9b835-e695-4401-9833-ae97bd15bf48-var-run\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.225289 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5n52p-config-9bl8p"] Mar 09 13:20:03 crc kubenswrapper[4723]: I0309 13:20:03.241473 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-5n52p-config-9bl8p"] Mar 09 13:20:04 crc kubenswrapper[4723]: I0309 13:20:04.153765 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19f91c12-b482-46ab-a6e1-20164abe2ee4","Type":"ContainerStarted","Data":"74f1b39e23aa539fa8c875c3e083fe7c0fa27ba57b3d4335d1fcab549b59bfe5"} Mar 09 13:20:04 crc kubenswrapper[4723]: I0309 13:20:04.154982 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 09 13:20:04 crc kubenswrapper[4723]: I0309 13:20:04.186296 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371959.668507 podStartE2EDuration="1m17.186269213s" podCreationTimestamp="2026-03-09 13:18:47 +0000 UTC" firstStartedPulling="2026-03-09 13:18:51.099711645 +0000 UTC m=+1205.114179185" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:04.181809944 +0000 UTC m=+1278.196277474" watchObservedRunningTime="2026-03-09 13:20:04.186269213 +0000 UTC m=+1278.200736753" Mar 09 13:20:04 crc kubenswrapper[4723]: I0309 13:20:04.893425 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d9b835-e695-4401-9833-ae97bd15bf48" path="/var/lib/kubelet/pods/51d9b835-e695-4401-9833-ae97bd15bf48/volumes" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.430069 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-fgmft"] Mar 09 13:20:05 crc kubenswrapper[4723]: E0309 13:20:05.430896 4723 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36dc8fce-575c-4fe1-b4df-2ae47014bce7" containerName="mariadb-account-create-update" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.430912 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="36dc8fce-575c-4fe1-b4df-2ae47014bce7" containerName="mariadb-account-create-update" Mar 09 13:20:05 crc kubenswrapper[4723]: E0309 13:20:05.430928 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556c51ac-2052-4f8c-9c5b-830aacc68de0" containerName="mariadb-database-create" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.430934 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="556c51ac-2052-4f8c-9c5b-830aacc68de0" containerName="mariadb-database-create" Mar 09 13:20:05 crc kubenswrapper[4723]: E0309 13:20:05.430967 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed454b22-e190-4bf0-8581-e71f2ce51324" containerName="mariadb-database-create" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.430973 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed454b22-e190-4bf0-8581-e71f2ce51324" containerName="mariadb-database-create" Mar 09 13:20:05 crc kubenswrapper[4723]: E0309 13:20:05.430987 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d9b835-e695-4401-9833-ae97bd15bf48" containerName="ovn-config" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.430994 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d9b835-e695-4401-9833-ae97bd15bf48" containerName="ovn-config" Mar 09 13:20:05 crc kubenswrapper[4723]: E0309 13:20:05.431005 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b743aece-4b06-44e2-9afa-fd075c0730d3" containerName="mariadb-database-create" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.431013 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="b743aece-4b06-44e2-9afa-fd075c0730d3" containerName="mariadb-database-create" Mar 09 13:20:05 crc kubenswrapper[4723]: E0309 13:20:05.431040 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ca1cc4-a262-42c5-ac51-eee86f7c9793" containerName="mariadb-account-create-update" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.431046 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ca1cc4-a262-42c5-ac51-eee86f7c9793" containerName="mariadb-account-create-update" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.431292 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="36dc8fce-575c-4fe1-b4df-2ae47014bce7" containerName="mariadb-account-create-update" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.431307 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed454b22-e190-4bf0-8581-e71f2ce51324" containerName="mariadb-database-create" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.431320 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="b743aece-4b06-44e2-9afa-fd075c0730d3" containerName="mariadb-database-create" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.431345 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ca1cc4-a262-42c5-ac51-eee86f7c9793" containerName="mariadb-account-create-update" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.431356 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="556c51ac-2052-4f8c-9c5b-830aacc68de0" containerName="mariadb-database-create" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.431366 4723 
memory_manager.go:354] "RemoveStaleState removing state" podUID="51d9b835-e695-4401-9833-ae97bd15bf48" containerName="ovn-config" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.432320 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fgmft" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.435111 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.442590 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fgmft"] Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.518751 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v6wn\" (UniqueName: \"kubernetes.io/projected/7f6aa2e5-8424-44e1-a9e0-247a8ff42676-kube-api-access-6v6wn\") pod \"root-account-create-update-fgmft\" (UID: \"7f6aa2e5-8424-44e1-a9e0-247a8ff42676\") " pod="openstack/root-account-create-update-fgmft" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.519029 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6aa2e5-8424-44e1-a9e0-247a8ff42676-operator-scripts\") pod \"root-account-create-update-fgmft\" (UID: \"7f6aa2e5-8424-44e1-a9e0-247a8ff42676\") " pod="openstack/root-account-create-update-fgmft" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.620661 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v6wn\" (UniqueName: \"kubernetes.io/projected/7f6aa2e5-8424-44e1-a9e0-247a8ff42676-kube-api-access-6v6wn\") pod \"root-account-create-update-fgmft\" (UID: \"7f6aa2e5-8424-44e1-a9e0-247a8ff42676\") " pod="openstack/root-account-create-update-fgmft" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.620720 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6aa2e5-8424-44e1-a9e0-247a8ff42676-operator-scripts\") pod \"root-account-create-update-fgmft\" (UID: \"7f6aa2e5-8424-44e1-a9e0-247a8ff42676\") " pod="openstack/root-account-create-update-fgmft" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.621644 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6aa2e5-8424-44e1-a9e0-247a8ff42676-operator-scripts\") pod \"root-account-create-update-fgmft\" (UID: \"7f6aa2e5-8424-44e1-a9e0-247a8ff42676\") " pod="openstack/root-account-create-update-fgmft" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.640727 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v6wn\" (UniqueName: \"kubernetes.io/projected/7f6aa2e5-8424-44e1-a9e0-247a8ff42676-kube-api-access-6v6wn\") pod \"root-account-create-update-fgmft\" (UID: \"7f6aa2e5-8424-44e1-a9e0-247a8ff42676\") " pod="openstack/root-account-create-update-fgmft" Mar 09 13:20:05 crc kubenswrapper[4723]: I0309 13:20:05.765169 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fgmft" Mar 09 13:20:06 crc kubenswrapper[4723]: I0309 13:20:06.177969 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a548a9c-33c8-4a35-a559-7290357170c1","Type":"ContainerStarted","Data":"9d64a1f732a34f9be6cb568cf2f4993cfdb876b19a6ea084a842261ffb1bcfcc"} Mar 09 13:20:06 crc kubenswrapper[4723]: I0309 13:20:06.181008 4723 generic.go:334] "Generic (PLEG): container finished" podID="f8cfb1f4-2f08-4850-a398-679a25dacc26" containerID="bc25c8c5930c5c5f007ae41650a217c3741dcf849edb5962c108af4745085d35" exitCode=0 Mar 09 13:20:06 crc kubenswrapper[4723]: I0309 13:20:06.181058 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551040-pqgfx" event={"ID":"f8cfb1f4-2f08-4850-a398-679a25dacc26","Type":"ContainerDied","Data":"bc25c8c5930c5c5f007ae41650a217c3741dcf849edb5962c108af4745085d35"} Mar 09 13:20:06 crc kubenswrapper[4723]: I0309 13:20:06.183302 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9","Type":"ContainerStarted","Data":"3564f49b4f404d76e803fe48113d380ef0fa7e1f2c45ba842d9ab8158d0e505e"} Mar 09 13:20:06 crc kubenswrapper[4723]: I0309 13:20:06.211984 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.68023369 podStartE2EDuration="34.211953235s" podCreationTimestamp="2026-03-09 13:19:32 +0000 UTC" firstStartedPulling="2026-03-09 13:19:45.462129792 +0000 UTC m=+1259.476597332" lastFinishedPulling="2026-03-09 13:20:04.993849337 +0000 UTC m=+1279.008316877" observedRunningTime="2026-03-09 13:20:06.208431191 +0000 UTC m=+1280.222898751" watchObservedRunningTime="2026-03-09 13:20:06.211953235 +0000 UTC m=+1280.226420855" Mar 09 13:20:06 crc kubenswrapper[4723]: I0309 13:20:06.255738 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.667018171 podStartE2EDuration="6.255713111s" podCreationTimestamp="2026-03-09 13:20:00 +0000 UTC" firstStartedPulling="2026-03-09 13:20:01.403523724 +0000 UTC m=+1275.417991264" lastFinishedPulling="2026-03-09 13:20:04.992218674 +0000 UTC m=+1279.006686204" observedRunningTime="2026-03-09 13:20:06.234387993 +0000 UTC m=+1280.248855533" watchObservedRunningTime="2026-03-09 13:20:06.255713111 +0000 UTC m=+1280.270180651" Mar 09 13:20:06 crc kubenswrapper[4723]: I0309 13:20:06.276888 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fgmft"] Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.198725 4723 generic.go:334] "Generic (PLEG): container finished" podID="7f6aa2e5-8424-44e1-a9e0-247a8ff42676" containerID="e0f5a6fb678696ddc6aee2ea2830a0afafb5c962f33dfbdf8b179e7e41d0277a" exitCode=0 Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.198773 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fgmft" event={"ID":"7f6aa2e5-8424-44e1-a9e0-247a8ff42676","Type":"ContainerDied","Data":"e0f5a6fb678696ddc6aee2ea2830a0afafb5c962f33dfbdf8b179e7e41d0277a"} Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.199069 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fgmft" 
event={"ID":"7f6aa2e5-8424-44e1-a9e0-247a8ff42676","Type":"ContainerStarted","Data":"d74af65ebb19eabb7c4f8ab093dea129dc2bbd7fd33ca0a894b5d535d1171c39"} Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.579691 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-b5gc4"] Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.581983 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.591220 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.591746 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ncx5z" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.604364 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-b5gc4"] Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.668548 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.677704 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d924133b-d3c9-4b71-bbf4-a894a618e6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"d924133b-d3c9-4b71-bbf4-a894a618e6c4\") " pod="openstack/swift-storage-0" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.690429 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.770148 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-combined-ca-bundle\") pod \"glance-db-sync-b5gc4\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.770228 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hm9c\" (UniqueName: \"kubernetes.io/projected/72dc18cb-be01-4378-b62c-609a2c237731-kube-api-access-5hm9c\") pod \"glance-db-sync-b5gc4\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.770254 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-db-sync-config-data\") pod \"glance-db-sync-b5gc4\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.770349 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-config-data\") pod \"glance-db-sync-b5gc4\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.861585 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.865899 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551040-pqgfx" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.873621 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hm9c\" (UniqueName: \"kubernetes.io/projected/72dc18cb-be01-4378-b62c-609a2c237731-kube-api-access-5hm9c\") pod \"glance-db-sync-b5gc4\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.873656 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-db-sync-config-data\") pod \"glance-db-sync-b5gc4\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.873734 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-config-data\") pod \"glance-db-sync-b5gc4\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.873836 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-combined-ca-bundle\") pod \"glance-db-sync-b5gc4\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.878931 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-combined-ca-bundle\") pod \"glance-db-sync-b5gc4\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.878942 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-db-sync-config-data\") pod \"glance-db-sync-b5gc4\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.879189 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-config-data\") pod \"glance-db-sync-b5gc4\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.912277 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hm9c\" (UniqueName: \"kubernetes.io/projected/72dc18cb-be01-4378-b62c-609a2c237731-kube-api-access-5hm9c\") pod \"glance-db-sync-b5gc4\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.918544 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.976435 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr4gf\" (UniqueName: \"kubernetes.io/projected/f8cfb1f4-2f08-4850-a398-679a25dacc26-kube-api-access-cr4gf\") pod \"f8cfb1f4-2f08-4850-a398-679a25dacc26\" (UID: \"f8cfb1f4-2f08-4850-a398-679a25dacc26\") " Mar 09 13:20:07 crc kubenswrapper[4723]: I0309 13:20:07.989776 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8cfb1f4-2f08-4850-a398-679a25dacc26-kube-api-access-cr4gf" (OuterVolumeSpecName: "kube-api-access-cr4gf") pod "f8cfb1f4-2f08-4850-a398-679a25dacc26" (UID: "f8cfb1f4-2f08-4850-a398-679a25dacc26"). InnerVolumeSpecName "kube-api-access-cr4gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.082803 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr4gf\" (UniqueName: \"kubernetes.io/projected/f8cfb1f4-2f08-4850-a398-679a25dacc26-kube-api-access-cr4gf\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.230297 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551040-pqgfx" Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.235094 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551040-pqgfx" event={"ID":"f8cfb1f4-2f08-4850-a398-679a25dacc26","Type":"ContainerDied","Data":"bc4d7c3389c92019b417f552be89e608721026f6655edb118280eeb732e64a21"} Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.238007 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc4d7c3389c92019b417f552be89e608721026f6655edb118280eeb732e64a21" Mar 09 13:20:08 crc kubenswrapper[4723]: W0309 13:20:08.387702 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd924133b_d3c9_4b71_bbf4_a894a618e6c4.slice/crio-be97369d59b9b7c3c8380976366d70846a8d0b143cbcee7b7e58a4192cfa9ee1 WatchSource:0}: Error finding container be97369d59b9b7c3c8380976366d70846a8d0b143cbcee7b7e58a4192cfa9ee1: Status 404 returned error can't find the container with id be97369d59b9b7c3c8380976366d70846a8d0b143cbcee7b7e58a4192cfa9ee1 Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.398095 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 09 13:20:08 crc kubenswrapper[4723]: W0309 13:20:08.562025 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72dc18cb_be01_4378_b62c_609a2c237731.slice/crio-0ce831fca7ae9078cd87d075655dc30a1957cba71962161555758a3a14bfc7a6 WatchSource:0}: Error finding container 0ce831fca7ae9078cd87d075655dc30a1957cba71962161555758a3a14bfc7a6: Status 404 returned error can't find the container with id 0ce831fca7ae9078cd87d075655dc30a1957cba71962161555758a3a14bfc7a6 Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.562120 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-b5gc4"] Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.628605 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fgmft" Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.681714 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.715638 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v6wn\" (UniqueName: \"kubernetes.io/projected/7f6aa2e5-8424-44e1-a9e0-247a8ff42676-kube-api-access-6v6wn\") pod \"7f6aa2e5-8424-44e1-a9e0-247a8ff42676\" (UID: \"7f6aa2e5-8424-44e1-a9e0-247a8ff42676\") " Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.716610 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6aa2e5-8424-44e1-a9e0-247a8ff42676-operator-scripts\") pod \"7f6aa2e5-8424-44e1-a9e0-247a8ff42676\" (UID: \"7f6aa2e5-8424-44e1-a9e0-247a8ff42676\") " Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.717807 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f6aa2e5-8424-44e1-a9e0-247a8ff42676-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f6aa2e5-8424-44e1-a9e0-247a8ff42676" (UID: "7f6aa2e5-8424-44e1-a9e0-247a8ff42676"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.735230 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f6aa2e5-8424-44e1-a9e0-247a8ff42676-kube-api-access-6v6wn" (OuterVolumeSpecName: "kube-api-access-6v6wn") pod "7f6aa2e5-8424-44e1-a9e0-247a8ff42676" (UID: "7f6aa2e5-8424-44e1-a9e0-247a8ff42676"). InnerVolumeSpecName "kube-api-access-6v6wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.821598 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v6wn\" (UniqueName: \"kubernetes.io/projected/7f6aa2e5-8424-44e1-a9e0-247a8ff42676-kube-api-access-6v6wn\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.821637 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6aa2e5-8424-44e1-a9e0-247a8ff42676-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.951750 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551034-7c7r4"] Mar 09 13:20:08 crc kubenswrapper[4723]: I0309 13:20:08.961593 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551034-7c7r4"] Mar 09 13:20:09 crc kubenswrapper[4723]: I0309 13:20:09.239387 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b5gc4" event={"ID":"72dc18cb-be01-4378-b62c-609a2c237731","Type":"ContainerStarted","Data":"0ce831fca7ae9078cd87d075655dc30a1957cba71962161555758a3a14bfc7a6"} Mar 09 13:20:09 crc kubenswrapper[4723]: I0309 13:20:09.241004 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"be97369d59b9b7c3c8380976366d70846a8d0b143cbcee7b7e58a4192cfa9ee1"} Mar 09 13:20:09 crc kubenswrapper[4723]: I0309 13:20:09.243342 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fgmft" event={"ID":"7f6aa2e5-8424-44e1-a9e0-247a8ff42676","Type":"ContainerDied","Data":"d74af65ebb19eabb7c4f8ab093dea129dc2bbd7fd33ca0a894b5d535d1171c39"} Mar 09 13:20:09 crc kubenswrapper[4723]: I0309 13:20:09.243381 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d74af65ebb19eabb7c4f8ab093dea129dc2bbd7fd33ca0a894b5d535d1171c39" Mar 09 13:20:09 crc kubenswrapper[4723]: I0309 13:20:09.243418 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fgmft" Mar 09 13:20:09 crc kubenswrapper[4723]: I0309 13:20:09.263088 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 09 13:20:09 crc kubenswrapper[4723]: I0309 13:20:09.307956 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="daa528e2-bcd7-43a8-bfea-a0911b3020c5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 09 13:20:10 crc kubenswrapper[4723]: I0309 13:20:10.264941 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"d915271e28f380bdbda2dc07fdab93ea97ddcd5aa5eeaf122bc6c507f00553fc"} Mar 09 13:20:10 crc kubenswrapper[4723]: I0309 13:20:10.892160 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e328a06-9569-40c1-aef2-48e3659f74bf" path="/var/lib/kubelet/pods/6e328a06-9569-40c1-aef2-48e3659f74bf/volumes" Mar 09 13:20:11 crc kubenswrapper[4723]: I0309 13:20:11.279358 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"1b60ab4632a95d45f94fa754e5bcd30eba702e3aa37e1209b304a9f6a2d0b440"} Mar 09 13:20:11 crc kubenswrapper[4723]: I0309 13:20:11.279400 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"65137fc00de3a61e7713d4cb4b2bccd248662c7eeaa1ea166b38e36dafb29205"} Mar 09 13:20:11 crc kubenswrapper[4723]: I0309 13:20:11.279413 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"d46296bba8182c94d321ef8e7ef661e61f4d525492f08c2d0244d467f8f468c3"} Mar 09 13:20:13 crc kubenswrapper[4723]: I0309 13:20:13.302015 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"0ff1c3d3091e96e919fbedb886f605299101684402822c01d929d4806c7a54b7"} Mar 09 13:20:13 crc kubenswrapper[4723]: I0309 13:20:13.302601 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"5c53f3d188973d2935e6df47d8f481c4942d2c6c495025e78582648ac56cf7ff"} Mar 09 13:20:13 crc kubenswrapper[4723]: I0309 13:20:13.302616 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"d19ea6b87eb5f91085fd99c4c749a09fff5732e36a51e50b0136fb69402be831"} Mar 09 13:20:13 crc kubenswrapper[4723]: I0309 13:20:13.302626 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"6c8d9c0f11faecd473a653e0d70b0d81b6082683390bd43aaa270b5ce549c0f4"} Mar 09 13:20:17 crc kubenswrapper[4723]: I0309 13:20:17.862572 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:17 crc kubenswrapper[4723]: I0309 13:20:17.871918 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:18 crc kubenswrapper[4723]: I0309 13:20:18.347151 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:19 crc kubenswrapper[4723]: I0309 13:20:19.308973 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Mar 09 13:20:19 crc kubenswrapper[4723]: I0309 13:20:19.477305 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 09 13:20:19 crc kubenswrapper[4723]: I0309 13:20:19.922333 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-cwx46"] Mar 09 13:20:19 crc kubenswrapper[4723]: E0309 13:20:19.928635 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6aa2e5-8424-44e1-a9e0-247a8ff42676" containerName="mariadb-account-create-update" Mar 09 13:20:19 crc kubenswrapper[4723]: I0309 13:20:19.928902 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6aa2e5-8424-44e1-a9e0-247a8ff42676" containerName="mariadb-account-create-update" Mar 09 13:20:19 crc kubenswrapper[4723]: E0309 13:20:19.929042 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8cfb1f4-2f08-4850-a398-679a25dacc26" containerName="oc" Mar 09 13:20:19 crc kubenswrapper[4723]: I0309 13:20:19.929128 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8cfb1f4-2f08-4850-a398-679a25dacc26" containerName="oc" Mar 09 13:20:19 crc kubenswrapper[4723]: I0309 13:20:19.929412 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f6aa2e5-8424-44e1-a9e0-247a8ff42676" containerName="mariadb-account-create-update" Mar 09 13:20:19 crc kubenswrapper[4723]: I0309 13:20:19.929540 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8cfb1f4-2f08-4850-a398-679a25dacc26" containerName="oc" Mar 09 13:20:19 crc kubenswrapper[4723]: I0309 13:20:19.930547 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-cwx46" Mar 09 13:20:19 crc kubenswrapper[4723]: I0309 13:20:19.948668 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cwx46"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.009309 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9feddb28-c165-4784-94a8-4d63209fda46-operator-scripts\") pod \"cinder-db-create-cwx46\" (UID: \"9feddb28-c165-4784-94a8-4d63209fda46\") " pod="openstack/cinder-db-create-cwx46" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.009370 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-457tb\" (UniqueName: \"kubernetes.io/projected/9feddb28-c165-4784-94a8-4d63209fda46-kube-api-access-457tb\") pod \"cinder-db-create-cwx46\" (UID: \"9feddb28-c165-4784-94a8-4d63209fda46\") " pod="openstack/cinder-db-create-cwx46" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.111522 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9feddb28-c165-4784-94a8-4d63209fda46-operator-scripts\") pod \"cinder-db-create-cwx46\" (UID: \"9feddb28-c165-4784-94a8-4d63209fda46\") " pod="openstack/cinder-db-create-cwx46" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.111568 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-457tb\" (UniqueName: \"kubernetes.io/projected/9feddb28-c165-4784-94a8-4d63209fda46-kube-api-access-457tb\") pod \"cinder-db-create-cwx46\" (UID: \"9feddb28-c165-4784-94a8-4d63209fda46\") " pod="openstack/cinder-db-create-cwx46" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.112263 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9feddb28-c165-4784-94a8-4d63209fda46-operator-scripts\") pod \"cinder-db-create-cwx46\" (UID: \"9feddb28-c165-4784-94a8-4d63209fda46\") " pod="openstack/cinder-db-create-cwx46" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.132896 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d2b1-account-create-update-6966g"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.134560 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d2b1-account-create-update-6966g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.142296 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.145086 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d2b1-account-create-update-6966g"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.180632 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-457tb\" (UniqueName: \"kubernetes.io/projected/9feddb28-c165-4784-94a8-4d63209fda46-kube-api-access-457tb\") pod \"cinder-db-create-cwx46\" (UID: \"9feddb28-c165-4784-94a8-4d63209fda46\") " pod="openstack/cinder-db-create-cwx46" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.213579 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxb8l\" (UniqueName: \"kubernetes.io/projected/c72c36c7-750a-4bc7-ac34-c9d42896cdd6-kube-api-access-gxb8l\") pod \"cinder-d2b1-account-create-update-6966g\" (UID: \"c72c36c7-750a-4bc7-ac34-c9d42896cdd6\") " pod="openstack/cinder-d2b1-account-create-update-6966g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.213852 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72c36c7-750a-4bc7-ac34-c9d42896cdd6-operator-scripts\") pod \"cinder-d2b1-account-create-update-6966g\" (UID: \"c72c36c7-750a-4bc7-ac34-c9d42896cdd6\") " pod="openstack/cinder-d2b1-account-create-update-6966g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.233584 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-b554g"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.235236 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-b554g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.254673 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cwx46" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.256985 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-6f11-account-create-update-gprhw"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.258764 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-6f11-account-create-update-gprhw" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.266814 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.300667 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-6f11-account-create-update-gprhw"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.337264 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-b554g"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.345972 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxb8l\" (UniqueName: \"kubernetes.io/projected/c72c36c7-750a-4bc7-ac34-c9d42896cdd6-kube-api-access-gxb8l\") pod \"cinder-d2b1-account-create-update-6966g\" (UID: \"c72c36c7-750a-4bc7-ac34-c9d42896cdd6\") " pod="openstack/cinder-d2b1-account-create-update-6966g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.346030 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s2g7\" (UniqueName: \"kubernetes.io/projected/c1df2ad2-feac-487c-ab26-e885457d7979-kube-api-access-6s2g7\") pod \"heat-db-create-b554g\" (UID: \"c1df2ad2-feac-487c-ab26-e885457d7979\") " pod="openstack/heat-db-create-b554g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.346240 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72c36c7-750a-4bc7-ac34-c9d42896cdd6-operator-scripts\") pod \"cinder-d2b1-account-create-update-6966g\" (UID: \"c72c36c7-750a-4bc7-ac34-c9d42896cdd6\") " pod="openstack/cinder-d2b1-account-create-update-6966g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.346434 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nncxb\" (UniqueName: \"kubernetes.io/projected/1b441fb5-f89b-4ec1-8399-b3f56fdf139c-kube-api-access-nncxb\") pod \"heat-6f11-account-create-update-gprhw\" (UID: \"1b441fb5-f89b-4ec1-8399-b3f56fdf139c\") " pod="openstack/heat-6f11-account-create-update-gprhw" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.346517 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1df2ad2-feac-487c-ab26-e885457d7979-operator-scripts\") pod \"heat-db-create-b554g\" (UID: \"c1df2ad2-feac-487c-ab26-e885457d7979\") " pod="openstack/heat-db-create-b554g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.346596 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b441fb5-f89b-4ec1-8399-b3f56fdf139c-operator-scripts\") pod \"heat-6f11-account-create-update-gprhw\" (UID: \"1b441fb5-f89b-4ec1-8399-b3f56fdf139c\") " pod="openstack/heat-6f11-account-create-update-gprhw" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.347621 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72c36c7-750a-4bc7-ac34-c9d42896cdd6-operator-scripts\") pod \"cinder-d2b1-account-create-update-6966g\" (UID: \"c72c36c7-750a-4bc7-ac34-c9d42896cdd6\") " pod="openstack/cinder-d2b1-account-create-update-6966g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.372912 4723 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-88pwc"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.374456 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-88pwc" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.410141 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxb8l\" (UniqueName: \"kubernetes.io/projected/c72c36c7-750a-4bc7-ac34-c9d42896cdd6-kube-api-access-gxb8l\") pod \"cinder-d2b1-account-create-update-6966g\" (UID: \"c72c36c7-750a-4bc7-ac34-c9d42896cdd6\") " pod="openstack/cinder-d2b1-account-create-update-6966g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.433970 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-88pwc"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.450097 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s2g7\" (UniqueName: \"kubernetes.io/projected/c1df2ad2-feac-487c-ab26-e885457d7979-kube-api-access-6s2g7\") pod \"heat-db-create-b554g\" (UID: \"c1df2ad2-feac-487c-ab26-e885457d7979\") " pod="openstack/heat-db-create-b554g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.450247 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nncxb\" (UniqueName: \"kubernetes.io/projected/1b441fb5-f89b-4ec1-8399-b3f56fdf139c-kube-api-access-nncxb\") pod \"heat-6f11-account-create-update-gprhw\" (UID: \"1b441fb5-f89b-4ec1-8399-b3f56fdf139c\") " pod="openstack/heat-6f11-account-create-update-gprhw" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.450291 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1df2ad2-feac-487c-ab26-e885457d7979-operator-scripts\") pod \"heat-db-create-b554g\" (UID: \"c1df2ad2-feac-487c-ab26-e885457d7979\") " pod="openstack/heat-db-create-b554g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.450323 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d9kc\" (UniqueName: \"kubernetes.io/projected/0c9c338c-01c8-428b-89cb-4c4a59505595-kube-api-access-7d9kc\") pod \"barbican-db-create-88pwc\" (UID: \"0c9c338c-01c8-428b-89cb-4c4a59505595\") " pod="openstack/barbican-db-create-88pwc" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.450346 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9c338c-01c8-428b-89cb-4c4a59505595-operator-scripts\") pod \"barbican-db-create-88pwc\" (UID: \"0c9c338c-01c8-428b-89cb-4c4a59505595\") " pod="openstack/barbican-db-create-88pwc" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.450377 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b441fb5-f89b-4ec1-8399-b3f56fdf139c-operator-scripts\") pod \"heat-6f11-account-create-update-gprhw\" (UID: \"1b441fb5-f89b-4ec1-8399-b3f56fdf139c\") " pod="openstack/heat-6f11-account-create-update-gprhw" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.450538 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d2b1-account-create-update-6966g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.451579 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1df2ad2-feac-487c-ab26-e885457d7979-operator-scripts\") pod \"heat-db-create-b554g\" (UID: \"c1df2ad2-feac-487c-ab26-e885457d7979\") " pod="openstack/heat-db-create-b554g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.454479 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b441fb5-f89b-4ec1-8399-b3f56fdf139c-operator-scripts\") pod \"heat-6f11-account-create-update-gprhw\" (UID: \"1b441fb5-f89b-4ec1-8399-b3f56fdf139c\") " pod="openstack/heat-6f11-account-create-update-gprhw" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.457622 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-x7vz5"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.459414 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x7vz5" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.462302 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.462730 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.462852 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jxn5s" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.462983 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.479037 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s2g7\" (UniqueName: \"kubernetes.io/projected/c1df2ad2-feac-487c-ab26-e885457d7979-kube-api-access-6s2g7\") pod \"heat-db-create-b554g\" (UID: \"c1df2ad2-feac-487c-ab26-e885457d7979\") " pod="openstack/heat-db-create-b554g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.495687 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nncxb\" (UniqueName: \"kubernetes.io/projected/1b441fb5-f89b-4ec1-8399-b3f56fdf139c-kube-api-access-nncxb\") pod \"heat-6f11-account-create-update-gprhw\" (UID: \"1b441fb5-f89b-4ec1-8399-b3f56fdf139c\") " pod="openstack/heat-6f11-account-create-update-gprhw" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.508913 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-x7vz5"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.526233 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-sbt7t"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.527889 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-sbt7t" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.536842 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-sbt7t"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.554230 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15541e12-c0a2-4c26-b912-d33be48eea77-combined-ca-bundle\") pod \"keystone-db-sync-x7vz5\" (UID: \"15541e12-c0a2-4c26-b912-d33be48eea77\") " pod="openstack/keystone-db-sync-x7vz5" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.554304 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15541e12-c0a2-4c26-b912-d33be48eea77-config-data\") pod \"keystone-db-sync-x7vz5\" (UID: \"15541e12-c0a2-4c26-b912-d33be48eea77\") " pod="openstack/keystone-db-sync-x7vz5" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.554464 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d9kc\" (UniqueName: \"kubernetes.io/projected/0c9c338c-01c8-428b-89cb-4c4a59505595-kube-api-access-7d9kc\") pod \"barbican-db-create-88pwc\" (UID: \"0c9c338c-01c8-428b-89cb-4c4a59505595\") " pod="openstack/barbican-db-create-88pwc" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.554519 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9c338c-01c8-428b-89cb-4c4a59505595-operator-scripts\") pod \"barbican-db-create-88pwc\" (UID: \"0c9c338c-01c8-428b-89cb-4c4a59505595\") " pod="openstack/barbican-db-create-88pwc" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.554579 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd6bp\" (UniqueName: \"kubernetes.io/projected/15541e12-c0a2-4c26-b912-d33be48eea77-kube-api-access-cd6bp\") pod \"keystone-db-sync-x7vz5\" (UID: \"15541e12-c0a2-4c26-b912-d33be48eea77\") " pod="openstack/keystone-db-sync-x7vz5" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.555671 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9c338c-01c8-428b-89cb-4c4a59505595-operator-scripts\") pod \"barbican-db-create-88pwc\" (UID: \"0c9c338c-01c8-428b-89cb-4c4a59505595\") " pod="openstack/barbican-db-create-88pwc" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.561958 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-b554g" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.586700 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-af49-account-create-update-bwsv7"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.588445 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-af49-account-create-update-bwsv7" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.591712 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d9kc\" (UniqueName: \"kubernetes.io/projected/0c9c338c-01c8-428b-89cb-4c4a59505595-kube-api-access-7d9kc\") pod \"barbican-db-create-88pwc\" (UID: \"0c9c338c-01c8-428b-89cb-4c4a59505595\") " pod="openstack/barbican-db-create-88pwc" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.596376 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-af49-account-create-update-bwsv7"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.603260 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.613934 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-6f11-account-create-update-gprhw" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.658773 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-32ad-account-create-update-8p5t9"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.661592 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-32ad-account-create-update-8p5t9" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.671314 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83082054-f3da-4455-b4af-5232e439042c-operator-scripts\") pod \"barbican-af49-account-create-update-bwsv7\" (UID: \"83082054-f3da-4455-b4af-5232e439042c\") " pod="openstack/barbican-af49-account-create-update-bwsv7" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.671414 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd6bp\" (UniqueName: \"kubernetes.io/projected/15541e12-c0a2-4c26-b912-d33be48eea77-kube-api-access-cd6bp\") pod \"keystone-db-sync-x7vz5\" (UID: \"15541e12-c0a2-4c26-b912-d33be48eea77\") " pod="openstack/keystone-db-sync-x7vz5" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.671523 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shfsf\" (UniqueName: \"kubernetes.io/projected/8f9ae762-5d7d-4d41-9477-b4cc72689803-kube-api-access-shfsf\") pod \"neutron-db-create-sbt7t\" (UID: \"8f9ae762-5d7d-4d41-9477-b4cc72689803\") " pod="openstack/neutron-db-create-sbt7t" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.671640 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15541e12-c0a2-4c26-b912-d33be48eea77-combined-ca-bundle\") pod \"keystone-db-sync-x7vz5\" (UID: \"15541e12-c0a2-4c26-b912-d33be48eea77\") " pod="openstack/keystone-db-sync-x7vz5" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.671663 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7pq4\" (UniqueName: \"kubernetes.io/projected/83082054-f3da-4455-b4af-5232e439042c-kube-api-access-h7pq4\") pod \"barbican-af49-account-create-update-bwsv7\" (UID: \"83082054-f3da-4455-b4af-5232e439042c\") " pod="openstack/barbican-af49-account-create-update-bwsv7" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.671715 4723 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15541e12-c0a2-4c26-b912-d33be48eea77-config-data\") pod \"keystone-db-sync-x7vz5\" (UID: \"15541e12-c0a2-4c26-b912-d33be48eea77\") " pod="openstack/keystone-db-sync-x7vz5" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.671827 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f9ae762-5d7d-4d41-9477-b4cc72689803-operator-scripts\") pod \"neutron-db-create-sbt7t\" (UID: \"8f9ae762-5d7d-4d41-9477-b4cc72689803\") " pod="openstack/neutron-db-create-sbt7t" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.672218 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.678595 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15541e12-c0a2-4c26-b912-d33be48eea77-config-data\") pod \"keystone-db-sync-x7vz5\" (UID: \"15541e12-c0a2-4c26-b912-d33be48eea77\") " pod="openstack/keystone-db-sync-x7vz5" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.690022 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15541e12-c0a2-4c26-b912-d33be48eea77-combined-ca-bundle\") pod \"keystone-db-sync-x7vz5\" (UID: \"15541e12-c0a2-4c26-b912-d33be48eea77\") " pod="openstack/keystone-db-sync-x7vz5" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.713171 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd6bp\" (UniqueName: \"kubernetes.io/projected/15541e12-c0a2-4c26-b912-d33be48eea77-kube-api-access-cd6bp\") pod \"keystone-db-sync-x7vz5\" (UID: \"15541e12-c0a2-4c26-b912-d33be48eea77\") " pod="openstack/keystone-db-sync-x7vz5" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.739196 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-32ad-account-create-update-8p5t9"] Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.764311 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-88pwc" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.776249 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83082054-f3da-4455-b4af-5232e439042c-operator-scripts\") pod \"barbican-af49-account-create-update-bwsv7\" (UID: \"83082054-f3da-4455-b4af-5232e439042c\") " pod="openstack/barbican-af49-account-create-update-bwsv7" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.776354 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb56749-bb8b-4620-a1a9-8e1f2a70f1b2-operator-scripts\") pod \"neutron-32ad-account-create-update-8p5t9\" (UID: \"edb56749-bb8b-4620-a1a9-8e1f2a70f1b2\") " pod="openstack/neutron-32ad-account-create-update-8p5t9" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.776400 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrflm\" (UniqueName: \"kubernetes.io/projected/edb56749-bb8b-4620-a1a9-8e1f2a70f1b2-kube-api-access-jrflm\") pod \"neutron-32ad-account-create-update-8p5t9\" (UID: \"edb56749-bb8b-4620-a1a9-8e1f2a70f1b2\") " pod="openstack/neutron-32ad-account-create-update-8p5t9" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.776434 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shfsf\" (UniqueName: \"kubernetes.io/projected/8f9ae762-5d7d-4d41-9477-b4cc72689803-kube-api-access-shfsf\") pod \"neutron-db-create-sbt7t\" (UID: \"8f9ae762-5d7d-4d41-9477-b4cc72689803\") " pod="openstack/neutron-db-create-sbt7t" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.776677 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7pq4\" (UniqueName: \"kubernetes.io/projected/83082054-f3da-4455-b4af-5232e439042c-kube-api-access-h7pq4\") pod \"barbican-af49-account-create-update-bwsv7\" (UID: \"83082054-f3da-4455-b4af-5232e439042c\") " pod="openstack/barbican-af49-account-create-update-bwsv7" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.777101 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83082054-f3da-4455-b4af-5232e439042c-operator-scripts\") pod \"barbican-af49-account-create-update-bwsv7\" (UID: \"83082054-f3da-4455-b4af-5232e439042c\") " pod="openstack/barbican-af49-account-create-update-bwsv7" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.777414 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f9ae762-5d7d-4d41-9477-b4cc72689803-operator-scripts\") pod \"neutron-db-create-sbt7t\" (UID: \"8f9ae762-5d7d-4d41-9477-b4cc72689803\") " pod="openstack/neutron-db-create-sbt7t" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.778047 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f9ae762-5d7d-4d41-9477-b4cc72689803-operator-scripts\") pod \"neutron-db-create-sbt7t\" (UID: \"8f9ae762-5d7d-4d41-9477-b4cc72689803\") " pod="openstack/neutron-db-create-sbt7t" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.793544 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7pq4\" (UniqueName: 
\"kubernetes.io/projected/83082054-f3da-4455-b4af-5232e439042c-kube-api-access-h7pq4\") pod \"barbican-af49-account-create-update-bwsv7\" (UID: \"83082054-f3da-4455-b4af-5232e439042c\") " pod="openstack/barbican-af49-account-create-update-bwsv7" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.797893 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfsf\" (UniqueName: \"kubernetes.io/projected/8f9ae762-5d7d-4d41-9477-b4cc72689803-kube-api-access-shfsf\") pod \"neutron-db-create-sbt7t\" (UID: \"8f9ae762-5d7d-4d41-9477-b4cc72689803\") " pod="openstack/neutron-db-create-sbt7t" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.880588 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb56749-bb8b-4620-a1a9-8e1f2a70f1b2-operator-scripts\") pod \"neutron-32ad-account-create-update-8p5t9\" (UID: \"edb56749-bb8b-4620-a1a9-8e1f2a70f1b2\") " pod="openstack/neutron-32ad-account-create-update-8p5t9" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.880675 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrflm\" (UniqueName: \"kubernetes.io/projected/edb56749-bb8b-4620-a1a9-8e1f2a70f1b2-kube-api-access-jrflm\") pod \"neutron-32ad-account-create-update-8p5t9\" (UID: \"edb56749-bb8b-4620-a1a9-8e1f2a70f1b2\") " pod="openstack/neutron-32ad-account-create-update-8p5t9" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.881953 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb56749-bb8b-4620-a1a9-8e1f2a70f1b2-operator-scripts\") pod \"neutron-32ad-account-create-update-8p5t9\" (UID: \"edb56749-bb8b-4620-a1a9-8e1f2a70f1b2\") " pod="openstack/neutron-32ad-account-create-update-8p5t9" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.899465 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrflm\" (UniqueName: \"kubernetes.io/projected/edb56749-bb8b-4620-a1a9-8e1f2a70f1b2-kube-api-access-jrflm\") pod \"neutron-32ad-account-create-update-8p5t9\" (UID: \"edb56749-bb8b-4620-a1a9-8e1f2a70f1b2\") " pod="openstack/neutron-32ad-account-create-update-8p5t9" Mar 09 13:20:20 crc kubenswrapper[4723]: I0309 13:20:20.983616 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x7vz5" Mar 09 13:20:21 crc kubenswrapper[4723]: I0309 13:20:21.015889 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sbt7t" Mar 09 13:20:21 crc kubenswrapper[4723]: I0309 13:20:21.025623 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-af49-account-create-update-bwsv7" Mar 09 13:20:21 crc kubenswrapper[4723]: I0309 13:20:21.055649 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-32ad-account-create-update-8p5t9" Mar 09 13:20:21 crc kubenswrapper[4723]: I0309 13:20:21.353116 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 13:20:21 crc kubenswrapper[4723]: I0309 13:20:21.353655 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="prometheus" containerID="cri-o://fec818ed303f7ff19de3d6732cdf76158698dff52fb322f95cc2d95ed5fc3dac" gracePeriod=600 Mar 09 13:20:21 crc kubenswrapper[4723]: I0309 13:20:21.353786 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="thanos-sidecar" containerID="cri-o://9d64a1f732a34f9be6cb568cf2f4993cfdb876b19a6ea084a842261ffb1bcfcc" gracePeriod=600 Mar 09 13:20:21 crc kubenswrapper[4723]: I0309 13:20:21.353786 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="config-reloader" containerID="cri-o://93be7c2aa47e3f408bbcd685aae67aa4d15f16e5d60ce6cb28fa72a13e03a121" gracePeriod=600 Mar 09 13:20:22 crc kubenswrapper[4723]: I0309 13:20:22.430964 4723 generic.go:334] "Generic (PLEG): container finished" podID="4a548a9c-33c8-4a35-a559-7290357170c1" containerID="9d64a1f732a34f9be6cb568cf2f4993cfdb876b19a6ea084a842261ffb1bcfcc" exitCode=0 Mar 09 13:20:22 crc kubenswrapper[4723]: I0309 13:20:22.433047 4723 generic.go:334] "Generic (PLEG): container finished" podID="4a548a9c-33c8-4a35-a559-7290357170c1" containerID="93be7c2aa47e3f408bbcd685aae67aa4d15f16e5d60ce6cb28fa72a13e03a121" exitCode=0 Mar 09 13:20:22 crc kubenswrapper[4723]: I0309 13:20:22.433148 4723 generic.go:334] "Generic (PLEG): container finished" podID="4a548a9c-33c8-4a35-a559-7290357170c1" containerID="fec818ed303f7ff19de3d6732cdf76158698dff52fb322f95cc2d95ed5fc3dac" exitCode=0 Mar 09 13:20:22 crc kubenswrapper[4723]: I0309 13:20:22.431191 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a548a9c-33c8-4a35-a559-7290357170c1","Type":"ContainerDied","Data":"9d64a1f732a34f9be6cb568cf2f4993cfdb876b19a6ea084a842261ffb1bcfcc"} Mar 09 13:20:22 crc kubenswrapper[4723]: I0309 13:20:22.433293 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a548a9c-33c8-4a35-a559-7290357170c1","Type":"ContainerDied","Data":"93be7c2aa47e3f408bbcd685aae67aa4d15f16e5d60ce6cb28fa72a13e03a121"} Mar 09 13:20:22 crc kubenswrapper[4723]: I0309 13:20:22.433323 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a548a9c-33c8-4a35-a559-7290357170c1","Type":"ContainerDied","Data":"fec818ed303f7ff19de3d6732cdf76158698dff52fb322f95cc2d95ed5fc3dac"} Mar 09 13:20:22 crc kubenswrapper[4723]: I0309 13:20:22.862389 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.151:9090/-/ready\": dial tcp 10.217.0.151:9090: connect: connection refused" Mar 09 13:20:25 crc kubenswrapper[4723]: E0309 13:20:25.518738 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Mar 09 13:20:25 crc kubenswrapper[4723]: E0309 13:20:25.519586 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5hm9c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-b5gc4_openstack(72dc18cb-be01-4378-b62c-609a2c237731): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:20:25 crc kubenswrapper[4723]: E0309 13:20:25.520883 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-b5gc4" podUID="72dc18cb-be01-4378-b62c-609a2c237731" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.041726 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.125960 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-2\") pod \"4a548a9c-33c8-4a35-a559-7290357170c1\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.126138 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwx7t\" (UniqueName: \"kubernetes.io/projected/4a548a9c-33c8-4a35-a559-7290357170c1-kube-api-access-lwx7t\") pod \"4a548a9c-33c8-4a35-a559-7290357170c1\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.126259 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-0\") pod \"4a548a9c-33c8-4a35-a559-7290357170c1\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.126309 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-config\") pod \"4a548a9c-33c8-4a35-a559-7290357170c1\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.126331 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-web-config\") pod \"4a548a9c-33c8-4a35-a559-7290357170c1\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.126404 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4a548a9c-33c8-4a35-a559-7290357170c1-config-out\") pod \"4a548a9c-33c8-4a35-a559-7290357170c1\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.126583 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") pod \"4a548a9c-33c8-4a35-a559-7290357170c1\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.126624 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-thanos-prometheus-http-client-file\") pod \"4a548a9c-33c8-4a35-a559-7290357170c1\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.126647 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4a548a9c-33c8-4a35-a559-7290357170c1-tls-assets\") pod \"4a548a9c-33c8-4a35-a559-7290357170c1\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.126701 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-1\") pod \"4a548a9c-33c8-4a35-a559-7290357170c1\" (UID: \"4a548a9c-33c8-4a35-a559-7290357170c1\") " Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.128397 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "4a548a9c-33c8-4a35-a559-7290357170c1" (UID: "4a548a9c-33c8-4a35-a559-7290357170c1"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.132165 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "4a548a9c-33c8-4a35-a559-7290357170c1" (UID: "4a548a9c-33c8-4a35-a559-7290357170c1"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.132602 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "4a548a9c-33c8-4a35-a559-7290357170c1" (UID: "4a548a9c-33c8-4a35-a559-7290357170c1"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.137957 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a548a9c-33c8-4a35-a559-7290357170c1-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4a548a9c-33c8-4a35-a559-7290357170c1" (UID: "4a548a9c-33c8-4a35-a559-7290357170c1"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.138312 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a548a9c-33c8-4a35-a559-7290357170c1-kube-api-access-lwx7t" (OuterVolumeSpecName: "kube-api-access-lwx7t") pod "4a548a9c-33c8-4a35-a559-7290357170c1" (UID: "4a548a9c-33c8-4a35-a559-7290357170c1"). InnerVolumeSpecName "kube-api-access-lwx7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.139930 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a548a9c-33c8-4a35-a559-7290357170c1-config-out" (OuterVolumeSpecName: "config-out") pod "4a548a9c-33c8-4a35-a559-7290357170c1" (UID: "4a548a9c-33c8-4a35-a559-7290357170c1"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.141154 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "4a548a9c-33c8-4a35-a559-7290357170c1" (UID: "4a548a9c-33c8-4a35-a559-7290357170c1"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.141781 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-config" (OuterVolumeSpecName: "config") pod "4a548a9c-33c8-4a35-a559-7290357170c1" (UID: "4a548a9c-33c8-4a35-a559-7290357170c1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.167978 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-web-config" (OuterVolumeSpecName: "web-config") pod "4a548a9c-33c8-4a35-a559-7290357170c1" (UID: "4a548a9c-33c8-4a35-a559-7290357170c1"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.169569 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "4a548a9c-33c8-4a35-a559-7290357170c1" (UID: "4a548a9c-33c8-4a35-a559-7290357170c1"). InnerVolumeSpecName "pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.229167 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.229309 4723 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-web-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.229321 4723 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4a548a9c-33c8-4a35-a559-7290357170c1-config-out\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.229364 4723 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") on node \"crc\" " Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.229380 4723 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4a548a9c-33c8-4a35-a559-7290357170c1-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.229389 4723 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4a548a9c-33c8-4a35-a559-7290357170c1-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.229398 4723 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.229407 4723 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.229416 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwx7t\" (UniqueName: \"kubernetes.io/projected/4a548a9c-33c8-4a35-a559-7290357170c1-kube-api-access-lwx7t\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.229425 4723 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4a548a9c-33c8-4a35-a559-7290357170c1-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.244486 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d2b1-account-create-update-6966g"] Mar 09 13:20:26 crc kubenswrapper[4723]: W0309 13:20:26.256706 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc72c36c7_750a_4bc7_ac34_c9d42896cdd6.slice/crio-f7206d813d65bc248c62ba4f9b29259a64bf6bdbac38b1d8c2a2e1ae90f47535 WatchSource:0}: Error finding container f7206d813d65bc248c62ba4f9b29259a64bf6bdbac38b1d8c2a2e1ae90f47535: Status 404 returned error can't find the container with id f7206d813d65bc248c62ba4f9b29259a64bf6bdbac38b1d8c2a2e1ae90f47535 Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.264087 4723 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.264265 4723 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4") on node "crc" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.332399 4723 reconciler_common.go:293] "Volume detached for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.480530 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"fc1516e3cadad6e2b2de4aea38e728a87d9308f0e9285abae5fd1d7ad5eb140a"} Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.480577 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"39bcee2588e83e5921e28c35608679ebe1332724b853610e6afddc6319883c2e"} Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.483200 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a548a9c-33c8-4a35-a559-7290357170c1","Type":"ContainerDied","Data":"8b1742da4b1a6feb25bafe009951e7d4d7966a9d5403f08e561f67cea93ae41d"} Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.483242 4723 scope.go:117] "RemoveContainer" containerID="9d64a1f732a34f9be6cb568cf2f4993cfdb876b19a6ea084a842261ffb1bcfcc" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.483365 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.495833 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d2b1-account-create-update-6966g" event={"ID":"c72c36c7-750a-4bc7-ac34-c9d42896cdd6","Type":"ContainerStarted","Data":"a12f01e18860b06c7461525078132a6cb03eb0b25e0f6d6ecc433d5701fb5e7c"} Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.496100 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d2b1-account-create-update-6966g" event={"ID":"c72c36c7-750a-4bc7-ac34-c9d42896cdd6","Type":"ContainerStarted","Data":"f7206d813d65bc248c62ba4f9b29259a64bf6bdbac38b1d8c2a2e1ae90f47535"} Mar 09 13:20:26 crc kubenswrapper[4723]: E0309 13:20:26.499395 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-b5gc4" podUID="72dc18cb-be01-4378-b62c-609a2c237731" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.552748 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-d2b1-account-create-update-6966g" podStartSLOduration=6.552729326 podStartE2EDuration="6.552729326s" podCreationTimestamp="2026-03-09 13:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:26.543393669 +0000 UTC m=+1300.557861209" watchObservedRunningTime="2026-03-09 13:20:26.552729326 +0000 UTC m=+1300.567196866" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.633681 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-88pwc"] Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.642545 4723 scope.go:117] "RemoveContainer" containerID="93be7c2aa47e3f408bbcd685aae67aa4d15f16e5d60ce6cb28fa72a13e03a121" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.649509 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-sbt7t"] Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.673833 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-6f11-account-create-update-gprhw"] Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.693940 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cwx46"] Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.705025 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-x7vz5"] Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.715880 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-af49-account-create-update-bwsv7"] Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.735670 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-b554g"] Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.760917 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-32ad-account-create-update-8p5t9"] Mar 09 13:20:26 crc kubenswrapper[4723]: W0309 13:20:26.772942 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c9c338c_01c8_428b_89cb_4c4a59505595.slice/crio-cefee832c14cde81af021b0d9f2af365ec637e01cb3f06b35ff6e8c830c6b899 WatchSource:0}: Error finding container 
cefee832c14cde81af021b0d9f2af365ec637e01cb3f06b35ff6e8c830c6b899: Status 404 returned error can't find the container with id cefee832c14cde81af021b0d9f2af365ec637e01cb3f06b35ff6e8c830c6b899 Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.909689 4723 scope.go:117] "RemoveContainer" containerID="fec818ed303f7ff19de3d6732cdf76158698dff52fb322f95cc2d95ed5fc3dac" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.915600 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.915636 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.925048 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 13:20:26 crc kubenswrapper[4723]: E0309 13:20:26.925747 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="config-reloader" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.925771 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="config-reloader" Mar 09 13:20:26 crc kubenswrapper[4723]: E0309 13:20:26.925795 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="init-config-reloader" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.925802 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="init-config-reloader" Mar 09 13:20:26 crc kubenswrapper[4723]: E0309 13:20:26.925812 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="thanos-sidecar" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.925818 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="thanos-sidecar" Mar 09 13:20:26 crc kubenswrapper[4723]: E0309 13:20:26.925877 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="prometheus" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.925883 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="prometheus" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.926096 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="prometheus" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.926115 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="config-reloader" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.926125 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" containerName="thanos-sidecar" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.928025 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.932730 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.935460 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.935481 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.935470 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-bkfhs" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.935757 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.935906 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.935936 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.936025 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.936142 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.940293 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.953104 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2ghb\" (UniqueName: \"kubernetes.io/projected/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-kube-api-access-n2ghb\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.953184 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.953255 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.953304 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " 
pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.953675 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.953743 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.953776 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.953881 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.953901 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.953922 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.953987 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.954015 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " 
pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:26 crc kubenswrapper[4723]: I0309 13:20:26.954058 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.020174 4723 scope.go:117] "RemoveContainer" containerID="9ec89d749e487ccf881f60ee3e3329f37cb8dee40b34411d0ffb3b10fd496acf" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.061033 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2ghb\" (UniqueName: \"kubernetes.io/projected/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-kube-api-access-n2ghb\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.061080 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.061106 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.061138 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.061197 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.061228 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.061247 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" 
Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.061274 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.061292 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.061310 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.061346 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.061369 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.061390 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.062156 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.068570 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.069175 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.074200 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.074573 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.075949 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.076152 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.091500 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.094561 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.109593 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.109643 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7a5bc2ca863004c00c102bc266d43328c7878cd25689db409330e8972fedad87/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.111607 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2ghb\" (UniqueName: \"kubernetes.io/projected/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-kube-api-access-n2ghb\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.112210 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.118144 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e47df78-6587-4f83-a1c9-dcaf0aa9b73c-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.257526 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81bbfcdd-8860-466d-9fd8-a0c71da7caf4\") pod \"prometheus-metric-storage-0\" (UID: \"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c\") " pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.361723 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.510042 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sbt7t" event={"ID":"8f9ae762-5d7d-4d41-9477-b4cc72689803","Type":"ContainerStarted","Data":"ee4ad67d87342969a5456a44427d04f8893dceeb20b494e50679710f34a6e58d"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.510085 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sbt7t" event={"ID":"8f9ae762-5d7d-4d41-9477-b4cc72689803","Type":"ContainerStarted","Data":"7dad028f8da8d94666f2c355ae3e0705c68f5070c2b7eb227d744f7504003a03"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.540452 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-sbt7t" podStartSLOduration=7.540433186 podStartE2EDuration="7.540433186s" podCreationTimestamp="2026-03-09 13:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:27.534252863 +0000 UTC m=+1301.548720403" watchObservedRunningTime="2026-03-09 13:20:27.540433186 +0000 UTC m=+1301.554900726" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.559304 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"f5cf04cd6e5fc856ae66d07de651a4049f130cb1a6d709fa99aa6ff1a7cdbf7d"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.573324 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-88pwc" event={"ID":"0c9c338c-01c8-428b-89cb-4c4a59505595","Type":"ContainerStarted","Data":"e4545be82e46e0052920e5b344bacbaf2bbe64b271359f6b8e82bb95b36fad61"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.573879 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-88pwc" event={"ID":"0c9c338c-01c8-428b-89cb-4c4a59505595","Type":"ContainerStarted","Data":"cefee832c14cde81af021b0d9f2af365ec637e01cb3f06b35ff6e8c830c6b899"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.592718 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-b554g" event={"ID":"c1df2ad2-feac-487c-ab26-e885457d7979","Type":"ContainerStarted","Data":"a0c8702b95ad8121cc345974734a1faa478de4334e93db13ffef34c0783aefa8"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.592764 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-b554g" event={"ID":"c1df2ad2-feac-487c-ab26-e885457d7979","Type":"ContainerStarted","Data":"2f6981c0fd769c0da239ba88e015468768067ca4ede72b7dcafd69330f0123ee"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.607934 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-88pwc" podStartSLOduration=7.607918142 podStartE2EDuration="7.607918142s" podCreationTimestamp="2026-03-09 13:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:27.605787756 +0000 UTC m=+1301.620255296" watchObservedRunningTime="2026-03-09 13:20:27.607918142 +0000 UTC m=+1301.622385682" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.614253 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-32ad-account-create-update-8p5t9" 
event={"ID":"edb56749-bb8b-4620-a1a9-8e1f2a70f1b2","Type":"ContainerStarted","Data":"3139003d5f6dffdd22dc5e6128e2bb67c3122b2c35b79ec1cea78fac44c55ddf"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.614300 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-32ad-account-create-update-8p5t9" event={"ID":"edb56749-bb8b-4620-a1a9-8e1f2a70f1b2","Type":"ContainerStarted","Data":"2bb4d2d4d38437dc2e42bdbb07e8a6bb3acad51864bb11b229b688b979c15ed7"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.629550 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x7vz5" event={"ID":"15541e12-c0a2-4c26-b912-d33be48eea77","Type":"ContainerStarted","Data":"f7bfa7b6b0cc3e213e8cf64be2833627e19264315a0ef180444ea44a49b352bd"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.649845 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-b554g" podStartSLOduration=7.649826041 podStartE2EDuration="7.649826041s" podCreationTimestamp="2026-03-09 13:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:27.628936129 +0000 UTC m=+1301.643403679" watchObservedRunningTime="2026-03-09 13:20:27.649826041 +0000 UTC m=+1301.664293581" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.662827 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-6f11-account-create-update-gprhw" event={"ID":"1b441fb5-f89b-4ec1-8399-b3f56fdf139c","Type":"ContainerStarted","Data":"b03fbb274ecab3bd38f7ab5714bcc4aa8bff5b700df1fa192cc40bf66926d6df"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.662879 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-6f11-account-create-update-gprhw" event={"ID":"1b441fb5-f89b-4ec1-8399-b3f56fdf139c","Type":"ContainerStarted","Data":"eae8c9c0182f7283dbce37d9f76c625aed257f3a6dd6a11804f0cf8e0ae4fa64"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.666267 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cwx46" event={"ID":"9feddb28-c165-4784-94a8-4d63209fda46","Type":"ContainerStarted","Data":"ef3be5a1a9d1774e23441c783b55cf716c31d9fd5560f1064109f14e419f2256"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.666301 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cwx46" event={"ID":"9feddb28-c165-4784-94a8-4d63209fda46","Type":"ContainerStarted","Data":"5ef4de3b98e22d5e9a7fd70147731a405c15cade4e58c9e69be6695b2b8f3512"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.668729 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-32ad-account-create-update-8p5t9" podStartSLOduration=7.668710941 podStartE2EDuration="7.668710941s" podCreationTimestamp="2026-03-09 13:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:27.653649643 +0000 UTC m=+1301.668117183" watchObservedRunningTime="2026-03-09 13:20:27.668710941 +0000 UTC m=+1301.683178481" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.696892 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-6f11-account-create-update-gprhw" podStartSLOduration=7.696870426 podStartE2EDuration="7.696870426s" podCreationTimestamp="2026-03-09 13:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:27.692123561 +0000 UTC m=+1301.706591101" watchObservedRunningTime="2026-03-09 13:20:27.696870426 +0000 UTC m=+1301.711337966" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.701946 4723 generic.go:334] "Generic (PLEG): container finished" podID="c72c36c7-750a-4bc7-ac34-c9d42896cdd6" containerID="a12f01e18860b06c7461525078132a6cb03eb0b25e0f6d6ecc433d5701fb5e7c" exitCode=0 Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.702024 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d2b1-account-create-update-6966g" event={"ID":"c72c36c7-750a-4bc7-ac34-c9d42896cdd6","Type":"ContainerDied","Data":"a12f01e18860b06c7461525078132a6cb03eb0b25e0f6d6ecc433d5701fb5e7c"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.704129 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-af49-account-create-update-bwsv7" event={"ID":"83082054-f3da-4455-b4af-5232e439042c","Type":"ContainerStarted","Data":"adfc19cbc4e9011491c67a96c124d2f006433fe1b8078e82dbdcbbc1e0596906"} Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.819946 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-cwx46" podStartSLOduration=8.819929102 podStartE2EDuration="8.819929102s" podCreationTimestamp="2026-03-09 13:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:27.73023755 +0000 UTC m=+1301.744705090" watchObservedRunningTime="2026-03-09 13:20:27.819929102 +0000 UTC m=+1301.834396642" Mar 09 13:20:27 crc kubenswrapper[4723]: I0309 13:20:27.820556 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-af49-account-create-update-bwsv7" podStartSLOduration=7.820550719 podStartE2EDuration="7.820550719s" podCreationTimestamp="2026-03-09 13:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:27.754414669 +0000 UTC m=+1301.768882209" watchObservedRunningTime="2026-03-09 13:20:27.820550719 +0000 UTC m=+1301.835018259" Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.215377 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 13:20:28 crc kubenswrapper[4723]: W0309 13:20:28.247635 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e47df78_6587_4f83_a1c9_dcaf0aa9b73c.slice/crio-6e06a2a6c728b6074cc3688f01241190fc94bc37b42090c324db6c7fa8b3ab7d WatchSource:0}: Error finding container 6e06a2a6c728b6074cc3688f01241190fc94bc37b42090c324db6c7fa8b3ab7d: Status 404 returned error can't find the container with id 6e06a2a6c728b6074cc3688f01241190fc94bc37b42090c324db6c7fa8b3ab7d Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.718576 4723 generic.go:334] "Generic (PLEG): container finished" podID="0c9c338c-01c8-428b-89cb-4c4a59505595" containerID="e4545be82e46e0052920e5b344bacbaf2bbe64b271359f6b8e82bb95b36fad61" exitCode=0 Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.718673 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-88pwc" event={"ID":"0c9c338c-01c8-428b-89cb-4c4a59505595","Type":"ContainerDied","Data":"e4545be82e46e0052920e5b344bacbaf2bbe64b271359f6b8e82bb95b36fad61"} Mar 09 
13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.721072 4723 generic.go:334] "Generic (PLEG): container finished" podID="c1df2ad2-feac-487c-ab26-e885457d7979" containerID="a0c8702b95ad8121cc345974734a1faa478de4334e93db13ffef34c0783aefa8" exitCode=0 Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.721162 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-b554g" event={"ID":"c1df2ad2-feac-487c-ab26-e885457d7979","Type":"ContainerDied","Data":"a0c8702b95ad8121cc345974734a1faa478de4334e93db13ffef34c0783aefa8"} Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.723853 4723 generic.go:334] "Generic (PLEG): container finished" podID="8f9ae762-5d7d-4d41-9477-b4cc72689803" containerID="ee4ad67d87342969a5456a44427d04f8893dceeb20b494e50679710f34a6e58d" exitCode=0 Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.723915 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sbt7t" event={"ID":"8f9ae762-5d7d-4d41-9477-b4cc72689803","Type":"ContainerDied","Data":"ee4ad67d87342969a5456a44427d04f8893dceeb20b494e50679710f34a6e58d"} Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.741296 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"8d9fec0f767d70ba1d6da0ec9547913768cb290c4fb3a9c8eb07cc3f897c6423"} Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.741348 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"205926013b2c6561972627a1dd8bb999207d2dadb5378c5a81acfce46f743107"} Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.744202 4723 generic.go:334] "Generic (PLEG): container finished" podID="1b441fb5-f89b-4ec1-8399-b3f56fdf139c" containerID="b03fbb274ecab3bd38f7ab5714bcc4aa8bff5b700df1fa192cc40bf66926d6df" exitCode=0 Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.744269 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-6f11-account-create-update-gprhw" event={"ID":"1b441fb5-f89b-4ec1-8399-b3f56fdf139c","Type":"ContainerDied","Data":"b03fbb274ecab3bd38f7ab5714bcc4aa8bff5b700df1fa192cc40bf66926d6df"} Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.746087 4723 generic.go:334] "Generic (PLEG): container finished" podID="9feddb28-c165-4784-94a8-4d63209fda46" containerID="ef3be5a1a9d1774e23441c783b55cf716c31d9fd5560f1064109f14e419f2256" exitCode=0 Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.746127 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cwx46" event={"ID":"9feddb28-c165-4784-94a8-4d63209fda46","Type":"ContainerDied","Data":"ef3be5a1a9d1774e23441c783b55cf716c31d9fd5560f1064109f14e419f2256"} Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.759148 4723 generic.go:334] "Generic (PLEG): container finished" podID="edb56749-bb8b-4620-a1a9-8e1f2a70f1b2" containerID="3139003d5f6dffdd22dc5e6128e2bb67c3122b2c35b79ec1cea78fac44c55ddf" exitCode=0 Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.759399 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-32ad-account-create-update-8p5t9" event={"ID":"edb56749-bb8b-4620-a1a9-8e1f2a70f1b2","Type":"ContainerDied","Data":"3139003d5f6dffdd22dc5e6128e2bb67c3122b2c35b79ec1cea78fac44c55ddf"} Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.766923 4723 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c","Type":"ContainerStarted","Data":"6e06a2a6c728b6074cc3688f01241190fc94bc37b42090c324db6c7fa8b3ab7d"} Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.771447 4723 generic.go:334] "Generic (PLEG): container finished" podID="83082054-f3da-4455-b4af-5232e439042c" containerID="05df4ab3aeef5227dd01ef0959d21e5fc3b094a1cce9906accd8d7502f74db94" exitCode=0 Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.771493 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-af49-account-create-update-bwsv7" event={"ID":"83082054-f3da-4455-b4af-5232e439042c","Type":"ContainerDied","Data":"05df4ab3aeef5227dd01ef0959d21e5fc3b094a1cce9906accd8d7502f74db94"} Mar 09 13:20:28 crc kubenswrapper[4723]: I0309 13:20:28.919771 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a548a9c-33c8-4a35-a559-7290357170c1" path="/var/lib/kubelet/pods/4a548a9c-33c8-4a35-a559-7290357170c1/volumes" Mar 09 13:20:29 crc kubenswrapper[4723]: I0309 13:20:29.123571 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d2b1-account-create-update-6966g" Mar 09 13:20:29 crc kubenswrapper[4723]: I0309 13:20:29.184330 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxb8l\" (UniqueName: \"kubernetes.io/projected/c72c36c7-750a-4bc7-ac34-c9d42896cdd6-kube-api-access-gxb8l\") pod \"c72c36c7-750a-4bc7-ac34-c9d42896cdd6\" (UID: \"c72c36c7-750a-4bc7-ac34-c9d42896cdd6\") " Mar 09 13:20:29 crc kubenswrapper[4723]: I0309 13:20:29.184833 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72c36c7-750a-4bc7-ac34-c9d42896cdd6-operator-scripts\") pod \"c72c36c7-750a-4bc7-ac34-c9d42896cdd6\" (UID: \"c72c36c7-750a-4bc7-ac34-c9d42896cdd6\") " Mar 09 13:20:29 crc kubenswrapper[4723]: I0309 13:20:29.185561 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72c36c7-750a-4bc7-ac34-c9d42896cdd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c72c36c7-750a-4bc7-ac34-c9d42896cdd6" (UID: "c72c36c7-750a-4bc7-ac34-c9d42896cdd6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:29 crc kubenswrapper[4723]: I0309 13:20:29.188490 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72c36c7-750a-4bc7-ac34-c9d42896cdd6-kube-api-access-gxb8l" (OuterVolumeSpecName: "kube-api-access-gxb8l") pod "c72c36c7-750a-4bc7-ac34-c9d42896cdd6" (UID: "c72c36c7-750a-4bc7-ac34-c9d42896cdd6"). InnerVolumeSpecName "kube-api-access-gxb8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:29 crc kubenswrapper[4723]: I0309 13:20:29.286844 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxb8l\" (UniqueName: \"kubernetes.io/projected/c72c36c7-750a-4bc7-ac34-c9d42896cdd6-kube-api-access-gxb8l\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:29 crc kubenswrapper[4723]: I0309 13:20:29.286908 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72c36c7-750a-4bc7-ac34-c9d42896cdd6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:29 crc kubenswrapper[4723]: I0309 13:20:29.783289 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d2b1-account-create-update-6966g" event={"ID":"c72c36c7-750a-4bc7-ac34-c9d42896cdd6","Type":"ContainerDied","Data":"f7206d813d65bc248c62ba4f9b29259a64bf6bdbac38b1d8c2a2e1ae90f47535"} Mar 09 13:20:29 crc kubenswrapper[4723]: I0309 13:20:29.783337 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7206d813d65bc248c62ba4f9b29259a64bf6bdbac38b1d8c2a2e1ae90f47535" Mar 09 13:20:29 crc kubenswrapper[4723]: I0309 13:20:29.783395 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d2b1-account-create-update-6966g" Mar 09 13:20:29 crc kubenswrapper[4723]: I0309 13:20:29.797603 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"1ece96c3e7176102361d77ab22a27584587765e0f734bb2703a583565fbe4bc5"} Mar 09 13:20:29 crc kubenswrapper[4723]: I0309 13:20:29.797636 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d924133b-d3c9-4b71-bbf4-a894a618e6c4","Type":"ContainerStarted","Data":"3dc77b8c416d542bb10957320b7e4cc82f5e522e2d679e360a422b7cecbf1cd6"} Mar 09 13:20:29 crc kubenswrapper[4723]: I0309 13:20:29.836312 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=38.650062418 podStartE2EDuration="55.836291707s" podCreationTimestamp="2026-03-09 13:19:34 +0000 UTC" firstStartedPulling="2026-03-09 13:20:08.390204998 +0000 UTC m=+1282.404672538" lastFinishedPulling="2026-03-09 13:20:25.576434287 +0000 UTC m=+1299.590901827" observedRunningTime="2026-03-09 13:20:29.833104273 +0000 UTC m=+1303.847571823" watchObservedRunningTime="2026-03-09 13:20:29.836291707 +0000 UTC m=+1303.850759257" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.158455 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-mbn5h"] Mar 09 13:20:30 crc kubenswrapper[4723]: E0309 13:20:30.159029 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72c36c7-750a-4bc7-ac34-c9d42896cdd6" containerName="mariadb-account-create-update" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.159054 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72c36c7-750a-4bc7-ac34-c9d42896cdd6" containerName="mariadb-account-create-update" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.159305 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72c36c7-750a-4bc7-ac34-c9d42896cdd6" containerName="mariadb-account-create-update" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.168828 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.205480 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.216956 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-mbn5h"] Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.313663 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.313757 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqm4x\" (UniqueName: \"kubernetes.io/projected/09188288-5f42-4c86-9f80-8d46d5544d93-kube-api-access-nqm4x\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.313780 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.313827 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.313940 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.313979 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-config\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.415678 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.416044 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-config\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " 
pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.416071 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.416161 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqm4x\" (UniqueName: \"kubernetes.io/projected/09188288-5f42-4c86-9f80-8d46d5544d93-kube-api-access-nqm4x\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.416211 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.416255 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.417139 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.417636 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.418135 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-config\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.418609 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.419384 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: 
I0309 13:20:30.443642 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqm4x\" (UniqueName: \"kubernetes.io/projected/09188288-5f42-4c86-9f80-8d46d5544d93-kube-api-access-nqm4x\") pod \"dnsmasq-dns-77585f5f8c-mbn5h\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:30 crc kubenswrapper[4723]: I0309 13:20:30.531173 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:31 crc kubenswrapper[4723]: I0309 13:20:31.819308 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c","Type":"ContainerStarted","Data":"0f20df8ea1077c74474549ecfc6a71e72e4c6b5ed40ae1563af70106456a0dc9"} Mar 09 13:20:32 crc kubenswrapper[4723]: I0309 13:20:32.833783 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cwx46" event={"ID":"9feddb28-c165-4784-94a8-4d63209fda46","Type":"ContainerDied","Data":"5ef4de3b98e22d5e9a7fd70147731a405c15cade4e58c9e69be6695b2b8f3512"} Mar 09 13:20:32 crc kubenswrapper[4723]: I0309 13:20:32.834112 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ef4de3b98e22d5e9a7fd70147731a405c15cade4e58c9e69be6695b2b8f3512" Mar 09 13:20:32 crc kubenswrapper[4723]: I0309 13:20:32.835819 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-b554g" event={"ID":"c1df2ad2-feac-487c-ab26-e885457d7979","Type":"ContainerDied","Data":"2f6981c0fd769c0da239ba88e015468768067ca4ede72b7dcafd69330f0123ee"} Mar 09 13:20:32 crc kubenswrapper[4723]: I0309 13:20:32.835882 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f6981c0fd769c0da239ba88e015468768067ca4ede72b7dcafd69330f0123ee" Mar 09 13:20:32 crc kubenswrapper[4723]: I0309 13:20:32.837789 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-32ad-account-create-update-8p5t9" event={"ID":"edb56749-bb8b-4620-a1a9-8e1f2a70f1b2","Type":"ContainerDied","Data":"2bb4d2d4d38437dc2e42bdbb07e8a6bb3acad51864bb11b229b688b979c15ed7"} Mar 09 13:20:32 crc kubenswrapper[4723]: I0309 13:20:32.837833 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bb4d2d4d38437dc2e42bdbb07e8a6bb3acad51864bb11b229b688b979c15ed7" Mar 09 13:20:32 crc kubenswrapper[4723]: I0309 13:20:32.840326 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sbt7t" event={"ID":"8f9ae762-5d7d-4d41-9477-b4cc72689803","Type":"ContainerDied","Data":"7dad028f8da8d94666f2c355ae3e0705c68f5070c2b7eb227d744f7504003a03"} Mar 09 13:20:32 crc kubenswrapper[4723]: I0309 13:20:32.840381 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dad028f8da8d94666f2c355ae3e0705c68f5070c2b7eb227d744f7504003a03" Mar 09 13:20:32 crc kubenswrapper[4723]: I0309 13:20:32.842447 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-af49-account-create-update-bwsv7" event={"ID":"83082054-f3da-4455-b4af-5232e439042c","Type":"ContainerDied","Data":"adfc19cbc4e9011491c67a96c124d2f006433fe1b8078e82dbdcbbc1e0596906"} Mar 09 13:20:32 crc kubenswrapper[4723]: I0309 13:20:32.842478 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adfc19cbc4e9011491c67a96c124d2f006433fe1b8078e82dbdcbbc1e0596906" Mar 09 13:20:32 crc kubenswrapper[4723]: I0309 
13:20:32.845095 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-6f11-account-create-update-gprhw" event={"ID":"1b441fb5-f89b-4ec1-8399-b3f56fdf139c","Type":"ContainerDied","Data":"eae8c9c0182f7283dbce37d9f76c625aed257f3a6dd6a11804f0cf8e0ae4fa64"} Mar 09 13:20:32 crc kubenswrapper[4723]: I0309 13:20:32.845148 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eae8c9c0182f7283dbce37d9f76c625aed257f3a6dd6a11804f0cf8e0ae4fa64" Mar 09 13:20:32 crc kubenswrapper[4723]: I0309 13:20:32.848095 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-88pwc" event={"ID":"0c9c338c-01c8-428b-89cb-4c4a59505595","Type":"ContainerDied","Data":"cefee832c14cde81af021b0d9f2af365ec637e01cb3f06b35ff6e8c830c6b899"} Mar 09 13:20:32 crc kubenswrapper[4723]: I0309 13:20:32.848132 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cefee832c14cde81af021b0d9f2af365ec637e01cb3f06b35ff6e8c830c6b899" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.010145 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-6f11-account-create-update-gprhw" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.031577 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-88pwc" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.040763 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-b554g" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.048596 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sbt7t" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.057491 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cwx46" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.063104 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-32ad-account-create-update-8p5t9" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.075300 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-af49-account-create-update-bwsv7" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.186823 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrflm\" (UniqueName: \"kubernetes.io/projected/edb56749-bb8b-4620-a1a9-8e1f2a70f1b2-kube-api-access-jrflm\") pod \"edb56749-bb8b-4620-a1a9-8e1f2a70f1b2\" (UID: \"edb56749-bb8b-4620-a1a9-8e1f2a70f1b2\") " Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.186899 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s2g7\" (UniqueName: \"kubernetes.io/projected/c1df2ad2-feac-487c-ab26-e885457d7979-kube-api-access-6s2g7\") pod \"c1df2ad2-feac-487c-ab26-e885457d7979\" (UID: \"c1df2ad2-feac-487c-ab26-e885457d7979\") " Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.186937 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83082054-f3da-4455-b4af-5232e439042c-operator-scripts\") pod \"83082054-f3da-4455-b4af-5232e439042c\" (UID: \"83082054-f3da-4455-b4af-5232e439042c\") " Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.187056 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9c338c-01c8-428b-89cb-4c4a59505595-operator-scripts\") pod \"0c9c338c-01c8-428b-89cb-4c4a59505595\" (UID: \"0c9c338c-01c8-428b-89cb-4c4a59505595\") " Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.187148 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shfsf\" (UniqueName: \"kubernetes.io/projected/8f9ae762-5d7d-4d41-9477-b4cc72689803-kube-api-access-shfsf\") pod \"8f9ae762-5d7d-4d41-9477-b4cc72689803\" (UID: \"8f9ae762-5d7d-4d41-9477-b4cc72689803\") " Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.187186 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1df2ad2-feac-487c-ab26-e885457d7979-operator-scripts\") pod \"c1df2ad2-feac-487c-ab26-e885457d7979\" (UID: \"c1df2ad2-feac-487c-ab26-e885457d7979\") " Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.187222 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-457tb\" (UniqueName: \"kubernetes.io/projected/9feddb28-c165-4784-94a8-4d63209fda46-kube-api-access-457tb\") pod \"9feddb28-c165-4784-94a8-4d63209fda46\" (UID: \"9feddb28-c165-4784-94a8-4d63209fda46\") " Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.187260 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d9kc\" (UniqueName: \"kubernetes.io/projected/0c9c338c-01c8-428b-89cb-4c4a59505595-kube-api-access-7d9kc\") pod \"0c9c338c-01c8-428b-89cb-4c4a59505595\" (UID: \"0c9c338c-01c8-428b-89cb-4c4a59505595\") " Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.187551 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9feddb28-c165-4784-94a8-4d63209fda46-operator-scripts\") pod \"9feddb28-c165-4784-94a8-4d63209fda46\" (UID: \"9feddb28-c165-4784-94a8-4d63209fda46\") " Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.187577 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8f9ae762-5d7d-4d41-9477-b4cc72689803-operator-scripts\") pod \"8f9ae762-5d7d-4d41-9477-b4cc72689803\" (UID: \"8f9ae762-5d7d-4d41-9477-b4cc72689803\") " Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.187606 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nncxb\" (UniqueName: \"kubernetes.io/projected/1b441fb5-f89b-4ec1-8399-b3f56fdf139c-kube-api-access-nncxb\") pod \"1b441fb5-f89b-4ec1-8399-b3f56fdf139c\" (UID: \"1b441fb5-f89b-4ec1-8399-b3f56fdf139c\") " Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.188510 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1df2ad2-feac-487c-ab26-e885457d7979-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c1df2ad2-feac-487c-ab26-e885457d7979" (UID: "c1df2ad2-feac-487c-ab26-e885457d7979"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.188679 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83082054-f3da-4455-b4af-5232e439042c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83082054-f3da-4455-b4af-5232e439042c" (UID: "83082054-f3da-4455-b4af-5232e439042c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.188679 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f9ae762-5d7d-4d41-9477-b4cc72689803-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f9ae762-5d7d-4d41-9477-b4cc72689803" (UID: "8f9ae762-5d7d-4d41-9477-b4cc72689803"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.188736 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9feddb28-c165-4784-94a8-4d63209fda46-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9feddb28-c165-4784-94a8-4d63209fda46" (UID: "9feddb28-c165-4784-94a8-4d63209fda46"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.188782 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb56749-bb8b-4620-a1a9-8e1f2a70f1b2-operator-scripts\") pod \"edb56749-bb8b-4620-a1a9-8e1f2a70f1b2\" (UID: \"edb56749-bb8b-4620-a1a9-8e1f2a70f1b2\") " Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.188840 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b441fb5-f89b-4ec1-8399-b3f56fdf139c-operator-scripts\") pod \"1b441fb5-f89b-4ec1-8399-b3f56fdf139c\" (UID: \"1b441fb5-f89b-4ec1-8399-b3f56fdf139c\") " Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.188964 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7pq4\" (UniqueName: \"kubernetes.io/projected/83082054-f3da-4455-b4af-5232e439042c-kube-api-access-h7pq4\") pod \"83082054-f3da-4455-b4af-5232e439042c\" (UID: \"83082054-f3da-4455-b4af-5232e439042c\") " Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.189162 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c9c338c-01c8-428b-89cb-4c4a59505595-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c9c338c-01c8-428b-89cb-4c4a59505595" (UID: "0c9c338c-01c8-428b-89cb-4c4a59505595"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.189514 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b441fb5-f89b-4ec1-8399-b3f56fdf139c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b441fb5-f89b-4ec1-8399-b3f56fdf139c" (UID: "1b441fb5-f89b-4ec1-8399-b3f56fdf139c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.189699 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edb56749-bb8b-4620-a1a9-8e1f2a70f1b2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "edb56749-bb8b-4620-a1a9-8e1f2a70f1b2" (UID: "edb56749-bb8b-4620-a1a9-8e1f2a70f1b2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.191008 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9feddb28-c165-4784-94a8-4d63209fda46-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.191029 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f9ae762-5d7d-4d41-9477-b4cc72689803-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.191100 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b441fb5-f89b-4ec1-8399-b3f56fdf139c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.191123 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83082054-f3da-4455-b4af-5232e439042c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.191132 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9c338c-01c8-428b-89cb-4c4a59505595-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.191140 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1df2ad2-feac-487c-ab26-e885457d7979-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.194492 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9feddb28-c165-4784-94a8-4d63209fda46-kube-api-access-457tb" (OuterVolumeSpecName: "kube-api-access-457tb") pod "9feddb28-c165-4784-94a8-4d63209fda46" (UID: "9feddb28-c165-4784-94a8-4d63209fda46"). InnerVolumeSpecName "kube-api-access-457tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.199196 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1df2ad2-feac-487c-ab26-e885457d7979-kube-api-access-6s2g7" (OuterVolumeSpecName: "kube-api-access-6s2g7") pod "c1df2ad2-feac-487c-ab26-e885457d7979" (UID: "c1df2ad2-feac-487c-ab26-e885457d7979"). InnerVolumeSpecName "kube-api-access-6s2g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.199061 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f9ae762-5d7d-4d41-9477-b4cc72689803-kube-api-access-shfsf" (OuterVolumeSpecName: "kube-api-access-shfsf") pod "8f9ae762-5d7d-4d41-9477-b4cc72689803" (UID: "8f9ae762-5d7d-4d41-9477-b4cc72689803"). InnerVolumeSpecName "kube-api-access-shfsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.201501 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83082054-f3da-4455-b4af-5232e439042c-kube-api-access-h7pq4" (OuterVolumeSpecName: "kube-api-access-h7pq4") pod "83082054-f3da-4455-b4af-5232e439042c" (UID: "83082054-f3da-4455-b4af-5232e439042c"). InnerVolumeSpecName "kube-api-access-h7pq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.206529 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b441fb5-f89b-4ec1-8399-b3f56fdf139c-kube-api-access-nncxb" (OuterVolumeSpecName: "kube-api-access-nncxb") pod "1b441fb5-f89b-4ec1-8399-b3f56fdf139c" (UID: "1b441fb5-f89b-4ec1-8399-b3f56fdf139c"). InnerVolumeSpecName "kube-api-access-nncxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.206640 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edb56749-bb8b-4620-a1a9-8e1f2a70f1b2-kube-api-access-jrflm" (OuterVolumeSpecName: "kube-api-access-jrflm") pod "edb56749-bb8b-4620-a1a9-8e1f2a70f1b2" (UID: "edb56749-bb8b-4620-a1a9-8e1f2a70f1b2"). InnerVolumeSpecName "kube-api-access-jrflm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.207439 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c9c338c-01c8-428b-89cb-4c4a59505595-kube-api-access-7d9kc" (OuterVolumeSpecName: "kube-api-access-7d9kc") pod "0c9c338c-01c8-428b-89cb-4c4a59505595" (UID: "0c9c338c-01c8-428b-89cb-4c4a59505595"). InnerVolumeSpecName "kube-api-access-7d9kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.252950 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-mbn5h"] Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.292979 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nncxb\" (UniqueName: \"kubernetes.io/projected/1b441fb5-f89b-4ec1-8399-b3f56fdf139c-kube-api-access-nncxb\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.293021 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb56749-bb8b-4620-a1a9-8e1f2a70f1b2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.293032 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7pq4\" (UniqueName: \"kubernetes.io/projected/83082054-f3da-4455-b4af-5232e439042c-kube-api-access-h7pq4\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.293040 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrflm\" (UniqueName: \"kubernetes.io/projected/edb56749-bb8b-4620-a1a9-8e1f2a70f1b2-kube-api-access-jrflm\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.293049 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s2g7\" (UniqueName: \"kubernetes.io/projected/c1df2ad2-feac-487c-ab26-e885457d7979-kube-api-access-6s2g7\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.293057 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shfsf\" (UniqueName: \"kubernetes.io/projected/8f9ae762-5d7d-4d41-9477-b4cc72689803-kube-api-access-shfsf\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.293066 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-457tb\" (UniqueName: \"kubernetes.io/projected/9feddb28-c165-4784-94a8-4d63209fda46-kube-api-access-457tb\") on node 
\"crc\" DevicePath \"\"" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.293075 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d9kc\" (UniqueName: \"kubernetes.io/projected/0c9c338c-01c8-428b-89cb-4c4a59505595-kube-api-access-7d9kc\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.858212 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x7vz5" event={"ID":"15541e12-c0a2-4c26-b912-d33be48eea77","Type":"ContainerStarted","Data":"be50d43caf62d2a431c11901c4e5ec9dccfaaaf18ef468a0f27c9c5a42afd31d"} Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.859953 4723 generic.go:334] "Generic (PLEG): container finished" podID="09188288-5f42-4c86-9f80-8d46d5544d93" containerID="c04bc17b8acd937090b2d9a8fb2492fc51726c0872eaff74b35078e81d4ddbee" exitCode=0 Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.860050 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-b554g" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.861014 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-32ad-account-create-update-8p5t9" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.861091 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sbt7t" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.861089 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" event={"ID":"09188288-5f42-4c86-9f80-8d46d5544d93","Type":"ContainerDied","Data":"c04bc17b8acd937090b2d9a8fb2492fc51726c0872eaff74b35078e81d4ddbee"} Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.861124 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-af49-account-create-update-bwsv7" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.861134 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" event={"ID":"09188288-5f42-4c86-9f80-8d46d5544d93","Type":"ContainerStarted","Data":"71c66b5881fc76db2706c91310e2b8ebfbe3d90f1ded516b6ea656f57e30c73f"} Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.861140 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-6f11-account-create-update-gprhw" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.861169 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-88pwc" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.861254 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-cwx46" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.894824 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-x7vz5" podStartSLOduration=7.897334822 podStartE2EDuration="13.894802529s" podCreationTimestamp="2026-03-09 13:20:20 +0000 UTC" firstStartedPulling="2026-03-09 13:20:26.815687035 +0000 UTC m=+1300.830154575" lastFinishedPulling="2026-03-09 13:20:32.813154742 +0000 UTC m=+1306.827622282" observedRunningTime="2026-03-09 13:20:33.882667138 +0000 UTC m=+1307.897134698" watchObservedRunningTime="2026-03-09 13:20:33.894802529 +0000 UTC m=+1307.909270069" Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.946476 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:20:33 crc kubenswrapper[4723]: I0309 13:20:33.946580 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:20:34 crc kubenswrapper[4723]: I0309 13:20:34.874552 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" event={"ID":"09188288-5f42-4c86-9f80-8d46d5544d93","Type":"ContainerStarted","Data":"471e7e966bf9470d41726c3299a2ac966405c5f7a8df9e40b514d853e2f001b3"} Mar 09 13:20:34 crc kubenswrapper[4723]: I0309 13:20:34.876143 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:34 crc kubenswrapper[4723]: I0309 13:20:34.910493 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" podStartSLOduration=4.910459369 podStartE2EDuration="4.910459369s" podCreationTimestamp="2026-03-09 13:20:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:34.901386699 +0000 UTC m=+1308.915854279" watchObservedRunningTime="2026-03-09 13:20:34.910459369 +0000 UTC m=+1308.924926949" Mar 09 13:20:36 crc kubenswrapper[4723]: I0309 13:20:36.897312 4723 generic.go:334] "Generic (PLEG): container finished" podID="15541e12-c0a2-4c26-b912-d33be48eea77" containerID="be50d43caf62d2a431c11901c4e5ec9dccfaaaf18ef468a0f27c9c5a42afd31d" exitCode=0 Mar 09 13:20:36 crc kubenswrapper[4723]: I0309 13:20:36.897408 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x7vz5" event={"ID":"15541e12-c0a2-4c26-b912-d33be48eea77","Type":"ContainerDied","Data":"be50d43caf62d2a431c11901c4e5ec9dccfaaaf18ef468a0f27c9c5a42afd31d"} Mar 09 13:20:37 crc kubenswrapper[4723]: I0309 13:20:37.909924 4723 generic.go:334] "Generic (PLEG): container finished" podID="3e47df78-6587-4f83-a1c9-dcaf0aa9b73c" containerID="0f20df8ea1077c74474549ecfc6a71e72e4c6b5ed40ae1563af70106456a0dc9" exitCode=0 Mar 09 13:20:37 crc kubenswrapper[4723]: I0309 13:20:37.910014 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c","Type":"ContainerDied","Data":"0f20df8ea1077c74474549ecfc6a71e72e4c6b5ed40ae1563af70106456a0dc9"} Mar 09 13:20:38 crc kubenswrapper[4723]: I0309 13:20:38.369303 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x7vz5" Mar 09 13:20:38 crc kubenswrapper[4723]: I0309 13:20:38.512948 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15541e12-c0a2-4c26-b912-d33be48eea77-config-data\") pod \"15541e12-c0a2-4c26-b912-d33be48eea77\" (UID: \"15541e12-c0a2-4c26-b912-d33be48eea77\") " Mar 09 13:20:38 crc kubenswrapper[4723]: I0309 13:20:38.513081 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd6bp\" (UniqueName: \"kubernetes.io/projected/15541e12-c0a2-4c26-b912-d33be48eea77-kube-api-access-cd6bp\") pod \"15541e12-c0a2-4c26-b912-d33be48eea77\" (UID: \"15541e12-c0a2-4c26-b912-d33be48eea77\") " Mar 09 13:20:38 crc kubenswrapper[4723]: I0309 13:20:38.513261 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15541e12-c0a2-4c26-b912-d33be48eea77-combined-ca-bundle\") pod \"15541e12-c0a2-4c26-b912-d33be48eea77\" (UID: \"15541e12-c0a2-4c26-b912-d33be48eea77\") " Mar 09 13:20:38 crc kubenswrapper[4723]: I0309 13:20:38.521838 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15541e12-c0a2-4c26-b912-d33be48eea77-kube-api-access-cd6bp" (OuterVolumeSpecName: "kube-api-access-cd6bp") pod "15541e12-c0a2-4c26-b912-d33be48eea77" (UID: "15541e12-c0a2-4c26-b912-d33be48eea77"). InnerVolumeSpecName "kube-api-access-cd6bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:38 crc kubenswrapper[4723]: I0309 13:20:38.611767 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15541e12-c0a2-4c26-b912-d33be48eea77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15541e12-c0a2-4c26-b912-d33be48eea77" (UID: "15541e12-c0a2-4c26-b912-d33be48eea77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:20:38 crc kubenswrapper[4723]: I0309 13:20:38.615302 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd6bp\" (UniqueName: \"kubernetes.io/projected/15541e12-c0a2-4c26-b912-d33be48eea77-kube-api-access-cd6bp\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:38 crc kubenswrapper[4723]: I0309 13:20:38.615337 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15541e12-c0a2-4c26-b912-d33be48eea77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:38 crc kubenswrapper[4723]: I0309 13:20:38.639086 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15541e12-c0a2-4c26-b912-d33be48eea77-config-data" (OuterVolumeSpecName: "config-data") pod "15541e12-c0a2-4c26-b912-d33be48eea77" (UID: "15541e12-c0a2-4c26-b912-d33be48eea77"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:20:38 crc kubenswrapper[4723]: I0309 13:20:38.717038 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15541e12-c0a2-4c26-b912-d33be48eea77-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:38 crc kubenswrapper[4723]: I0309 13:20:38.922257 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c","Type":"ContainerStarted","Data":"23f519adafe2109d998e5940ac2f8529404610a594cb66828de81e1012d22033"} Mar 09 13:20:38 crc kubenswrapper[4723]: I0309 13:20:38.924814 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x7vz5" event={"ID":"15541e12-c0a2-4c26-b912-d33be48eea77","Type":"ContainerDied","Data":"f7bfa7b6b0cc3e213e8cf64be2833627e19264315a0ef180444ea44a49b352bd"} Mar 09 13:20:38 crc kubenswrapper[4723]: I0309 13:20:38.924923 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7bfa7b6b0cc3e213e8cf64be2833627e19264315a0ef180444ea44a49b352bd" Mar 09 13:20:38 crc kubenswrapper[4723]: I0309 13:20:38.924879 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x7vz5" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.178644 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bnfpr"] Mar 09 13:20:39 crc kubenswrapper[4723]: E0309 13:20:39.181032 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9c338c-01c8-428b-89cb-4c4a59505595" containerName="mariadb-database-create" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181049 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9c338c-01c8-428b-89cb-4c4a59505595" containerName="mariadb-database-create" Mar 09 13:20:39 crc kubenswrapper[4723]: E0309 13:20:39.181071 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b441fb5-f89b-4ec1-8399-b3f56fdf139c" containerName="mariadb-account-create-update" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181077 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b441fb5-f89b-4ec1-8399-b3f56fdf139c" containerName="mariadb-account-create-update" Mar 09 13:20:39 crc kubenswrapper[4723]: E0309 13:20:39.181089 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f9ae762-5d7d-4d41-9477-b4cc72689803" containerName="mariadb-database-create" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181096 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f9ae762-5d7d-4d41-9477-b4cc72689803" containerName="mariadb-database-create" Mar 09 13:20:39 crc kubenswrapper[4723]: E0309 13:20:39.181104 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15541e12-c0a2-4c26-b912-d33be48eea77" containerName="keystone-db-sync" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181110 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="15541e12-c0a2-4c26-b912-d33be48eea77" containerName="keystone-db-sync" Mar 09 13:20:39 crc kubenswrapper[4723]: E0309 13:20:39.181131 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb56749-bb8b-4620-a1a9-8e1f2a70f1b2" containerName="mariadb-account-create-update" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181136 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb56749-bb8b-4620-a1a9-8e1f2a70f1b2" 
containerName="mariadb-account-create-update" Mar 09 13:20:39 crc kubenswrapper[4723]: E0309 13:20:39.181144 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9feddb28-c165-4784-94a8-4d63209fda46" containerName="mariadb-database-create" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181150 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9feddb28-c165-4784-94a8-4d63209fda46" containerName="mariadb-database-create" Mar 09 13:20:39 crc kubenswrapper[4723]: E0309 13:20:39.181164 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83082054-f3da-4455-b4af-5232e439042c" containerName="mariadb-account-create-update" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181169 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="83082054-f3da-4455-b4af-5232e439042c" containerName="mariadb-account-create-update" Mar 09 13:20:39 crc kubenswrapper[4723]: E0309 13:20:39.181185 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1df2ad2-feac-487c-ab26-e885457d7979" containerName="mariadb-database-create" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181192 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1df2ad2-feac-487c-ab26-e885457d7979" containerName="mariadb-database-create" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181383 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b441fb5-f89b-4ec1-8399-b3f56fdf139c" containerName="mariadb-account-create-update" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181396 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1df2ad2-feac-487c-ab26-e885457d7979" containerName="mariadb-database-create" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181408 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="83082054-f3da-4455-b4af-5232e439042c" containerName="mariadb-account-create-update" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181418 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9c338c-01c8-428b-89cb-4c4a59505595" containerName="mariadb-database-create" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181429 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="9feddb28-c165-4784-94a8-4d63209fda46" containerName="mariadb-database-create" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181440 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f9ae762-5d7d-4d41-9477-b4cc72689803" containerName="mariadb-database-create" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181451 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="15541e12-c0a2-4c26-b912-d33be48eea77" containerName="keystone-db-sync" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.181460 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb56749-bb8b-4620-a1a9-8e1f2a70f1b2" containerName="mariadb-account-create-update" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.182160 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.188014 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.188221 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.188341 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jxn5s" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.189279 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.203213 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.205235 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bnfpr"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.228197 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-mbn5h"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.228675 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" podUID="09188288-5f42-4c86-9f80-8d46d5544d93" containerName="dnsmasq-dns" containerID="cri-o://471e7e966bf9470d41726c3299a2ac966405c5f7a8df9e40b514d853e2f001b3" gracePeriod=10 Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.237687 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.278012 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-h9t6r"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.282129 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.307493 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-h9t6r"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.350224 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-combined-ca-bundle\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.350271 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-config-data\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.350293 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-scripts\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.350354 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-fernet-keys\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.350433 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r9wk\" (UniqueName: \"kubernetes.io/projected/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-kube-api-access-9r9wk\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.350478 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-credential-keys\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.355519 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-lf8bq"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.357586 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-lf8bq" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.367438 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-f69fq" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.367832 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.433324 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-lf8bq"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454120 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-combined-ca-bundle\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454174 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454213 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hqhs\" (UniqueName: \"kubernetes.io/projected/ef9c0fa1-111e-4aed-9838-1bbda77164dd-kube-api-access-5hqhs\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454240 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-config-data\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454264 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-scripts\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454288 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc6n2\" (UniqueName: \"kubernetes.io/projected/90dea403-5a65-4824-ac5b-5c34c828d616-kube-api-access-pc6n2\") pod \"heat-db-sync-lf8bq\" (UID: \"90dea403-5a65-4824-ac5b-5c34c828d616\") " pod="openstack/heat-db-sync-lf8bq" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454329 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454366 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454402 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-fernet-keys\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454445 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dea403-5a65-4824-ac5b-5c34c828d616-combined-ca-bundle\") pod \"heat-db-sync-lf8bq\" (UID: \"90dea403-5a65-4824-ac5b-5c34c828d616\") " pod="openstack/heat-db-sync-lf8bq" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454484 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-config\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454520 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r9wk\" (UniqueName: \"kubernetes.io/projected/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-kube-api-access-9r9wk\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454539 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-dns-svc\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454574 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dea403-5a65-4824-ac5b-5c34c828d616-config-data\") pod \"heat-db-sync-lf8bq\" (UID: \"90dea403-5a65-4824-ac5b-5c34c828d616\") " pod="openstack/heat-db-sync-lf8bq" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.454601 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-credential-keys\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.472357 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-scripts\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.485710 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-config-data\") pod \"keystone-bootstrap-bnfpr\" (UID: 
\"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.486222 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-fernet-keys\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.487440 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-credential-keys\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.493253 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-combined-ca-bundle\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.498898 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r9wk\" (UniqueName: \"kubernetes.io/projected/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-kube-api-access-9r9wk\") pod \"keystone-bootstrap-bnfpr\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.507976 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-bm79v"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.509307 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.514147 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.514354 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-cd95p" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.514556 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.514643 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.521318 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bm79v"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.562824 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wvbmm"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.563683 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-config\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.564385 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wvbmm" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.562934 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-config\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.565066 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-dns-svc\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.565133 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dea403-5a65-4824-ac5b-5c34c828d616-config-data\") pod \"heat-db-sync-lf8bq\" (UID: \"90dea403-5a65-4824-ac5b-5c34c828d616\") " pod="openstack/heat-db-sync-lf8bq" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.565235 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.565261 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hqhs\" (UniqueName: \"kubernetes.io/projected/ef9c0fa1-111e-4aed-9838-1bbda77164dd-kube-api-access-5hqhs\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.565291 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc6n2\" (UniqueName: \"kubernetes.io/projected/90dea403-5a65-4824-ac5b-5c34c828d616-kube-api-access-pc6n2\") pod \"heat-db-sync-lf8bq\" (UID: \"90dea403-5a65-4824-ac5b-5c34c828d616\") " pod="openstack/heat-db-sync-lf8bq" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.565335 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.565373 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.565473 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dea403-5a65-4824-ac5b-5c34c828d616-combined-ca-bundle\") pod \"heat-db-sync-lf8bq\" (UID: \"90dea403-5a65-4824-ac5b-5c34c828d616\") " pod="openstack/heat-db-sync-lf8bq" Mar 09 13:20:39 crc kubenswrapper[4723]: 
I0309 13:20:39.565744 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-dns-svc\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.566525 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.567188 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.567815 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.569294 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.570705 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dea403-5a65-4824-ac5b-5c34c828d616-config-data\") pod \"heat-db-sync-lf8bq\" (UID: \"90dea403-5a65-4824-ac5b-5c34c828d616\") " pod="openstack/heat-db-sync-lf8bq" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.571208 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-q6xhf" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.571425 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dea403-5a65-4824-ac5b-5c34c828d616-combined-ca-bundle\") pod \"heat-db-sync-lf8bq\" (UID: \"90dea403-5a65-4824-ac5b-5c34c828d616\") " pod="openstack/heat-db-sync-lf8bq" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.583524 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wvbmm"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.608818 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc6n2\" (UniqueName: \"kubernetes.io/projected/90dea403-5a65-4824-ac5b-5c34c828d616-kube-api-access-pc6n2\") pod \"heat-db-sync-lf8bq\" (UID: \"90dea403-5a65-4824-ac5b-5c34c828d616\") " pod="openstack/heat-db-sync-lf8bq" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.624204 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hqhs\" (UniqueName: \"kubernetes.io/projected/ef9c0fa1-111e-4aed-9838-1bbda77164dd-kube-api-access-5hqhs\") pod \"dnsmasq-dns-55fff446b9-h9t6r\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.667293 4723 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-db-sync-config-data\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.667358 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-combined-ca-bundle\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.667418 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-combined-ca-bundle\") pod \"barbican-db-sync-wvbmm\" (UID: \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\") " pod="openstack/barbican-db-sync-wvbmm" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.667440 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-scripts\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.667485 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57c8r\" (UniqueName: \"kubernetes.io/projected/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-kube-api-access-57c8r\") pod \"barbican-db-sync-wvbmm\" (UID: \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\") " pod="openstack/barbican-db-sync-wvbmm" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.667511 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-config-data\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.667540 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwtpj\" (UniqueName: \"kubernetes.io/projected/191baa15-4ac5-4e55-9f87-751eddffb83e-kube-api-access-vwtpj\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.667562 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/191baa15-4ac5-4e55-9f87-751eddffb83e-etc-machine-id\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.667603 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-db-sync-config-data\") pod \"barbican-db-sync-wvbmm\" (UID: \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\") " pod="openstack/barbican-db-sync-wvbmm" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 
13:20:39.669927 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-pf8wv"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.671265 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pf8wv" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.673585 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.678431 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.678634 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qq99h" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.681328 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pf8wv"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.721137 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-h9t6r"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.722187 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.755329 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-cp5rk"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.756739 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.760813 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-s28v7" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.761100 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.761219 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.766506 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-lf8bq" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.769135 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8846c8f3-62f3-4053-8b48-177d011dd0c9-config\") pod \"neutron-db-sync-pf8wv\" (UID: \"8846c8f3-62f3-4053-8b48-177d011dd0c9\") " pod="openstack/neutron-db-sync-pf8wv" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.769202 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-combined-ca-bundle\") pod \"barbican-db-sync-wvbmm\" (UID: \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\") " pod="openstack/barbican-db-sync-wvbmm" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.769223 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-scripts\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.769263 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57c8r\" (UniqueName: \"kubernetes.io/projected/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-kube-api-access-57c8r\") pod \"barbican-db-sync-wvbmm\" (UID: \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\") " pod="openstack/barbican-db-sync-wvbmm" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.769287 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-config-data\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.769302 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4d2g\" (UniqueName: \"kubernetes.io/projected/8846c8f3-62f3-4053-8b48-177d011dd0c9-kube-api-access-k4d2g\") pod \"neutron-db-sync-pf8wv\" (UID: \"8846c8f3-62f3-4053-8b48-177d011dd0c9\") " pod="openstack/neutron-db-sync-pf8wv" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.769329 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwtpj\" (UniqueName: \"kubernetes.io/projected/191baa15-4ac5-4e55-9f87-751eddffb83e-kube-api-access-vwtpj\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.769348 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/191baa15-4ac5-4e55-9f87-751eddffb83e-etc-machine-id\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.769388 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-db-sync-config-data\") pod \"barbican-db-sync-wvbmm\" (UID: \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\") " pod="openstack/barbican-db-sync-wvbmm" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.769415 4723 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8846c8f3-62f3-4053-8b48-177d011dd0c9-combined-ca-bundle\") pod \"neutron-db-sync-pf8wv\" (UID: \"8846c8f3-62f3-4053-8b48-177d011dd0c9\") " pod="openstack/neutron-db-sync-pf8wv" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.769448 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-db-sync-config-data\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.769475 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-combined-ca-bundle\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.784158 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/191baa15-4ac5-4e55-9f87-751eddffb83e-etc-machine-id\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.784550 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-combined-ca-bundle\") pod \"barbican-db-sync-wvbmm\" (UID: \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\") " pod="openstack/barbican-db-sync-wvbmm" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.784596 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-combined-ca-bundle\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.799077 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-db-sync-config-data\") pod \"barbican-db-sync-wvbmm\" (UID: \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\") " pod="openstack/barbican-db-sync-wvbmm" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.800106 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-scripts\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.800910 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwtpj\" (UniqueName: \"kubernetes.io/projected/191baa15-4ac5-4e55-9f87-751eddffb83e-kube-api-access-vwtpj\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.805457 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57c8r\" (UniqueName: 
\"kubernetes.io/projected/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-kube-api-access-57c8r\") pod \"barbican-db-sync-wvbmm\" (UID: \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\") " pod="openstack/barbican-db-sync-wvbmm" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.808728 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-config-data\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.810630 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-db-sync-config-data\") pod \"cinder-db-sync-bm79v\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.825175 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cp5rk"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.840542 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-8z98d"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.844565 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.853690 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-8z98d"] Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.871390 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-combined-ca-bundle\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.871446 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4d2g\" (UniqueName: \"kubernetes.io/projected/8846c8f3-62f3-4053-8b48-177d011dd0c9-kube-api-access-k4d2g\") pod \"neutron-db-sync-pf8wv\" (UID: \"8846c8f3-62f3-4053-8b48-177d011dd0c9\") " pod="openstack/neutron-db-sync-pf8wv" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.871522 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6832d621-3d7d-4e4a-824b-f219746aaa89-logs\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.871572 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8846c8f3-62f3-4053-8b48-177d011dd0c9-combined-ca-bundle\") pod \"neutron-db-sync-pf8wv\" (UID: \"8846c8f3-62f3-4053-8b48-177d011dd0c9\") " pod="openstack/neutron-db-sync-pf8wv" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.871605 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-scripts\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:39 crc 
kubenswrapper[4723]: I0309 13:20:39.871647 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lvdx\" (UniqueName: \"kubernetes.io/projected/6832d621-3d7d-4e4a-824b-f219746aaa89-kube-api-access-5lvdx\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.871671 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8846c8f3-62f3-4053-8b48-177d011dd0c9-config\") pod \"neutron-db-sync-pf8wv\" (UID: \"8846c8f3-62f3-4053-8b48-177d011dd0c9\") " pod="openstack/neutron-db-sync-pf8wv" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.871761 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-config-data\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.875390 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8846c8f3-62f3-4053-8b48-177d011dd0c9-combined-ca-bundle\") pod \"neutron-db-sync-pf8wv\" (UID: \"8846c8f3-62f3-4053-8b48-177d011dd0c9\") " pod="openstack/neutron-db-sync-pf8wv" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.883467 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8846c8f3-62f3-4053-8b48-177d011dd0c9-config\") pod \"neutron-db-sync-pf8wv\" (UID: \"8846c8f3-62f3-4053-8b48-177d011dd0c9\") " pod="openstack/neutron-db-sync-pf8wv" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.890236 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4d2g\" (UniqueName: \"kubernetes.io/projected/8846c8f3-62f3-4053-8b48-177d011dd0c9-kube-api-access-k4d2g\") pod \"neutron-db-sync-pf8wv\" (UID: \"8846c8f3-62f3-4053-8b48-177d011dd0c9\") " pod="openstack/neutron-db-sync-pf8wv" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.961772 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-bm79v" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.974222 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xr48\" (UniqueName: \"kubernetes.io/projected/470074a0-571f-4f50-a275-bb91881d9d85-kube-api-access-9xr48\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.974304 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-config-data\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.974370 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.974401 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-combined-ca-bundle\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.974490 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6832d621-3d7d-4e4a-824b-f219746aaa89-logs\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.974512 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-config\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.974540 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.974583 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.974616 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-scripts\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " 
pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.974663 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.974685 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lvdx\" (UniqueName: \"kubernetes.io/projected/6832d621-3d7d-4e4a-824b-f219746aaa89-kube-api-access-5lvdx\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.977574 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6832d621-3d7d-4e4a-824b-f219746aaa89-logs\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.983490 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wvbmm" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.985232 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-combined-ca-bundle\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.988970 4723 generic.go:334] "Generic (PLEG): container finished" podID="09188288-5f42-4c86-9f80-8d46d5544d93" containerID="471e7e966bf9470d41726c3299a2ac966405c5f7a8df9e40b514d853e2f001b3" exitCode=0 Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.989064 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" event={"ID":"09188288-5f42-4c86-9f80-8d46d5544d93","Type":"ContainerDied","Data":"471e7e966bf9470d41726c3299a2ac966405c5f7a8df9e40b514d853e2f001b3"} Mar 09 13:20:39 crc kubenswrapper[4723]: I0309 13:20:39.996680 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-scripts\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.012746 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-config-data\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.014652 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lvdx\" (UniqueName: \"kubernetes.io/projected/6832d621-3d7d-4e4a-824b-f219746aaa89-kube-api-access-5lvdx\") pod \"placement-db-sync-cp5rk\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") " pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.032016 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pf8wv" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.034256 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b5gc4" event={"ID":"72dc18cb-be01-4378-b62c-609a2c237731","Type":"ContainerStarted","Data":"99ed3f717a597eff2118d8c9217456fe28f4259766a3a003f976b0720a42d3e3"} Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.075667 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-b5gc4" podStartSLOduration=3.2844544190000002 podStartE2EDuration="33.075643779s" podCreationTimestamp="2026-03-09 13:20:07 +0000 UTC" firstStartedPulling="2026-03-09 13:20:08.564591222 +0000 UTC m=+1282.579058762" lastFinishedPulling="2026-03-09 13:20:38.355780582 +0000 UTC m=+1312.370248122" observedRunningTime="2026-03-09 13:20:40.062256934 +0000 UTC m=+1314.076724474" watchObservedRunningTime="2026-03-09 13:20:40.075643779 +0000 UTC m=+1314.090111319" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.076852 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.090004 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xr48\" (UniqueName: \"kubernetes.io/projected/470074a0-571f-4f50-a275-bb91881d9d85-kube-api-access-9xr48\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.090264 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.090987 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-config\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.091036 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.109561 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.078025 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-ovsdbserver-sb\") pod 
\"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.110592 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-config\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.111128 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.113680 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.114419 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.115032 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cp5rk" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.164021 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xr48\" (UniqueName: \"kubernetes.io/projected/470074a0-571f-4f50-a275-bb91881d9d85-kube-api-access-9xr48\") pod \"dnsmasq-dns-76fcf4b695-8z98d\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.172835 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.394413 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.429368 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:20:40 crc kubenswrapper[4723]: E0309 13:20:40.430275 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09188288-5f42-4c86-9f80-8d46d5544d93" containerName="dnsmasq-dns" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.430288 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="09188288-5f42-4c86-9f80-8d46d5544d93" containerName="dnsmasq-dns" Mar 09 13:20:40 crc kubenswrapper[4723]: E0309 13:20:40.430313 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09188288-5f42-4c86-9f80-8d46d5544d93" containerName="init" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.430319 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="09188288-5f42-4c86-9f80-8d46d5544d93" containerName="init" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.451141 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="09188288-5f42-4c86-9f80-8d46d5544d93" containerName="dnsmasq-dns" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.460265 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.462078 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.462505 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.463016 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.516251 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bnfpr"] Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.570547 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-dns-svc\") pod \"09188288-5f42-4c86-9f80-8d46d5544d93\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.570620 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-config\") pod \"09188288-5f42-4c86-9f80-8d46d5544d93\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.570646 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqm4x\" (UniqueName: \"kubernetes.io/projected/09188288-5f42-4c86-9f80-8d46d5544d93-kube-api-access-nqm4x\") pod \"09188288-5f42-4c86-9f80-8d46d5544d93\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.570769 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-ovsdbserver-nb\") pod \"09188288-5f42-4c86-9f80-8d46d5544d93\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.570934 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-dns-swift-storage-0\") pod \"09188288-5f42-4c86-9f80-8d46d5544d93\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.571029 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-ovsdbserver-sb\") pod \"09188288-5f42-4c86-9f80-8d46d5544d93\" (UID: \"09188288-5f42-4c86-9f80-8d46d5544d93\") " Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.571321 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.571427 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-scripts\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.571448 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.571485 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-config-data\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.571535 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e0bf458-3488-4d3a-80ac-d9cf2f655791-run-httpd\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.571566 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e0bf458-3488-4d3a-80ac-d9cf2f655791-log-httpd\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.571584 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fgl4\" (UniqueName: \"kubernetes.io/projected/1e0bf458-3488-4d3a-80ac-d9cf2f655791-kube-api-access-5fgl4\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.679203 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e0bf458-3488-4d3a-80ac-d9cf2f655791-run-httpd\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 
13:20:40.679468 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e0bf458-3488-4d3a-80ac-d9cf2f655791-log-httpd\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.679486 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fgl4\" (UniqueName: \"kubernetes.io/projected/1e0bf458-3488-4d3a-80ac-d9cf2f655791-kube-api-access-5fgl4\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.679612 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.679662 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-scripts\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.679678 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.679711 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-config-data\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.680999 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e0bf458-3488-4d3a-80ac-d9cf2f655791-run-httpd\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.681215 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e0bf458-3488-4d3a-80ac-d9cf2f655791-log-httpd\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.736919 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-config-data\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.737460 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.745509 4723 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.746915 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-scripts\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.797699 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fgl4\" (UniqueName: \"kubernetes.io/projected/1e0bf458-3488-4d3a-80ac-d9cf2f655791-kube-api-access-5fgl4\") pod \"ceilometer-0\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.800393 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09188288-5f42-4c86-9f80-8d46d5544d93-kube-api-access-nqm4x" (OuterVolumeSpecName: "kube-api-access-nqm4x") pod "09188288-5f42-4c86-9f80-8d46d5544d93" (UID: "09188288-5f42-4c86-9f80-8d46d5544d93"). InnerVolumeSpecName "kube-api-access-nqm4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.858620 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:20:40 crc kubenswrapper[4723]: I0309 13:20:40.887927 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqm4x\" (UniqueName: \"kubernetes.io/projected/09188288-5f42-4c86-9f80-8d46d5544d93-kube-api-access-nqm4x\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.003652 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "09188288-5f42-4c86-9f80-8d46d5544d93" (UID: "09188288-5f42-4c86-9f80-8d46d5544d93"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.128214 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.180343 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-config" (OuterVolumeSpecName: "config") pod "09188288-5f42-4c86-9f80-8d46d5544d93" (UID: "09188288-5f42-4c86-9f80-8d46d5544d93"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.191443 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.231385 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.240568 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "09188288-5f42-4c86-9f80-8d46d5544d93" (UID: "09188288-5f42-4c86-9f80-8d46d5544d93"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.303794 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "09188288-5f42-4c86-9f80-8d46d5544d93" (UID: "09188288-5f42-4c86-9f80-8d46d5544d93"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.338198 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.338228 4723 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.806226 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "09188288-5f42-4c86-9f80-8d46d5544d93" (UID: "09188288-5f42-4c86-9f80-8d46d5544d93"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.850345 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09188288-5f42-4c86-9f80-8d46d5544d93-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:41 crc kubenswrapper[4723]: E0309 13:20:41.946429 4723 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.066s" Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.946508 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-mbn5h" event={"ID":"09188288-5f42-4c86-9f80-8d46d5544d93","Type":"ContainerDied","Data":"71c66b5881fc76db2706c91310e2b8ebfbe3d90f1ded516b6ea656f57e30c73f"} Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.946539 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-lf8bq"] Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.946557 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-h9t6r"] Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.946570 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lf8bq" event={"ID":"90dea403-5a65-4824-ac5b-5c34c828d616","Type":"ContainerStarted","Data":"daa7b630797809acdb3025cdda62219a718129183378179cea49defd1f62668b"} Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.946585 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bm79v"] Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.946597 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bnfpr" event={"ID":"68a3edf0-8e62-4c92-8d5f-c831ec5625e9","Type":"ContainerStarted","Data":"6ce8b3b784700d502e7129f60bf89aa1b2ac081f0e70dba8329148e858294149"} Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.946608 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cp5rk"] Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.946618 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-8z98d"] Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.946628 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pf8wv"] Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.946636 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wvbmm"] Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.946645 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:20:41 crc kubenswrapper[4723]: I0309 13:20:41.946661 4723 scope.go:117] "RemoveContainer" containerID="471e7e966bf9470d41726c3299a2ac966405c5f7a8df9e40b514d853e2f001b3" Mar 09 13:20:42 crc kubenswrapper[4723]: I0309 13:20:42.041217 4723 scope.go:117] "RemoveContainer" containerID="c04bc17b8acd937090b2d9a8fb2492fc51726c0872eaff74b35078e81d4ddbee" Mar 09 13:20:42 crc kubenswrapper[4723]: I0309 13:20:42.225666 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" event={"ID":"470074a0-571f-4f50-a275-bb91881d9d85","Type":"ContainerStarted","Data":"ceed71e93a0fa8a1083aafc88cae1c30292032a75ab2ba8cefc824b9fd8a852b"} Mar 09 13:20:42 crc kubenswrapper[4723]: I0309 13:20:42.225707 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" 
event={"ID":"470074a0-571f-4f50-a275-bb91881d9d85","Type":"ContainerStarted","Data":"07d007b0ac62365cb9740cc4af23198926c9d5dc1a76112d031e9f8ae04ec1a0"} Mar 09 13:20:42 crc kubenswrapper[4723]: I0309 13:20:42.228694 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e0bf458-3488-4d3a-80ac-d9cf2f655791","Type":"ContainerStarted","Data":"a211a03dcf1458ff9b9cf538906f5fcdc0b232c428b25f2b3c043bac4351bb66"} Mar 09 13:20:42 crc kubenswrapper[4723]: I0309 13:20:42.231562 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bm79v" event={"ID":"191baa15-4ac5-4e55-9f87-751eddffb83e","Type":"ContainerStarted","Data":"0bf1331a08a57c068cf7d37f0992feb47295e4c479f2b5345e8a9af85efd2918"} Mar 09 13:20:42 crc kubenswrapper[4723]: I0309 13:20:42.239582 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wvbmm" event={"ID":"052977d5-adda-4cc2-a8bc-7b4ea4e32df7","Type":"ContainerStarted","Data":"75f6aa0a0fb701015bde8e6503310a1dcd7c9366126a5368f221d3eaf277ae42"} Mar 09 13:20:42 crc kubenswrapper[4723]: I0309 13:20:42.266101 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" event={"ID":"ef9c0fa1-111e-4aed-9838-1bbda77164dd","Type":"ContainerStarted","Data":"206f9774d1aaf867947bc9606aa1ee16cab1a1b1d2de09939eeb9ded80cf0bc5"} Mar 09 13:20:42 crc kubenswrapper[4723]: I0309 13:20:42.270796 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cp5rk" event={"ID":"6832d621-3d7d-4e4a-824b-f219746aaa89","Type":"ContainerStarted","Data":"e6498d9f715423d0271b48318b5f3a38a5904893ebc37d37e506bc8e5506222c"} Mar 09 13:20:42 crc kubenswrapper[4723]: I0309 13:20:42.295121 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pf8wv" event={"ID":"8846c8f3-62f3-4053-8b48-177d011dd0c9","Type":"ContainerStarted","Data":"a163abfebaba397bab3c1ebf6d5e3a0fed057b2ea5d1ece53b4a0fd01dfb8d75"} Mar 09 13:20:42 crc kubenswrapper[4723]: I0309 13:20:42.398083 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-mbn5h"] Mar 09 13:20:42 crc kubenswrapper[4723]: I0309 13:20:42.434673 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-mbn5h"] Mar 09 13:20:42 crc kubenswrapper[4723]: E0309 13:20:42.538140 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09188288_5f42_4c86_9f80_8d46d5544d93.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09188288_5f42_4c86_9f80_8d46d5544d93.slice/crio-71c66b5881fc76db2706c91310e2b8ebfbe3d90f1ded516b6ea656f57e30c73f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod470074a0_571f_4f50_a275_bb91881d9d85.slice/crio-conmon-ceed71e93a0fa8a1083aafc88cae1c30292032a75ab2ba8cefc824b9fd8a852b.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:20:42 crc kubenswrapper[4723]: I0309 13:20:42.929514 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09188288-5f42-4c86-9f80-8d46d5544d93" path="/var/lib/kubelet/pods/09188288-5f42-4c86-9f80-8d46d5544d93/volumes" Mar 09 13:20:43 crc kubenswrapper[4723]: I0309 13:20:43.059880 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 
13:20:43 crc kubenswrapper[4723]: I0309 13:20:43.366698 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c","Type":"ContainerStarted","Data":"68fa13d470697682cd7df582f66163aff84bdb289a0a3bcf6fe9c8dcdc64d266"} Mar 09 13:20:43 crc kubenswrapper[4723]: I0309 13:20:43.366754 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e47df78-6587-4f83-a1c9-dcaf0aa9b73c","Type":"ContainerStarted","Data":"4691871a5822d7266d765e5cc1b8dea59c2f012d85ae5de37a0619afbd3b7a8c"} Mar 09 13:20:43 crc kubenswrapper[4723]: I0309 13:20:43.391801 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bnfpr" event={"ID":"68a3edf0-8e62-4c92-8d5f-c831ec5625e9","Type":"ContainerStarted","Data":"0bc52fc1eaa1ade792d3433bea77da9ac5efc6b56be9f74e7195589f92e642fa"} Mar 09 13:20:43 crc kubenswrapper[4723]: I0309 13:20:43.401982 4723 generic.go:334] "Generic (PLEG): container finished" podID="ef9c0fa1-111e-4aed-9838-1bbda77164dd" containerID="a56bf39e49b4109c4195c5bcb3a18c3627f7abcec3aacfa1275bee0acc3e64c5" exitCode=0 Mar 09 13:20:43 crc kubenswrapper[4723]: I0309 13:20:43.402109 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" event={"ID":"ef9c0fa1-111e-4aed-9838-1bbda77164dd","Type":"ContainerDied","Data":"a56bf39e49b4109c4195c5bcb3a18c3627f7abcec3aacfa1275bee0acc3e64c5"} Mar 09 13:20:43 crc kubenswrapper[4723]: I0309 13:20:43.418697 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pf8wv" event={"ID":"8846c8f3-62f3-4053-8b48-177d011dd0c9","Type":"ContainerStarted","Data":"29b115c1f8743153a2d56a56172aa3cc7d2029bb69439e6f911ef022e708ed89"} Mar 09 13:20:43 crc kubenswrapper[4723]: I0309 13:20:43.453729 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.453707302 podStartE2EDuration="17.453707302s" podCreationTimestamp="2026-03-09 13:20:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:43.438260723 +0000 UTC m=+1317.452728263" watchObservedRunningTime="2026-03-09 13:20:43.453707302 +0000 UTC m=+1317.468174842" Mar 09 13:20:43 crc kubenswrapper[4723]: I0309 13:20:43.459315 4723 generic.go:334] "Generic (PLEG): container finished" podID="470074a0-571f-4f50-a275-bb91881d9d85" containerID="ceed71e93a0fa8a1083aafc88cae1c30292032a75ab2ba8cefc824b9fd8a852b" exitCode=0 Mar 09 13:20:43 crc kubenswrapper[4723]: I0309 13:20:43.459387 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" event={"ID":"470074a0-571f-4f50-a275-bb91881d9d85","Type":"ContainerDied","Data":"ceed71e93a0fa8a1083aafc88cae1c30292032a75ab2ba8cefc824b9fd8a852b"} Mar 09 13:20:43 crc kubenswrapper[4723]: I0309 13:20:43.474277 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bnfpr" podStartSLOduration=4.474251845 podStartE2EDuration="4.474251845s" podCreationTimestamp="2026-03-09 13:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:43.458395786 +0000 UTC m=+1317.472863336" watchObservedRunningTime="2026-03-09 13:20:43.474251845 +0000 UTC m=+1317.488719385" Mar 09 13:20:43 crc kubenswrapper[4723]: 
I0309 13:20:43.495344 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-pf8wv" podStartSLOduration=4.495326263 podStartE2EDuration="4.495326263s" podCreationTimestamp="2026-03-09 13:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:43.489834708 +0000 UTC m=+1317.504302248" watchObservedRunningTime="2026-03-09 13:20:43.495326263 +0000 UTC m=+1317.509793813" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.019597 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.077667 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-ovsdbserver-sb\") pod \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.078008 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-config\") pod \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.078086 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-dns-swift-storage-0\") pod \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.078133 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hqhs\" (UniqueName: \"kubernetes.io/projected/ef9c0fa1-111e-4aed-9838-1bbda77164dd-kube-api-access-5hqhs\") pod \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.078195 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-dns-svc\") pod \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.078281 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-ovsdbserver-nb\") pod \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\" (UID: \"ef9c0fa1-111e-4aed-9838-1bbda77164dd\") " Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.096785 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef9c0fa1-111e-4aed-9838-1bbda77164dd-kube-api-access-5hqhs" (OuterVolumeSpecName: "kube-api-access-5hqhs") pod "ef9c0fa1-111e-4aed-9838-1bbda77164dd" (UID: "ef9c0fa1-111e-4aed-9838-1bbda77164dd"). InnerVolumeSpecName "kube-api-access-5hqhs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.110326 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-config" (OuterVolumeSpecName: "config") pod "ef9c0fa1-111e-4aed-9838-1bbda77164dd" (UID: "ef9c0fa1-111e-4aed-9838-1bbda77164dd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.115196 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ef9c0fa1-111e-4aed-9838-1bbda77164dd" (UID: "ef9c0fa1-111e-4aed-9838-1bbda77164dd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.119032 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef9c0fa1-111e-4aed-9838-1bbda77164dd" (UID: "ef9c0fa1-111e-4aed-9838-1bbda77164dd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.123294 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ef9c0fa1-111e-4aed-9838-1bbda77164dd" (UID: "ef9c0fa1-111e-4aed-9838-1bbda77164dd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.132809 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ef9c0fa1-111e-4aed-9838-1bbda77164dd" (UID: "ef9c0fa1-111e-4aed-9838-1bbda77164dd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.181168 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.181215 4723 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.181230 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hqhs\" (UniqueName: \"kubernetes.io/projected/ef9c0fa1-111e-4aed-9838-1bbda77164dd-kube-api-access-5hqhs\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.181242 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.181255 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.181269 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef9c0fa1-111e-4aed-9838-1bbda77164dd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.472292 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.472325 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-h9t6r" event={"ID":"ef9c0fa1-111e-4aed-9838-1bbda77164dd","Type":"ContainerDied","Data":"206f9774d1aaf867947bc9606aa1ee16cab1a1b1d2de09939eeb9ded80cf0bc5"} Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.472385 4723 scope.go:117] "RemoveContainer" containerID="a56bf39e49b4109c4195c5bcb3a18c3627f7abcec3aacfa1275bee0acc3e64c5" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.478375 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" event={"ID":"470074a0-571f-4f50-a275-bb91881d9d85","Type":"ContainerStarted","Data":"5bca35e86567f4384537db83fcdb2e32879be70033e484b4df29112dbe2ef9e6"} Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.478426 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.581295 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" podStartSLOduration=5.581273444 podStartE2EDuration="5.581273444s" podCreationTimestamp="2026-03-09 13:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:20:44.508448076 +0000 UTC m=+1318.522915626" watchObservedRunningTime="2026-03-09 13:20:44.581273444 +0000 UTC m=+1318.595740984" Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.619085 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-55fff446b9-h9t6r"] Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.638768 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-h9t6r"] Mar 09 13:20:44 crc kubenswrapper[4723]: I0309 13:20:44.900877 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef9c0fa1-111e-4aed-9838-1bbda77164dd" path="/var/lib/kubelet/pods/ef9c0fa1-111e-4aed-9838-1bbda77164dd/volumes" Mar 09 13:20:47 crc kubenswrapper[4723]: I0309 13:20:47.363570 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:47 crc kubenswrapper[4723]: I0309 13:20:47.527818 4723 generic.go:334] "Generic (PLEG): container finished" podID="68a3edf0-8e62-4c92-8d5f-c831ec5625e9" containerID="0bc52fc1eaa1ade792d3433bea77da9ac5efc6b56be9f74e7195589f92e642fa" exitCode=0 Mar 09 13:20:47 crc kubenswrapper[4723]: I0309 13:20:47.528176 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bnfpr" event={"ID":"68a3edf0-8e62-4c92-8d5f-c831ec5625e9","Type":"ContainerDied","Data":"0bc52fc1eaa1ade792d3433bea77da9ac5efc6b56be9f74e7195589f92e642fa"} Mar 09 13:20:48 crc kubenswrapper[4723]: I0309 13:20:48.117844 4723 scope.go:117] "RemoveContainer" containerID="9f0f6210f994342f289cc606c7a07d1c1fceb3bad25c49f5b874238e0e576a0b" Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.555372 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bnfpr" event={"ID":"68a3edf0-8e62-4c92-8d5f-c831ec5625e9","Type":"ContainerDied","Data":"6ce8b3b784700d502e7129f60bf89aa1b2ac081f0e70dba8329148e858294149"} Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.555767 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ce8b3b784700d502e7129f60bf89aa1b2ac081f0e70dba8329148e858294149" Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.618396 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.660684 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r9wk\" (UniqueName: \"kubernetes.io/projected/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-kube-api-access-9r9wk\") pod \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.660829 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-combined-ca-bundle\") pod \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.660900 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-scripts\") pod \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.661017 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-credential-keys\") pod \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.661103 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-fernet-keys\") pod \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.661186 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-config-data\") pod \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\" (UID: \"68a3edf0-8e62-4c92-8d5f-c831ec5625e9\") " Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.668791 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "68a3edf0-8e62-4c92-8d5f-c831ec5625e9" (UID: "68a3edf0-8e62-4c92-8d5f-c831ec5625e9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.669835 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-scripts" (OuterVolumeSpecName: "scripts") pod "68a3edf0-8e62-4c92-8d5f-c831ec5625e9" (UID: "68a3edf0-8e62-4c92-8d5f-c831ec5625e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.681952 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-kube-api-access-9r9wk" (OuterVolumeSpecName: "kube-api-access-9r9wk") pod "68a3edf0-8e62-4c92-8d5f-c831ec5625e9" (UID: "68a3edf0-8e62-4c92-8d5f-c831ec5625e9"). InnerVolumeSpecName "kube-api-access-9r9wk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.689188 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "68a3edf0-8e62-4c92-8d5f-c831ec5625e9" (UID: "68a3edf0-8e62-4c92-8d5f-c831ec5625e9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.702574 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68a3edf0-8e62-4c92-8d5f-c831ec5625e9" (UID: "68a3edf0-8e62-4c92-8d5f-c831ec5625e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.722195 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-config-data" (OuterVolumeSpecName: "config-data") pod "68a3edf0-8e62-4c92-8d5f-c831ec5625e9" (UID: "68a3edf0-8e62-4c92-8d5f-c831ec5625e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.763826 4723 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.763876 4723 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.763886 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.763896 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r9wk\" (UniqueName: \"kubernetes.io/projected/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-kube-api-access-9r9wk\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.763907 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:49 crc kubenswrapper[4723]: I0309 13:20:49.763915 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68a3edf0-8e62-4c92-8d5f-c831ec5625e9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.175045 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.241225 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2kn5v"] Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.253302 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-2kn5v" podUID="33ef3c2f-07b0-431c-96b0-b588061d9ce9" containerName="dnsmasq-dns" 
containerID="cri-o://d92d0d56a25623f0f68ce1e22eaaea9b59c6682ffe250280b7f5ab6820b0d811" gracePeriod=10 Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.565420 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bnfpr" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.760492 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bnfpr"] Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.769979 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bnfpr"] Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.838150 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ch4lf"] Mar 09 13:20:50 crc kubenswrapper[4723]: E0309 13:20:50.838920 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef9c0fa1-111e-4aed-9838-1bbda77164dd" containerName="init" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.838941 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9c0fa1-111e-4aed-9838-1bbda77164dd" containerName="init" Mar 09 13:20:50 crc kubenswrapper[4723]: E0309 13:20:50.838960 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a3edf0-8e62-4c92-8d5f-c831ec5625e9" containerName="keystone-bootstrap" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.838968 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a3edf0-8e62-4c92-8d5f-c831ec5625e9" containerName="keystone-bootstrap" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.839164 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a3edf0-8e62-4c92-8d5f-c831ec5625e9" containerName="keystone-bootstrap" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.839192 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef9c0fa1-111e-4aed-9838-1bbda77164dd" containerName="init" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.839911 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.842225 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.842501 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jxn5s" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.842656 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.842822 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.843407 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.872347 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ch4lf"] Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.907877 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-config-data\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.908004 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djb8p\" (UniqueName: \"kubernetes.io/projected/3d74c75e-9665-4723-8dca-9019bd324ccb-kube-api-access-djb8p\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.908143 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-scripts\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.908188 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-credential-keys\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.908212 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-combined-ca-bundle\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.913115 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-fernet-keys\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:50 crc kubenswrapper[4723]: I0309 13:20:50.929124 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="68a3edf0-8e62-4c92-8d5f-c831ec5625e9" path="/var/lib/kubelet/pods/68a3edf0-8e62-4c92-8d5f-c831ec5625e9/volumes" Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.015839 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-config-data\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.016067 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djb8p\" (UniqueName: \"kubernetes.io/projected/3d74c75e-9665-4723-8dca-9019bd324ccb-kube-api-access-djb8p\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.016161 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-scripts\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.016238 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-credential-keys\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.016280 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-combined-ca-bundle\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.016457 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-fernet-keys\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.023199 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-scripts\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.023401 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-fernet-keys\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.023511 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-config-data\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.023880 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-combined-ca-bundle\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.024273 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-credential-keys\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.042742 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djb8p\" (UniqueName: \"kubernetes.io/projected/3d74c75e-9665-4723-8dca-9019bd324ccb-kube-api-access-djb8p\") pod \"keystone-bootstrap-ch4lf\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.174726 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.584997 4723 generic.go:334] "Generic (PLEG): container finished" podID="33ef3c2f-07b0-431c-96b0-b588061d9ce9" containerID="d92d0d56a25623f0f68ce1e22eaaea9b59c6682ffe250280b7f5ab6820b0d811" exitCode=0 Mar 09 13:20:51 crc kubenswrapper[4723]: I0309 13:20:51.585049 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2kn5v" event={"ID":"33ef3c2f-07b0-431c-96b0-b588061d9ce9","Type":"ContainerDied","Data":"d92d0d56a25623f0f68ce1e22eaaea9b59c6682ffe250280b7f5ab6820b0d811"} Mar 09 13:20:54 crc kubenswrapper[4723]: I0309 13:20:54.620659 4723 generic.go:334] "Generic (PLEG): container finished" podID="72dc18cb-be01-4378-b62c-609a2c237731" containerID="99ed3f717a597eff2118d8c9217456fe28f4259766a3a003f976b0720a42d3e3" exitCode=0 Mar 09 13:20:54 crc kubenswrapper[4723]: I0309 13:20:54.621020 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b5gc4" event={"ID":"72dc18cb-be01-4378-b62c-609a2c237731","Type":"ContainerDied","Data":"99ed3f717a597eff2118d8c9217456fe28f4259766a3a003f976b0720a42d3e3"} Mar 09 13:20:54 crc kubenswrapper[4723]: I0309 13:20:54.901110 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-2kn5v" podUID="33ef3c2f-07b0-431c-96b0-b588061d9ce9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.152:5353: connect: connection refused" Mar 09 13:20:57 crc kubenswrapper[4723]: I0309 13:20:57.363192 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:57 crc kubenswrapper[4723]: I0309 13:20:57.370628 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:57 crc kubenswrapper[4723]: I0309 13:20:57.658753 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 09 13:20:59 crc kubenswrapper[4723]: E0309 13:20:59.075415 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Mar 09 13:20:59 crc 
kubenswrapper[4723]: E0309 13:20:59.075904 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5lvdx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-cp5rk_openstack(6832d621-3d7d-4e4a-824b-f219746aaa89): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:20:59 crc kubenswrapper[4723]: E0309 13:20:59.077038 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-cp5rk" podUID="6832d621-3d7d-4e4a-824b-f219746aaa89" Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.261232 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.434898 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-config-data\") pod \"72dc18cb-be01-4378-b62c-609a2c237731\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.434996 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hm9c\" (UniqueName: \"kubernetes.io/projected/72dc18cb-be01-4378-b62c-609a2c237731-kube-api-access-5hm9c\") pod \"72dc18cb-be01-4378-b62c-609a2c237731\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.435045 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-combined-ca-bundle\") pod \"72dc18cb-be01-4378-b62c-609a2c237731\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.435068 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-db-sync-config-data\") pod \"72dc18cb-be01-4378-b62c-609a2c237731\" (UID: \"72dc18cb-be01-4378-b62c-609a2c237731\") " Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.442229 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "72dc18cb-be01-4378-b62c-609a2c237731" (UID: "72dc18cb-be01-4378-b62c-609a2c237731"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.454883 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72dc18cb-be01-4378-b62c-609a2c237731-kube-api-access-5hm9c" (OuterVolumeSpecName: "kube-api-access-5hm9c") pod "72dc18cb-be01-4378-b62c-609a2c237731" (UID: "72dc18cb-be01-4378-b62c-609a2c237731"). InnerVolumeSpecName "kube-api-access-5hm9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.483905 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72dc18cb-be01-4378-b62c-609a2c237731" (UID: "72dc18cb-be01-4378-b62c-609a2c237731"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.500439 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-config-data" (OuterVolumeSpecName: "config-data") pod "72dc18cb-be01-4378-b62c-609a2c237731" (UID: "72dc18cb-be01-4378-b62c-609a2c237731"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.537103 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hm9c\" (UniqueName: \"kubernetes.io/projected/72dc18cb-be01-4378-b62c-609a2c237731-kube-api-access-5hm9c\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.537136 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.537146 4723 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.537156 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dc18cb-be01-4378-b62c-609a2c237731-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.670387 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b5gc4" event={"ID":"72dc18cb-be01-4378-b62c-609a2c237731","Type":"ContainerDied","Data":"0ce831fca7ae9078cd87d075655dc30a1957cba71962161555758a3a14bfc7a6"} Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.671716 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ce831fca7ae9078cd87d075655dc30a1957cba71962161555758a3a14bfc7a6" Mar 09 13:20:59 crc kubenswrapper[4723]: I0309 13:20:59.671824 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-b5gc4" Mar 09 13:20:59 crc kubenswrapper[4723]: E0309 13:20:59.683043 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-cp5rk" podUID="6832d621-3d7d-4e4a-824b-f219746aaa89" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.766697 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-2lbbd"] Mar 09 13:21:00 crc kubenswrapper[4723]: E0309 13:21:00.767180 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72dc18cb-be01-4378-b62c-609a2c237731" containerName="glance-db-sync" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.767192 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="72dc18cb-be01-4378-b62c-609a2c237731" containerName="glance-db-sync" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.767385 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="72dc18cb-be01-4378-b62c-609a2c237731" containerName="glance-db-sync" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.768521 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.796563 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-2lbbd"] Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.868139 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.868204 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-config\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.868234 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.868290 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flsr4\" (UniqueName: \"kubernetes.io/projected/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-kube-api-access-flsr4\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.868343 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.868376 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.969030 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.969106 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-config\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.969142 4723 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.969221 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flsr4\" (UniqueName: \"kubernetes.io/projected/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-kube-api-access-flsr4\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.969278 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.969310 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.970317 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.970430 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.971257 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.971405 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-config\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:00 crc kubenswrapper[4723]: I0309 13:21:00.971515 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.081459 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flsr4\" (UniqueName: 
\"kubernetes.io/projected/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-kube-api-access-flsr4\") pod \"dnsmasq-dns-8b5c85b87-2lbbd\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.097532 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.695519 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.698178 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.703031 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.703473 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ncx5z" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.703840 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.711011 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.893104 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.893874 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-config-data\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.894028 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.894131 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-scripts\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.894233 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63094be1-8d09-427e-9743-697c7bdcdb41-logs\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.894333 4723 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6vvv\" (UniqueName: \"kubernetes.io/projected/63094be1-8d09-427e-9743-697c7bdcdb41-kube-api-access-n6vvv\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.894424 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63094be1-8d09-427e-9743-697c7bdcdb41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.919445 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.921503 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.924109 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.947634 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.997349 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.997445 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-config-data\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.997533 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.997599 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-scripts\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.997681 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63094be1-8d09-427e-9743-697c7bdcdb41-logs\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.997766 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n6vvv\" (UniqueName: \"kubernetes.io/projected/63094be1-8d09-427e-9743-697c7bdcdb41-kube-api-access-n6vvv\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.997836 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63094be1-8d09-427e-9743-697c7bdcdb41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.999056 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63094be1-8d09-427e-9743-697c7bdcdb41-logs\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:01 crc kubenswrapper[4723]: I0309 13:21:01.999915 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63094be1-8d09-427e-9743-697c7bdcdb41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.003212 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-scripts\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.004493 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-config-data\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.006519 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.007450 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.007599 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c915ec56eda8abd87809fc9b9acb00675f31cc32ff5e20cf139093d267df65a0/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.020263 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6vvv\" (UniqueName: \"kubernetes.io/projected/63094be1-8d09-427e-9743-697c7bdcdb41-kube-api-access-n6vvv\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.064885 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") pod \"glance-default-external-api-0\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.102196 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-config-data\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.102263 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.102285 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-scripts\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.102363 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/981e29c2-619d-4c24-a269-d3400901d853-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.102436 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/981e29c2-619d-4c24-a269-d3400901d853-logs\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.102460 4723 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.102482 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb62l\" (UniqueName: \"kubernetes.io/projected/981e29c2-619d-4c24-a269-d3400901d853-kube-api-access-nb62l\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.207655 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/981e29c2-619d-4c24-a269-d3400901d853-logs\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.207703 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.207725 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb62l\" (UniqueName: \"kubernetes.io/projected/981e29c2-619d-4c24-a269-d3400901d853-kube-api-access-nb62l\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.207770 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-config-data\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.207800 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.207821 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-scripts\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.210400 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/981e29c2-619d-4c24-a269-d3400901d853-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: 
I0309 13:21:02.211242 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/981e29c2-619d-4c24-a269-d3400901d853-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.211998 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/981e29c2-619d-4c24-a269-d3400901d853-logs\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.218025 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-config-data\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.222561 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.266438 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb62l\" (UniqueName: \"kubernetes.io/projected/981e29c2-619d-4c24-a269-d3400901d853-kube-api-access-nb62l\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.273567 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-scripts\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.340985 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.344397 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.344453 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2988164fdbbdc0e6befebfe68058338f4fd9913a4b29356e34a132113ba27e6b/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.434024 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"glance-default-internal-api-0\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:02 crc kubenswrapper[4723]: I0309 13:21:02.553292 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 13:21:03 crc kubenswrapper[4723]: I0309 13:21:03.606490 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 13:21:03 crc kubenswrapper[4723]: I0309 13:21:03.676331 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 13:21:03 crc kubenswrapper[4723]: I0309 13:21:03.946621 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:21:03 crc kubenswrapper[4723]: I0309 13:21:03.947059 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:21:04 crc kubenswrapper[4723]: I0309 13:21:04.761293 4723 generic.go:334] "Generic (PLEG): container finished" podID="8846c8f3-62f3-4053-8b48-177d011dd0c9" containerID="29b115c1f8743153a2d56a56172aa3cc7d2029bb69439e6f911ef022e708ed89" exitCode=0 Mar 09 13:21:04 crc kubenswrapper[4723]: I0309 13:21:04.761343 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pf8wv" event={"ID":"8846c8f3-62f3-4053-8b48-177d011dd0c9","Type":"ContainerDied","Data":"29b115c1f8743153a2d56a56172aa3cc7d2029bb69439e6f911ef022e708ed89"} Mar 09 13:21:04 crc kubenswrapper[4723]: I0309 13:21:04.900714 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-2kn5v" podUID="33ef3c2f-07b0-431c-96b0-b588061d9ce9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.152:5353: i/o timeout" Mar 09 13:21:09 crc kubenswrapper[4723]: E0309 13:21:09.471314 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Mar 09 13:21:09 crc kubenswrapper[4723]: E0309 13:21:09.472013 4723 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pc6n2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-lf8bq_openstack(90dea403-5a65-4824-ac5b-5c34c828d616): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:21:09 crc kubenswrapper[4723]: E0309 13:21:09.473180 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-lf8bq" podUID="90dea403-5a65-4824-ac5b-5c34c828d616" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.572307 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.578555 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pf8wv" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.600203 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-dns-svc\") pod \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.600369 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-config\") pod \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.600511 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8846c8f3-62f3-4053-8b48-177d011dd0c9-combined-ca-bundle\") pod \"8846c8f3-62f3-4053-8b48-177d011dd0c9\" (UID: \"8846c8f3-62f3-4053-8b48-177d011dd0c9\") " Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.600555 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmcl7\" (UniqueName: \"kubernetes.io/projected/33ef3c2f-07b0-431c-96b0-b588061d9ce9-kube-api-access-qmcl7\") pod \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.611355 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ef3c2f-07b0-431c-96b0-b588061d9ce9-kube-api-access-qmcl7" (OuterVolumeSpecName: "kube-api-access-qmcl7") pod "33ef3c2f-07b0-431c-96b0-b588061d9ce9" (UID: "33ef3c2f-07b0-431c-96b0-b588061d9ce9"). InnerVolumeSpecName "kube-api-access-qmcl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.655662 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8846c8f3-62f3-4053-8b48-177d011dd0c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8846c8f3-62f3-4053-8b48-177d011dd0c9" (UID: "8846c8f3-62f3-4053-8b48-177d011dd0c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.666492 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-config" (OuterVolumeSpecName: "config") pod "33ef3c2f-07b0-431c-96b0-b588061d9ce9" (UID: "33ef3c2f-07b0-431c-96b0-b588061d9ce9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.684593 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33ef3c2f-07b0-431c-96b0-b588061d9ce9" (UID: "33ef3c2f-07b0-431c-96b0-b588061d9ce9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.702904 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4d2g\" (UniqueName: \"kubernetes.io/projected/8846c8f3-62f3-4053-8b48-177d011dd0c9-kube-api-access-k4d2g\") pod \"8846c8f3-62f3-4053-8b48-177d011dd0c9\" (UID: \"8846c8f3-62f3-4053-8b48-177d011dd0c9\") " Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.702987 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-ovsdbserver-sb\") pod \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.703041 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-ovsdbserver-nb\") pod \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\" (UID: \"33ef3c2f-07b0-431c-96b0-b588061d9ce9\") " Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.703082 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8846c8f3-62f3-4053-8b48-177d011dd0c9-config\") pod \"8846c8f3-62f3-4053-8b48-177d011dd0c9\" (UID: \"8846c8f3-62f3-4053-8b48-177d011dd0c9\") " Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.703592 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.703620 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.703634 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8846c8f3-62f3-4053-8b48-177d011dd0c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.703648 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmcl7\" (UniqueName: \"kubernetes.io/projected/33ef3c2f-07b0-431c-96b0-b588061d9ce9-kube-api-access-qmcl7\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.708957 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8846c8f3-62f3-4053-8b48-177d011dd0c9-kube-api-access-k4d2g" (OuterVolumeSpecName: "kube-api-access-k4d2g") pod "8846c8f3-62f3-4053-8b48-177d011dd0c9" (UID: "8846c8f3-62f3-4053-8b48-177d011dd0c9"). InnerVolumeSpecName "kube-api-access-k4d2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.736054 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8846c8f3-62f3-4053-8b48-177d011dd0c9-config" (OuterVolumeSpecName: "config") pod "8846c8f3-62f3-4053-8b48-177d011dd0c9" (UID: "8846c8f3-62f3-4053-8b48-177d011dd0c9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.757083 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "33ef3c2f-07b0-431c-96b0-b588061d9ce9" (UID: "33ef3c2f-07b0-431c-96b0-b588061d9ce9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.760944 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "33ef3c2f-07b0-431c-96b0-b588061d9ce9" (UID: "33ef3c2f-07b0-431c-96b0-b588061d9ce9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.805769 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4d2g\" (UniqueName: \"kubernetes.io/projected/8846c8f3-62f3-4053-8b48-177d011dd0c9-kube-api-access-k4d2g\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.805799 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.805810 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33ef3c2f-07b0-431c-96b0-b588061d9ce9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.805820 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8846c8f3-62f3-4053-8b48-177d011dd0c9-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.827461 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pf8wv" event={"ID":"8846c8f3-62f3-4053-8b48-177d011dd0c9","Type":"ContainerDied","Data":"a163abfebaba397bab3c1ebf6d5e3a0fed057b2ea5d1ece53b4a0fd01dfb8d75"} Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.827547 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a163abfebaba397bab3c1ebf6d5e3a0fed057b2ea5d1ece53b4a0fd01dfb8d75" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.827609 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pf8wv" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.833198 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2kn5v" event={"ID":"33ef3c2f-07b0-431c-96b0-b588061d9ce9","Type":"ContainerDied","Data":"832c6ccd40dfa821f0bfd7b7e072c78b48a07a5ffe3103a4644b17ae7e111d12"} Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.833499 4723 scope.go:117] "RemoveContainer" containerID="d92d0d56a25623f0f68ce1e22eaaea9b59c6682ffe250280b7f5ab6820b0d811" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.833319 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2kn5v" Mar 09 13:21:09 crc kubenswrapper[4723]: E0309 13:21:09.833979 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-lf8bq" podUID="90dea403-5a65-4824-ac5b-5c34c828d616" Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.878443 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2kn5v"] Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.887763 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2kn5v"] Mar 09 13:21:09 crc kubenswrapper[4723]: I0309 13:21:09.901674 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-2kn5v" podUID="33ef3c2f-07b0-431c-96b0-b588061d9ce9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.152:5353: i/o timeout" Mar 09 13:21:10 crc kubenswrapper[4723]: E0309 13:21:10.691941 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 09 13:21:10 crc kubenswrapper[4723]: E0309 13:21:10.693480 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vwtpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPriv
ilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-bm79v_openstack(191baa15-4ac5-4e55-9f87-751eddffb83e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:21:10 crc kubenswrapper[4723]: E0309 13:21:10.694948 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-bm79v" podUID="191baa15-4ac5-4e55-9f87-751eddffb83e" Mar 09 13:21:10 crc kubenswrapper[4723]: I0309 13:21:10.784396 4723 scope.go:117] "RemoveContainer" containerID="8356b72a18db57f66396db270ee1ac88e50cfbd0bd7e036a7db66fa6b79fd270" Mar 09 13:21:10 crc kubenswrapper[4723]: I0309 13:21:10.851513 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-2lbbd"] Mar 09 13:21:10 crc kubenswrapper[4723]: I0309 13:21:10.955060 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ef3c2f-07b0-431c-96b0-b588061d9ce9" path="/var/lib/kubelet/pods/33ef3c2f-07b0-431c-96b0-b588061d9ce9/volumes" Mar 09 13:21:10 crc kubenswrapper[4723]: I0309 13:21:10.955763 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-9pt7z"] Mar 09 13:21:10 crc kubenswrapper[4723]: E0309 13:21:10.956100 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ef3c2f-07b0-431c-96b0-b588061d9ce9" containerName="init" Mar 09 13:21:10 crc kubenswrapper[4723]: I0309 13:21:10.956116 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ef3c2f-07b0-431c-96b0-b588061d9ce9" containerName="init" Mar 09 13:21:10 crc kubenswrapper[4723]: E0309 13:21:10.956125 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ef3c2f-07b0-431c-96b0-b588061d9ce9" containerName="dnsmasq-dns" Mar 09 13:21:10 crc kubenswrapper[4723]: I0309 13:21:10.956131 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ef3c2f-07b0-431c-96b0-b588061d9ce9" containerName="dnsmasq-dns" Mar 09 13:21:10 crc kubenswrapper[4723]: E0309 13:21:10.956151 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8846c8f3-62f3-4053-8b48-177d011dd0c9" containerName="neutron-db-sync" Mar 09 13:21:10 crc kubenswrapper[4723]: I0309 13:21:10.956157 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="8846c8f3-62f3-4053-8b48-177d011dd0c9" containerName="neutron-db-sync" Mar 09 13:21:10 crc kubenswrapper[4723]: I0309 13:21:10.956335 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ef3c2f-07b0-431c-96b0-b588061d9ce9" containerName="dnsmasq-dns" Mar 09 13:21:10 crc kubenswrapper[4723]: I0309 13:21:10.956353 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="8846c8f3-62f3-4053-8b48-177d011dd0c9" containerName="neutron-db-sync" Mar 09 13:21:10 crc kubenswrapper[4723]: I0309 13:21:10.958850 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:10 crc kubenswrapper[4723]: I0309 13:21:10.959490 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-9pt7z"] Mar 09 13:21:10 crc kubenswrapper[4723]: E0309 13:21:10.965760 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-bm79v" podUID="191baa15-4ac5-4e55-9f87-751eddffb83e" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.057667 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.057731 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.057766 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.057792 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-config\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.057962 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsm8l\" (UniqueName: \"kubernetes.io/projected/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-kube-api-access-bsm8l\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.058106 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.113930 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-597569c5dd-vxwdd"] Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.116130 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.119010 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qq99h" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.119648 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.119918 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.120011 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.156398 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-597569c5dd-vxwdd"] Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.160952 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.161007 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25ssx\" (UniqueName: \"kubernetes.io/projected/de4e8079-9f44-44ce-937d-0364b3ff7a9e-kube-api-access-25ssx\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.161038 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.161067 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-config\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.161104 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-config\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.161195 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-combined-ca-bundle\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.161225 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsm8l\" (UniqueName: \"kubernetes.io/projected/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-kube-api-access-bsm8l\") pod 
\"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.161307 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-httpd-config\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.161355 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.161392 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-ovndb-tls-certs\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.161433 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.172266 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.179014 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.179138 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-config\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.184768 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.185690 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " 
pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.191466 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsm8l\" (UniqueName: \"kubernetes.io/projected/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-kube-api-access-bsm8l\") pod \"dnsmasq-dns-84b966f6c9-9pt7z\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") " pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.265259 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-combined-ca-bundle\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.265547 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-httpd-config\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.265654 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-ovndb-tls-certs\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.265771 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25ssx\" (UniqueName: \"kubernetes.io/projected/de4e8079-9f44-44ce-937d-0364b3ff7a9e-kube-api-access-25ssx\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.265884 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-config\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.283601 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-ovndb-tls-certs\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.288431 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-httpd-config\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.289481 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-config\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.310019 4723 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-combined-ca-bundle\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.314029 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.314191 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25ssx\" (UniqueName: \"kubernetes.io/projected/de4e8079-9f44-44ce-937d-0364b3ff7a9e-kube-api-access-25ssx\") pod \"neutron-597569c5dd-vxwdd\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.450667 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.966738 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-2lbbd"] Mar 09 13:21:11 crc kubenswrapper[4723]: W0309 13:21:11.971276 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd4eb3ac_c666_4d5d_93bb_37dfb1902fd9.slice/crio-42ceef1accfd9619f5d2adcb1c773860f74f82a067d9a9419e465e1c6dbcbe73 WatchSource:0}: Error finding container 42ceef1accfd9619f5d2adcb1c773860f74f82a067d9a9419e465e1c6dbcbe73: Status 404 returned error can't find the container with id 42ceef1accfd9619f5d2adcb1c773860f74f82a067d9a9419e465e1c6dbcbe73 Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.974937 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e0bf458-3488-4d3a-80ac-d9cf2f655791","Type":"ContainerStarted","Data":"38abaf8dc7e932d42d539dc5445545a95ed746821765a01b2eb815a51321876c"} Mar 09 13:21:11 crc kubenswrapper[4723]: I0309 13:21:11.977483 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wvbmm" event={"ID":"052977d5-adda-4cc2-a8bc-7b4ea4e32df7","Type":"ContainerStarted","Data":"fa271cd98fb9d718deefeb8613c719476e73d21728be167d26cb92a97b551a67"} Mar 09 13:21:12 crc kubenswrapper[4723]: I0309 13:21:12.009637 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ch4lf"] Mar 09 13:21:12 crc kubenswrapper[4723]: I0309 13:21:12.017281 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wvbmm" podStartSLOduration=3.804377787 podStartE2EDuration="33.017261686s" podCreationTimestamp="2026-03-09 13:20:39 +0000 UTC" firstStartedPulling="2026-03-09 13:20:41.423643105 +0000 UTC m=+1315.438110645" lastFinishedPulling="2026-03-09 13:21:10.636527014 +0000 UTC m=+1344.650994544" observedRunningTime="2026-03-09 13:21:12.000375309 +0000 UTC m=+1346.014842849" watchObservedRunningTime="2026-03-09 13:21:12.017261686 +0000 UTC m=+1346.031729226" Mar 09 13:21:12 crc kubenswrapper[4723]: I0309 13:21:12.092129 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 13:21:12 crc kubenswrapper[4723]: W0309 13:21:12.114929 4723 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod981e29c2_619d_4c24_a269_d3400901d853.slice/crio-fc13c4cb9f8a3b3b85321639d5eb0d59c0c1325f9545856a72954fabd945b5bb WatchSource:0}: Error finding container fc13c4cb9f8a3b3b85321639d5eb0d59c0c1325f9545856a72954fabd945b5bb: Status 404 returned error can't find the container with id fc13c4cb9f8a3b3b85321639d5eb0d59c0c1325f9545856a72954fabd945b5bb Mar 09 13:21:12 crc kubenswrapper[4723]: I0309 13:21:12.149245 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-9pt7z"] Mar 09 13:21:12 crc kubenswrapper[4723]: I0309 13:21:12.166751 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 13:21:12 crc kubenswrapper[4723]: W0309 13:21:12.206191 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63094be1_8d09_427e_9743_697c7bdcdb41.slice/crio-810895a43a9c156e4da229f4efdb374e6320d138665090a1014880933069b7e6 WatchSource:0}: Error finding container 810895a43a9c156e4da229f4efdb374e6320d138665090a1014880933069b7e6: Status 404 returned error can't find the container with id 810895a43a9c156e4da229f4efdb374e6320d138665090a1014880933069b7e6 Mar 09 13:21:12 crc kubenswrapper[4723]: I0309 13:21:12.398173 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-597569c5dd-vxwdd"] Mar 09 13:21:12 crc kubenswrapper[4723]: I0309 13:21:12.994316 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" event={"ID":"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090","Type":"ContainerStarted","Data":"72e5eca062d6a90c9c8bf015e63b9cd7aa73298d393661c97e2cfafeed457c8b"} Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.001686 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-597569c5dd-vxwdd" event={"ID":"de4e8079-9f44-44ce-937d-0364b3ff7a9e","Type":"ContainerStarted","Data":"3e48af8a56fad11cc047b1d1d435d508320f0297008ac8e5d80628bf0ed3f546"} Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.007644 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"981e29c2-619d-4c24-a269-d3400901d853","Type":"ContainerStarted","Data":"fc13c4cb9f8a3b3b85321639d5eb0d59c0c1325f9545856a72954fabd945b5bb"} Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.009070 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ch4lf" event={"ID":"3d74c75e-9665-4723-8dca-9019bd324ccb","Type":"ContainerStarted","Data":"fe3139d90fc2debdc13fe9cfc9bf20e0412a7bff134036ac39658fac91cbee73"} Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.009097 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ch4lf" event={"ID":"3d74c75e-9665-4723-8dca-9019bd324ccb","Type":"ContainerStarted","Data":"920731df8c6b4733c66acc97fd9537a062c06a347e6112cab23494bdc55f0aba"} Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.016739 4723 generic.go:334] "Generic (PLEG): container finished" podID="bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9" containerID="85c1f4f649f83a640a90cf1b1ded147aa010a539481fa93ee6bbe2b20b1b9914" exitCode=0 Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.016786 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" 
event={"ID":"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9","Type":"ContainerDied","Data":"85c1f4f649f83a640a90cf1b1ded147aa010a539481fa93ee6bbe2b20b1b9914"} Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.016809 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" event={"ID":"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9","Type":"ContainerStarted","Data":"42ceef1accfd9619f5d2adcb1c773860f74f82a067d9a9419e465e1c6dbcbe73"} Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.041773 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"63094be1-8d09-427e-9743-697c7bdcdb41","Type":"ContainerStarted","Data":"810895a43a9c156e4da229f4efdb374e6320d138665090a1014880933069b7e6"} Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.050366 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ch4lf" podStartSLOduration=23.050339637 podStartE2EDuration="23.050339637s" podCreationTimestamp="2026-03-09 13:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:13.02967734 +0000 UTC m=+1347.044144880" watchObservedRunningTime="2026-03-09 13:21:13.050339637 +0000 UTC m=+1347.064807177" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.589374 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-646c887bd9-qzqxk"] Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.592265 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.598679 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.600667 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.603530 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-646c887bd9-qzqxk"] Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.713273 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-public-tls-certs\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.713518 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-config\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.713778 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8swz8\" (UniqueName: \"kubernetes.io/projected/28297498-43e9-457c-a90d-0c3f49907491-kube-api-access-8swz8\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.713817 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-httpd-config\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.713846 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-internal-tls-certs\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.713921 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-combined-ca-bundle\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.713939 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-ovndb-tls-certs\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.815599 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-public-tls-certs\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.817027 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-config\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.817157 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8swz8\" (UniqueName: \"kubernetes.io/projected/28297498-43e9-457c-a90d-0c3f49907491-kube-api-access-8swz8\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.817206 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-httpd-config\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.817235 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-internal-tls-certs\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.817289 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-combined-ca-bundle\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.817316 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-ovndb-tls-certs\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.820490 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-public-tls-certs\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.822237 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-config\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.822581 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-combined-ca-bundle\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.823838 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-internal-tls-certs\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.823950 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-ovndb-tls-certs\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.838504 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8swz8\" (UniqueName: \"kubernetes.io/projected/28297498-43e9-457c-a90d-0c3f49907491-kube-api-access-8swz8\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.839598 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-httpd-config\") pod \"neutron-646c887bd9-qzqxk\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:13 crc kubenswrapper[4723]: I0309 13:21:13.914354 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.059990 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" event={"ID":"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9","Type":"ContainerDied","Data":"42ceef1accfd9619f5d2adcb1c773860f74f82a067d9a9419e465e1c6dbcbe73"} Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.060036 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42ceef1accfd9619f5d2adcb1c773860f74f82a067d9a9419e465e1c6dbcbe73" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.061753 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"63094be1-8d09-427e-9743-697c7bdcdb41","Type":"ContainerStarted","Data":"a21863bb613a1974d34f8d63c84a7900ee3bb34ad88ea760618219c51e51b38e"} Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.063275 4723 generic.go:334] "Generic (PLEG): container finished" podID="a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" containerID="b8a98702c49a7f1d17a5268ee941bb7a7f9ec9bfd72579ab3b3cc36afc05b4db" exitCode=0 Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.063341 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" event={"ID":"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090","Type":"ContainerDied","Data":"b8a98702c49a7f1d17a5268ee941bb7a7f9ec9bfd72579ab3b3cc36afc05b4db"} Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.094210 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cp5rk" event={"ID":"6832d621-3d7d-4e4a-824b-f219746aaa89","Type":"ContainerStarted","Data":"01625adfdb5a75e4cff0e083107fb62501001c9a5214e951c45cff2815bc8cd4"} Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.122832 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-cp5rk" podStartSLOduration=4.053633113 podStartE2EDuration="35.122814211s" podCreationTimestamp="2026-03-09 13:20:39 +0000 UTC" firstStartedPulling="2026-03-09 13:20:41.412400037 +0000 UTC m=+1315.426867577" lastFinishedPulling="2026-03-09 13:21:12.481581135 +0000 UTC m=+1346.496048675" observedRunningTime="2026-03-09 13:21:14.117296565 +0000 UTC m=+1348.131764105" watchObservedRunningTime="2026-03-09 13:21:14.122814211 +0000 UTC m=+1348.137281751" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.134665 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-597569c5dd-vxwdd" event={"ID":"de4e8079-9f44-44ce-937d-0364b3ff7a9e","Type":"ContainerStarted","Data":"21d2fe4aec92c09368a83bf195eb1d1edb63abd53ee959d548c5ef4582b18375"} Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.174990 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"981e29c2-619d-4c24-a269-d3400901d853","Type":"ContainerStarted","Data":"1cc39b209ddd02a0caf06bbe58f3dbc548cf045d46ebf85295cc4da95a08a398"} Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.216125 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.329790 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-config\") pod \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.329882 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-ovsdbserver-nb\") pod \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.329975 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-ovsdbserver-sb\") pod \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.330002 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-dns-svc\") pod \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.330069 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-dns-swift-storage-0\") pod \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.330096 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flsr4\" (UniqueName: \"kubernetes.io/projected/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-kube-api-access-flsr4\") pod \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\" (UID: \"bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9\") " Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.335973 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-kube-api-access-flsr4" (OuterVolumeSpecName: "kube-api-access-flsr4") pod "bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9" (UID: "bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9"). InnerVolumeSpecName "kube-api-access-flsr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.382011 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9" (UID: "bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.415236 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9" (UID: "bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.433243 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flsr4\" (UniqueName: \"kubernetes.io/projected/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-kube-api-access-flsr4\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.433288 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.433302 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.435437 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9" (UID: "bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.443293 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9" (UID: "bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.458142 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-config" (OuterVolumeSpecName: "config") pod "bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9" (UID: "bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.534853 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.534899 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.534909 4723 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:14 crc kubenswrapper[4723]: I0309 13:21:14.644710 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-646c887bd9-qzqxk"] Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.189175 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e0bf458-3488-4d3a-80ac-d9cf2f655791","Type":"ContainerStarted","Data":"08856bed4f33e117dd10c871cba09579935b75e1730d76f9185afe1104c1a29f"} Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.192551 4723 generic.go:334] "Generic (PLEG): container finished" podID="052977d5-adda-4cc2-a8bc-7b4ea4e32df7" containerID="fa271cd98fb9d718deefeb8613c719476e73d21728be167d26cb92a97b551a67" exitCode=0 Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.192611 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wvbmm" event={"ID":"052977d5-adda-4cc2-a8bc-7b4ea4e32df7","Type":"ContainerDied","Data":"fa271cd98fb9d718deefeb8613c719476e73d21728be167d26cb92a97b551a67"} Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.196853 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"63094be1-8d09-427e-9743-697c7bdcdb41","Type":"ContainerStarted","Data":"389eac8216d7b97c3fcf3fef592e7c340d0b9ea6b23d5ea8ba5ac23b49f2e963"} Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.197017 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="63094be1-8d09-427e-9743-697c7bdcdb41" containerName="glance-httpd" containerID="cri-o://389eac8216d7b97c3fcf3fef592e7c340d0b9ea6b23d5ea8ba5ac23b49f2e963" gracePeriod=30 Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.196959 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="63094be1-8d09-427e-9743-697c7bdcdb41" containerName="glance-log" containerID="cri-o://a21863bb613a1974d34f8d63c84a7900ee3bb34ad88ea760618219c51e51b38e" gracePeriod=30 Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.245208 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" event={"ID":"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090","Type":"ContainerStarted","Data":"cab4d22cb7da6a34a11896bec6e65184e23592a437083f0d5ff6d1fc60b117bd"} Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.245326 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.248513 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-597569c5dd-vxwdd" event={"ID":"de4e8079-9f44-44ce-937d-0364b3ff7a9e","Type":"ContainerStarted","Data":"396ebe17fa795e7aaeb9ee1d3e1c6982be72aa9650342f05d98b29e567725f39"} Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.248768 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.250303 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-646c887bd9-qzqxk" event={"ID":"28297498-43e9-457c-a90d-0c3f49907491","Type":"ContainerStarted","Data":"7a4d205903b0350689d246a7dd833d69ec9079f318670bb9c54d41aef2c07dd5"} Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.250340 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-646c887bd9-qzqxk" event={"ID":"28297498-43e9-457c-a90d-0c3f49907491","Type":"ContainerStarted","Data":"427e7ad9e57e7112049b7ea9355f3ab9e3247fee84a0f1773512064007dc24b6"} Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.258265 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"981e29c2-619d-4c24-a269-d3400901d853","Type":"ContainerStarted","Data":"f2015f017259459200c2ff89497f5616249b52a23b3dbf07804ff48cf07081c9"} Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.258302 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-2lbbd" Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.258417 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="981e29c2-619d-4c24-a269-d3400901d853" containerName="glance-log" containerID="cri-o://1cc39b209ddd02a0caf06bbe58f3dbc548cf045d46ebf85295cc4da95a08a398" gracePeriod=30 Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.258451 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="981e29c2-619d-4c24-a269-d3400901d853" containerName="glance-httpd" containerID="cri-o://f2015f017259459200c2ff89497f5616249b52a23b3dbf07804ff48cf07081c9" gracePeriod=30 Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.293066 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=15.293042852 podStartE2EDuration="15.293042852s" podCreationTimestamp="2026-03-09 13:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:15.269246782 +0000 UTC m=+1349.283714342" watchObservedRunningTime="2026-03-09 13:21:15.293042852 +0000 UTC m=+1349.307510392" Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.318824 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-597569c5dd-vxwdd" podStartSLOduration=4.318801543 podStartE2EDuration="4.318801543s" podCreationTimestamp="2026-03-09 13:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:15.290899535 +0000 UTC m=+1349.305367075" watchObservedRunningTime="2026-03-09 13:21:15.318801543 +0000 UTC m=+1349.333269083" Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.334885 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" 
podStartSLOduration=5.334840268 podStartE2EDuration="5.334840268s" podCreationTimestamp="2026-03-09 13:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:15.320689893 +0000 UTC m=+1349.335157433" watchObservedRunningTime="2026-03-09 13:21:15.334840268 +0000 UTC m=+1349.349307808" Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.345504 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=15.345481259 podStartE2EDuration="15.345481259s" podCreationTimestamp="2026-03-09 13:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:15.342123651 +0000 UTC m=+1349.356591211" watchObservedRunningTime="2026-03-09 13:21:15.345481259 +0000 UTC m=+1349.359948799" Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.464749 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-2lbbd"] Mar 09 13:21:15 crc kubenswrapper[4723]: I0309 13:21:15.475285 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-2lbbd"] Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.272053 4723 generic.go:334] "Generic (PLEG): container finished" podID="981e29c2-619d-4c24-a269-d3400901d853" containerID="f2015f017259459200c2ff89497f5616249b52a23b3dbf07804ff48cf07081c9" exitCode=0 Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.272322 4723 generic.go:334] "Generic (PLEG): container finished" podID="981e29c2-619d-4c24-a269-d3400901d853" containerID="1cc39b209ddd02a0caf06bbe58f3dbc548cf045d46ebf85295cc4da95a08a398" exitCode=143 Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.272357 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"981e29c2-619d-4c24-a269-d3400901d853","Type":"ContainerDied","Data":"f2015f017259459200c2ff89497f5616249b52a23b3dbf07804ff48cf07081c9"} Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.272384 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"981e29c2-619d-4c24-a269-d3400901d853","Type":"ContainerDied","Data":"1cc39b209ddd02a0caf06bbe58f3dbc548cf045d46ebf85295cc4da95a08a398"} Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.275323 4723 generic.go:334] "Generic (PLEG): container finished" podID="63094be1-8d09-427e-9743-697c7bdcdb41" containerID="389eac8216d7b97c3fcf3fef592e7c340d0b9ea6b23d5ea8ba5ac23b49f2e963" exitCode=0 Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.275340 4723 generic.go:334] "Generic (PLEG): container finished" podID="63094be1-8d09-427e-9743-697c7bdcdb41" containerID="a21863bb613a1974d34f8d63c84a7900ee3bb34ad88ea760618219c51e51b38e" exitCode=143 Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.275368 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"63094be1-8d09-427e-9743-697c7bdcdb41","Type":"ContainerDied","Data":"389eac8216d7b97c3fcf3fef592e7c340d0b9ea6b23d5ea8ba5ac23b49f2e963"} Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.275385 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"63094be1-8d09-427e-9743-697c7bdcdb41","Type":"ContainerDied","Data":"a21863bb613a1974d34f8d63c84a7900ee3bb34ad88ea760618219c51e51b38e"} Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.278179 4723 generic.go:334] "Generic (PLEG): container finished" podID="6832d621-3d7d-4e4a-824b-f219746aaa89" containerID="01625adfdb5a75e4cff0e083107fb62501001c9a5214e951c45cff2815bc8cd4" exitCode=0 Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.278214 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cp5rk" event={"ID":"6832d621-3d7d-4e4a-824b-f219746aaa89","Type":"ContainerDied","Data":"01625adfdb5a75e4cff0e083107fb62501001c9a5214e951c45cff2815bc8cd4"} Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.325151 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-646c887bd9-qzqxk" event={"ID":"28297498-43e9-457c-a90d-0c3f49907491","Type":"ContainerStarted","Data":"fc77008e1f53a9fd9fd0fee21c759ad16773862210f0ceaae4c1cad04c78e7c6"} Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.381748 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-646c887bd9-qzqxk" podStartSLOduration=3.381725055 podStartE2EDuration="3.381725055s" podCreationTimestamp="2026-03-09 13:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:16.347119189 +0000 UTC m=+1350.361586729" watchObservedRunningTime="2026-03-09 13:21:16.381725055 +0000 UTC m=+1350.396192595" Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.860378 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wvbmm" Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.889943 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57c8r\" (UniqueName: \"kubernetes.io/projected/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-kube-api-access-57c8r\") pod \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\" (UID: \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\") " Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.890316 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-combined-ca-bundle\") pod \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\" (UID: \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\") " Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.890348 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-db-sync-config-data\") pod \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\" (UID: \"052977d5-adda-4cc2-a8bc-7b4ea4e32df7\") " Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.895418 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "052977d5-adda-4cc2-a8bc-7b4ea4e32df7" (UID: "052977d5-adda-4cc2-a8bc-7b4ea4e32df7"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.895554 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-kube-api-access-57c8r" (OuterVolumeSpecName: "kube-api-access-57c8r") pod "052977d5-adda-4cc2-a8bc-7b4ea4e32df7" (UID: "052977d5-adda-4cc2-a8bc-7b4ea4e32df7"). InnerVolumeSpecName "kube-api-access-57c8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.918890 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9" path="/var/lib/kubelet/pods/bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9/volumes" Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.942238 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "052977d5-adda-4cc2-a8bc-7b4ea4e32df7" (UID: "052977d5-adda-4cc2-a8bc-7b4ea4e32df7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.992668 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.992704 4723 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:16 crc kubenswrapper[4723]: I0309 13:21:16.992714 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57c8r\" (UniqueName: \"kubernetes.io/projected/052977d5-adda-4cc2-a8bc-7b4ea4e32df7-kube-api-access-57c8r\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.066789 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.095052 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"981e29c2-619d-4c24-a269-d3400901d853\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.095229 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/981e29c2-619d-4c24-a269-d3400901d853-logs\") pod \"981e29c2-619d-4c24-a269-d3400901d853\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.095310 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb62l\" (UniqueName: \"kubernetes.io/projected/981e29c2-619d-4c24-a269-d3400901d853-kube-api-access-nb62l\") pod \"981e29c2-619d-4c24-a269-d3400901d853\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.095345 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-config-data\") pod \"981e29c2-619d-4c24-a269-d3400901d853\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.098956 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/981e29c2-619d-4c24-a269-d3400901d853-logs" (OuterVolumeSpecName: "logs") pod "981e29c2-619d-4c24-a269-d3400901d853" (UID: "981e29c2-619d-4c24-a269-d3400901d853"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.100058 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/981e29c2-619d-4c24-a269-d3400901d853-kube-api-access-nb62l" (OuterVolumeSpecName: "kube-api-access-nb62l") pod "981e29c2-619d-4c24-a269-d3400901d853" (UID: "981e29c2-619d-4c24-a269-d3400901d853"). InnerVolumeSpecName "kube-api-access-nb62l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.119765 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77" (OuterVolumeSpecName: "glance") pod "981e29c2-619d-4c24-a269-d3400901d853" (UID: "981e29c2-619d-4c24-a269-d3400901d853"). InnerVolumeSpecName "pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.160273 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-config-data" (OuterVolumeSpecName: "config-data") pod "981e29c2-619d-4c24-a269-d3400901d853" (UID: "981e29c2-619d-4c24-a269-d3400901d853"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.197907 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-combined-ca-bundle\") pod \"981e29c2-619d-4c24-a269-d3400901d853\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.198324 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-scripts\") pod \"981e29c2-619d-4c24-a269-d3400901d853\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.198374 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/981e29c2-619d-4c24-a269-d3400901d853-httpd-run\") pod \"981e29c2-619d-4c24-a269-d3400901d853\" (UID: \"981e29c2-619d-4c24-a269-d3400901d853\") " Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.199780 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/981e29c2-619d-4c24-a269-d3400901d853-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "981e29c2-619d-4c24-a269-d3400901d853" (UID: "981e29c2-619d-4c24-a269-d3400901d853"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.200351 4723 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") on node \"crc\" " Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.200401 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/981e29c2-619d-4c24-a269-d3400901d853-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.200416 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb62l\" (UniqueName: \"kubernetes.io/projected/981e29c2-619d-4c24-a269-d3400901d853-kube-api-access-nb62l\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.200429 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.200439 4723 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/981e29c2-619d-4c24-a269-d3400901d853-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.202475 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-scripts" (OuterVolumeSpecName: "scripts") pod "981e29c2-619d-4c24-a269-d3400901d853" (UID: "981e29c2-619d-4c24-a269-d3400901d853"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.233702 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "981e29c2-619d-4c24-a269-d3400901d853" (UID: "981e29c2-619d-4c24-a269-d3400901d853"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.246963 4723 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.247141 4723 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77") on node "crc" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.304760 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.304794 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981e29c2-619d-4c24-a269-d3400901d853-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.304805 4723 reconciler_common.go:293] "Volume detached for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.412139 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"981e29c2-619d-4c24-a269-d3400901d853","Type":"ContainerDied","Data":"fc13c4cb9f8a3b3b85321639d5eb0d59c0c1325f9545856a72954fabd945b5bb"} Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.412201 4723 scope.go:117] "RemoveContainer" containerID="f2015f017259459200c2ff89497f5616249b52a23b3dbf07804ff48cf07081c9" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.412356 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.432418 4723 generic.go:334] "Generic (PLEG): container finished" podID="3d74c75e-9665-4723-8dca-9019bd324ccb" containerID="fe3139d90fc2debdc13fe9cfc9bf20e0412a7bff134036ac39658fac91cbee73" exitCode=0 Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.432501 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ch4lf" event={"ID":"3d74c75e-9665-4723-8dca-9019bd324ccb","Type":"ContainerDied","Data":"fe3139d90fc2debdc13fe9cfc9bf20e0412a7bff134036ac39658fac91cbee73"} Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.478262 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wvbmm" event={"ID":"052977d5-adda-4cc2-a8bc-7b4ea4e32df7","Type":"ContainerDied","Data":"75f6aa0a0fb701015bde8e6503310a1dcd7c9366126a5368f221d3eaf277ae42"} Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.478305 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75f6aa0a0fb701015bde8e6503310a1dcd7c9366126a5368f221d3eaf277ae42" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.478498 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wvbmm" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.479275 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.485926 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-69bb477875-ngpk2"] Mar 09 13:21:17 crc kubenswrapper[4723]: E0309 13:21:17.486433 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9" containerName="init" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.486448 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9" containerName="init" Mar 09 13:21:17 crc kubenswrapper[4723]: E0309 13:21:17.486460 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="981e29c2-619d-4c24-a269-d3400901d853" containerName="glance-log" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.486466 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="981e29c2-619d-4c24-a269-d3400901d853" containerName="glance-log" Mar 09 13:21:17 crc kubenswrapper[4723]: E0309 13:21:17.486493 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="981e29c2-619d-4c24-a269-d3400901d853" containerName="glance-httpd" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.486499 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="981e29c2-619d-4c24-a269-d3400901d853" containerName="glance-httpd" Mar 09 13:21:17 crc kubenswrapper[4723]: E0309 13:21:17.486517 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="052977d5-adda-4cc2-a8bc-7b4ea4e32df7" containerName="barbican-db-sync" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.486522 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="052977d5-adda-4cc2-a8bc-7b4ea4e32df7" containerName="barbican-db-sync" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.486771 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd4eb3ac-c666-4d5d-93bb-37dfb1902fd9" containerName="init" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.486785 4723 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="981e29c2-619d-4c24-a269-d3400901d853" containerName="glance-log" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.486810 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="981e29c2-619d-4c24-a269-d3400901d853" containerName="glance-httpd" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.486826 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="052977d5-adda-4cc2-a8bc-7b4ea4e32df7" containerName="barbican-db-sync" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.488032 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.497389 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-q6xhf" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.497759 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.497960 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.526920 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69bb477875-ngpk2"] Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.550340 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.560238 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.576294 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-767f59545d-j4rfp"] Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.589162 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.600004 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.600149 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-767f59545d-j4rfp"] Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.614473 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-config-data\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.614537 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-combined-ca-bundle\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.614657 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-logs\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.614706 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfkwx\" (UniqueName: \"kubernetes.io/projected/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-kube-api-access-vfkwx\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.614743 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-config-data-custom\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.631478 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.637115 4723 scope.go:117] "RemoveContainer" containerID="1cc39b209ddd02a0caf06bbe58f3dbc548cf045d46ebf85295cc4da95a08a398" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.642375 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-9pt7z"] Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.642755 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" podUID="a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" containerName="dnsmasq-dns" containerID="cri-o://cab4d22cb7da6a34a11896bec6e65184e23592a437083f0d5ff6d1fc60b117bd" gracePeriod=10 Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.643038 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.663355 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.664206 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.719068 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfkwx\" (UniqueName: \"kubernetes.io/projected/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-kube-api-access-vfkwx\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.719402 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-config-data-custom\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.719477 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-config-data-custom\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.719533 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9e10194-4646-4afe-8353-146441b874ad-logs\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.719703 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-config-data\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.720133 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-combined-ca-bundle\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.720166 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-config-data\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.720263 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-combined-ca-bundle\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.720306 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-logs\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.720353 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmmlc\" (UniqueName: \"kubernetes.io/projected/d9e10194-4646-4afe-8353-146441b874ad-kube-api-access-bmmlc\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.727957 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-logs\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.774957 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-config-data-custom\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.776203 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-config-data\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.776826 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-combined-ca-bundle\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.836950 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.838030 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfkwx\" (UniqueName: \"kubernetes.io/projected/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-kube-api-access-vfkwx\") pod \"barbican-worker-69bb477875-ngpk2\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.841059 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " 
pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.841167 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.841208 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.841242 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-config-data\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.841321 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-combined-ca-bundle\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.841387 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmmlc\" (UniqueName: \"kubernetes.io/projected/d9e10194-4646-4afe-8353-146441b874ad-kube-api-access-bmmlc\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.841411 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.841517 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-config-data-custom\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.841578 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9e10194-4646-4afe-8353-146441b874ad-logs\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.841608 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jf57h\" (UniqueName: \"kubernetes.io/projected/2e90760e-0ff0-4195-8bb9-d32fe674feb5-kube-api-access-jf57h\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.841633 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e90760e-0ff0-4195-8bb9-d32fe674feb5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.841681 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e90760e-0ff0-4195-8bb9-d32fe674feb5-logs\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.841724 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.849984 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9e10194-4646-4afe-8353-146441b874ad-logs\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.867272 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.878642 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-combined-ca-bundle\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.890632 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmmlc\" (UniqueName: \"kubernetes.io/projected/d9e10194-4646-4afe-8353-146441b874ad-kube-api-access-bmmlc\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.895093 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-config-data\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.895612 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-config-data-custom\") pod \"barbican-keystone-listener-767f59545d-j4rfp\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.976975 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-9jjw9"] Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.982157 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.982398 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf57h\" (UniqueName: \"kubernetes.io/projected/2e90760e-0ff0-4195-8bb9-d32fe674feb5-kube-api-access-jf57h\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.982429 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e90760e-0ff0-4195-8bb9-d32fe674feb5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.982479 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e90760e-0ff0-4195-8bb9-d32fe674feb5-logs\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.982540 4723 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.982619 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.982729 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.982785 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.983211 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:17 crc kubenswrapper[4723]: I0309 13:21:17.986135 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9" Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.001412 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e90760e-0ff0-4195-8bb9-d32fe674feb5-logs\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.004280 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e90760e-0ff0-4195-8bb9-d32fe674feb5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.011668 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.012685 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.025970 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.026339 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.036730 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf57h\" (UniqueName: \"kubernetes.io/projected/2e90760e-0ff0-4195-8bb9-d32fe674feb5-kube-api-access-jf57h\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.049215 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.049255 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2988164fdbbdc0e6befebfe68058338f4fd9913a4b29356e34a132113ba27e6b/globalmount\"" pod="openstack/glance-default-internal-api-0"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.085606 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.085724 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prqmc\" (UniqueName: \"kubernetes.io/projected/4313196c-6a31-4615-9ffe-329aed2bfef4-kube-api-access-prqmc\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.085788 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.085827 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.090056 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.090715 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-config\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.156958 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-9jjw9"]
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.171714 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6f68f59b98-9mh25"]
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.173613 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.175225 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.182653 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"glance-default-internal-api-0\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " pod="openstack/glance-default-internal-api-0"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.195814 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.197968 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.200878 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-config\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.201054 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-config-data\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.201242 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.201320 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prqmc\" (UniqueName: \"kubernetes.io/projected/4313196c-6a31-4615-9ffe-329aed2bfef4-kube-api-access-prqmc\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.201476 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-config-data-custom\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.201800 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-combined-ca-bundle\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.201988 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.202195 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.202792 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4fh9\" (UniqueName: \"kubernetes.io/projected/a4575d39-0084-4f11-980c-6187b318d7fa-kube-api-access-x4fh9\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.202975 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4575d39-0084-4f11-980c-6187b318d7fa-logs\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.204060 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-config\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.204630 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.204937 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.205941 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.213524 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f68f59b98-9mh25"]
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.247503 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prqmc\" (UniqueName: \"kubernetes.io/projected/4313196c-6a31-4615-9ffe-329aed2bfef4-kube-api-access-prqmc\") pod \"dnsmasq-dns-75c8ddd69c-9jjw9\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.279853 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.293368 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.307246 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4fh9\" (UniqueName: \"kubernetes.io/projected/a4575d39-0084-4f11-980c-6187b318d7fa-kube-api-access-x4fh9\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.307297 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4575d39-0084-4f11-980c-6187b318d7fa-logs\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.307408 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-config-data\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.307506 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-config-data-custom\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.307523 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-combined-ca-bundle\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.309181 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4575d39-0084-4f11-980c-6187b318d7fa-logs\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.312251 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-config-data\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.312641 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-combined-ca-bundle\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.312746 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-config-data-custom\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.335434 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4fh9\" (UniqueName: \"kubernetes.io/projected/a4575d39-0084-4f11-980c-6187b318d7fa-kube-api-access-x4fh9\") pod \"barbican-api-6f68f59b98-9mh25\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.382371 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.409363 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-combined-ca-bundle\") pod \"63094be1-8d09-427e-9743-697c7bdcdb41\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") "
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.409588 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63094be1-8d09-427e-9743-697c7bdcdb41-httpd-run\") pod \"63094be1-8d09-427e-9743-697c7bdcdb41\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") "
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.409699 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-config-data\") pod \"63094be1-8d09-427e-9743-697c7bdcdb41\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") "
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.409972 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") pod \"63094be1-8d09-427e-9743-697c7bdcdb41\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") "
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.410104 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63094be1-8d09-427e-9743-697c7bdcdb41-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "63094be1-8d09-427e-9743-697c7bdcdb41" (UID: "63094be1-8d09-427e-9743-697c7bdcdb41"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.412186 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-scripts\") pod \"63094be1-8d09-427e-9743-697c7bdcdb41\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") "
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.412323 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63094be1-8d09-427e-9743-697c7bdcdb41-logs\") pod \"63094be1-8d09-427e-9743-697c7bdcdb41\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") "
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.412495 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6vvv\" (UniqueName: \"kubernetes.io/projected/63094be1-8d09-427e-9743-697c7bdcdb41-kube-api-access-n6vvv\") pod \"63094be1-8d09-427e-9743-697c7bdcdb41\" (UID: \"63094be1-8d09-427e-9743-697c7bdcdb41\") "
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.416007 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-scripts" (OuterVolumeSpecName: "scripts") pod "63094be1-8d09-427e-9743-697c7bdcdb41" (UID: "63094be1-8d09-427e-9743-697c7bdcdb41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.416326 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63094be1-8d09-427e-9743-697c7bdcdb41-logs" (OuterVolumeSpecName: "logs") pod "63094be1-8d09-427e-9743-697c7bdcdb41" (UID: "63094be1-8d09-427e-9743-697c7bdcdb41"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.421762 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63094be1-8d09-427e-9743-697c7bdcdb41-kube-api-access-n6vvv" (OuterVolumeSpecName: "kube-api-access-n6vvv") pod "63094be1-8d09-427e-9743-697c7bdcdb41" (UID: "63094be1-8d09-427e-9743-697c7bdcdb41"). InnerVolumeSpecName "kube-api-access-n6vvv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.423431 4723 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63094be1-8d09-427e-9743-697c7bdcdb41-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.423496 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.423515 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63094be1-8d09-427e-9743-697c7bdcdb41-logs\") on node \"crc\" DevicePath \"\""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.423527 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6vvv\" (UniqueName: \"kubernetes.io/projected/63094be1-8d09-427e-9743-697c7bdcdb41-kube-api-access-n6vvv\") on node \"crc\" DevicePath \"\""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.435664 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440" (OuterVolumeSpecName: "glance") pod "63094be1-8d09-427e-9743-697c7bdcdb41" (UID: "63094be1-8d09-427e-9743-697c7bdcdb41"). InnerVolumeSpecName "pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.451382 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63094be1-8d09-427e-9743-697c7bdcdb41" (UID: "63094be1-8d09-427e-9743-697c7bdcdb41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.523113 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.523360 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"63094be1-8d09-427e-9743-697c7bdcdb41","Type":"ContainerDied","Data":"810895a43a9c156e4da229f4efdb374e6320d138665090a1014880933069b7e6"}
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.524063 4723 scope.go:117] "RemoveContainer" containerID="389eac8216d7b97c3fcf3fef592e7c340d0b9ea6b23d5ea8ba5ac23b49f2e963"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.538807 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-config-data" (OuterVolumeSpecName: "config-data") pod "63094be1-8d09-427e-9743-697c7bdcdb41" (UID: "63094be1-8d09-427e-9743-697c7bdcdb41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.548489 4723 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") on node \"crc\" "
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.548520 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.548531 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63094be1-8d09-427e-9743-697c7bdcdb41-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.551950 4723 generic.go:334] "Generic (PLEG): container finished" podID="a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" containerID="cab4d22cb7da6a34a11896bec6e65184e23592a437083f0d5ff6d1fc60b117bd" exitCode=0
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.552048 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" event={"ID":"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090","Type":"ContainerDied","Data":"cab4d22cb7da6a34a11896bec6e65184e23592a437083f0d5ff6d1fc60b117bd"}
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.597805 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f68f59b98-9mh25"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.618199 4723 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.618367 4723 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440") on node "crc"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.636646 4723 scope.go:117] "RemoveContainer" containerID="a21863bb613a1974d34f8d63c84a7900ee3bb34ad88ea760618219c51e51b38e"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.656690 4723 reconciler_common.go:293] "Volume detached for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") on node \"crc\" DevicePath \"\""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.691888 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cp5rk"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.757687 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6832d621-3d7d-4e4a-824b-f219746aaa89-logs\") pod \"6832d621-3d7d-4e4a-824b-f219746aaa89\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") "
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.758053 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-scripts\") pod \"6832d621-3d7d-4e4a-824b-f219746aaa89\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") "
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.758183 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-config-data\") pod \"6832d621-3d7d-4e4a-824b-f219746aaa89\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") "
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.758330 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lvdx\" (UniqueName: \"kubernetes.io/projected/6832d621-3d7d-4e4a-824b-f219746aaa89-kube-api-access-5lvdx\") pod \"6832d621-3d7d-4e4a-824b-f219746aaa89\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") "
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.758365 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-combined-ca-bundle\") pod \"6832d621-3d7d-4e4a-824b-f219746aaa89\" (UID: \"6832d621-3d7d-4e4a-824b-f219746aaa89\") "
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.761090 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6832d621-3d7d-4e4a-824b-f219746aaa89-logs" (OuterVolumeSpecName: "logs") pod "6832d621-3d7d-4e4a-824b-f219746aaa89" (UID: "6832d621-3d7d-4e4a-824b-f219746aaa89"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.764735 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-scripts" (OuterVolumeSpecName: "scripts") pod "6832d621-3d7d-4e4a-824b-f219746aaa89" (UID: "6832d621-3d7d-4e4a-824b-f219746aaa89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.774991 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6832d621-3d7d-4e4a-824b-f219746aaa89-kube-api-access-5lvdx" (OuterVolumeSpecName: "kube-api-access-5lvdx") pod "6832d621-3d7d-4e4a-824b-f219746aaa89" (UID: "6832d621-3d7d-4e4a-824b-f219746aaa89"). InnerVolumeSpecName "kube-api-access-5lvdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.803296 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-config-data" (OuterVolumeSpecName: "config-data") pod "6832d621-3d7d-4e4a-824b-f219746aaa89" (UID: "6832d621-3d7d-4e4a-824b-f219746aaa89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.845120 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6832d621-3d7d-4e4a-824b-f219746aaa89" (UID: "6832d621-3d7d-4e4a-824b-f219746aaa89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.865532 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lvdx\" (UniqueName: \"kubernetes.io/projected/6832d621-3d7d-4e4a-824b-f219746aaa89-kube-api-access-5lvdx\") on node \"crc\" DevicePath \"\""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.865565 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.865575 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6832d621-3d7d-4e4a-824b-f219746aaa89-logs\") on node \"crc\" DevicePath \"\""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.865584 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.865593 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6832d621-3d7d-4e4a-824b-f219746aaa89-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.952320 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="981e29c2-619d-4c24-a269-d3400901d853" path="/var/lib/kubelet/pods/981e29c2-619d-4c24-a269-d3400901d853/volumes"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.953936 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.958044 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.958097 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.958116 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 13:21:18 crc kubenswrapper[4723]: E0309 13:21:18.958627 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6832d621-3d7d-4e4a-824b-f219746aaa89" containerName="placement-db-sync"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.958647 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="6832d621-3d7d-4e4a-824b-f219746aaa89" containerName="placement-db-sync"
Mar 09 13:21:18 crc kubenswrapper[4723]: E0309 13:21:18.958667 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63094be1-8d09-427e-9743-697c7bdcdb41" containerName="glance-log"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.958676 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="63094be1-8d09-427e-9743-697c7bdcdb41" containerName="glance-log"
Mar 09 13:21:18 crc kubenswrapper[4723]: E0309 13:21:18.958699 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63094be1-8d09-427e-9743-697c7bdcdb41" containerName="glance-httpd"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.958704 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="63094be1-8d09-427e-9743-697c7bdcdb41" containerName="glance-httpd"
Mar 09 13:21:18 crc kubenswrapper[4723]: E0309 13:21:18.958712 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" containerName="init"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.958720 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" containerName="init"
Mar 09 13:21:18 crc kubenswrapper[4723]: E0309 13:21:18.958743 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" containerName="dnsmasq-dns"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.958749 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" containerName="dnsmasq-dns"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.958951 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="63094be1-8d09-427e-9743-697c7bdcdb41" containerName="glance-log"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.958972 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="6832d621-3d7d-4e4a-824b-f219746aaa89" containerName="placement-db-sync"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.958986 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" containerName="dnsmasq-dns"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.958992 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="63094be1-8d09-427e-9743-697c7bdcdb41" containerName="glance-httpd"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.971662 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.980113 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.985006 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 09 13:21:18 crc kubenswrapper[4723]: I0309 13:21:18.987365 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.076349 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-dns-swift-storage-0\") pod \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") "
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.076399 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-config\") pod \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") "
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.077164 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsm8l\" (UniqueName: \"kubernetes.io/projected/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-kube-api-access-bsm8l\") pod \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") "
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.077298 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-ovsdbserver-sb\") pod \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") "
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.077326 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-ovsdbserver-nb\") pod \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") "
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.077436 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-dns-svc\") pod \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\" (UID: \"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090\") "
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.077721 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-scripts\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.077762 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4wkw\" (UniqueName: \"kubernetes.io/projected/2f94cb12-90f7-4a5a-9da4-6520946b46be-kube-api-access-m4wkw\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.077780 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f94cb12-90f7-4a5a-9da4-6520946b46be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.077800 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.077830 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.077878 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-config-data\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.077929 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f94cb12-90f7-4a5a-9da4-6520946b46be-logs\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.078007 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.104703 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-kube-api-access-bsm8l" (OuterVolumeSpecName: "kube-api-access-bsm8l") pod "a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" (UID: "a2df24d1-d5eb-4680-aeb5-e8c2b6a93090"). InnerVolumeSpecName "kube-api-access-bsm8l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.110697 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69bb477875-ngpk2"]
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.125511 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-767f59545d-j4rfp"]
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.192154 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.192263 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-scripts\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.192309 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4wkw\" (UniqueName: \"kubernetes.io/projected/2f94cb12-90f7-4a5a-9da4-6520946b46be-kube-api-access-m4wkw\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.192329 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f94cb12-90f7-4a5a-9da4-6520946b46be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.192358 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.192402 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.192463 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-config-data\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.192572 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f94cb12-90f7-4a5a-9da4-6520946b46be-logs\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0"
Mar 09 13:21:19
crc kubenswrapper[4723]: I0309 13:21:19.192702 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsm8l\" (UniqueName: \"kubernetes.io/projected/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-kube-api-access-bsm8l\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.198602 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" (UID: "a2df24d1-d5eb-4680-aeb5-e8c2b6a93090"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.199085 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f94cb12-90f7-4a5a-9da4-6520946b46be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.199460 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-scripts\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.208808 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f94cb12-90f7-4a5a-9da4-6520946b46be-logs\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.209649 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.214081 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-config-data\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.229181 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.229242 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c915ec56eda8abd87809fc9b9acb00675f31cc32ff5e20cf139093d267df65a0/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.234388 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.238568 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4wkw\" (UniqueName: \"kubernetes.io/projected/2f94cb12-90f7-4a5a-9da4-6520946b46be-kube-api-access-m4wkw\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.239137 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" (UID: "a2df24d1-d5eb-4680-aeb5-e8c2b6a93090"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.296457 4723 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.296491 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.314386 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") pod \"glance-default-external-api-0\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") " pod="openstack/glance-default-external-api-0" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.341580 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-config" (OuterVolumeSpecName: "config") pod "a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" (UID: "a2df24d1-d5eb-4680-aeb5-e8c2b6a93090"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.350823 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" (UID: "a2df24d1-d5eb-4680-aeb5-e8c2b6a93090"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.361587 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.397972 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-9jjw9"] Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.409448 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.409489 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.425480 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" (UID: "a2df24d1-d5eb-4680-aeb5-e8c2b6a93090"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.511309 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.571550 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cp5rk" event={"ID":"6832d621-3d7d-4e4a-824b-f219746aaa89","Type":"ContainerDied","Data":"e6498d9f715423d0271b48318b5f3a38a5904893ebc37d37e506bc8e5506222c"} Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.571598 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6498d9f715423d0271b48318b5f3a38a5904893ebc37d37e506bc8e5506222c" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.571599 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cp5rk" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.578514 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" event={"ID":"a2df24d1-d5eb-4680-aeb5-e8c2b6a93090","Type":"ContainerDied","Data":"72e5eca062d6a90c9c8bf015e63b9cd7aa73298d393661c97e2cfafeed457c8b"} Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.578558 4723 scope.go:117] "RemoveContainer" containerID="cab4d22cb7da6a34a11896bec6e65184e23592a437083f0d5ff6d1fc60b117bd" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.578558 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-9pt7z" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.617363 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.667937 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-9pt7z"] Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.676398 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-9pt7z"] Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.751318 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f68f59b98-9mh25"] Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.839427 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-677d745ffb-ng6tr"] Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.841353 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.843359 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.843416 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-s28v7" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.843454 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.851085 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.851707 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.854956 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-677d745ffb-ng6tr"] Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.920073 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-public-tls-certs\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.920131 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-config-data\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.920209 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-scripts\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.921460 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-combined-ca-bundle\") pod 
\"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.921807 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed96f382-04dd-41ec-b370-832266d07122-logs\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.921945 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-internal-tls-certs\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:19 crc kubenswrapper[4723]: I0309 13:21:19.921994 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb5cd\" (UniqueName: \"kubernetes.io/projected/ed96f382-04dd-41ec-b370-832266d07122-kube-api-access-tb5cd\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.026043 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-combined-ca-bundle\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.026215 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed96f382-04dd-41ec-b370-832266d07122-logs\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.026356 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-internal-tls-certs\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.026410 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb5cd\" (UniqueName: \"kubernetes.io/projected/ed96f382-04dd-41ec-b370-832266d07122-kube-api-access-tb5cd\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.026475 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-public-tls-certs\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.026507 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-config-data\") pod \"placement-677d745ffb-ng6tr\" (UID: 
\"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.026531 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-scripts\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.026808 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed96f382-04dd-41ec-b370-832266d07122-logs\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.031511 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-scripts\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.031784 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-config-data\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.032460 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-public-tls-certs\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.032941 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-internal-tls-certs\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.034688 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-combined-ca-bundle\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.045357 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb5cd\" (UniqueName: \"kubernetes.io/projected/ed96f382-04dd-41ec-b370-832266d07122-kube-api-access-tb5cd\") pod \"placement-677d745ffb-ng6tr\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.165724 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.903918 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63094be1-8d09-427e-9743-697c7bdcdb41" path="/var/lib/kubelet/pods/63094be1-8d09-427e-9743-697c7bdcdb41/volumes" Mar 09 13:21:20 crc kubenswrapper[4723]: I0309 13:21:20.904840 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2df24d1-d5eb-4680-aeb5-e8c2b6a93090" path="/var/lib/kubelet/pods/a2df24d1-d5eb-4680-aeb5-e8c2b6a93090/volumes" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.253067 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d7b4c9946-ljf67"] Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.255139 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.257810 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.258529 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.268921 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d7b4c9946-ljf67"] Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.363061 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/591e7541-2095-490f-9787-d4551a2e4f9d-logs\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.363115 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-config-data-custom\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.363144 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-combined-ca-bundle\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.363339 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-config-data\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.363677 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-public-tls-certs\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.363721 4723 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf7wn\" (UniqueName: \"kubernetes.io/projected/591e7541-2095-490f-9787-d4551a2e4f9d-kube-api-access-sf7wn\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.363774 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-internal-tls-certs\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.466123 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-public-tls-certs\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.466181 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf7wn\" (UniqueName: \"kubernetes.io/projected/591e7541-2095-490f-9787-d4551a2e4f9d-kube-api-access-sf7wn\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.466215 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-internal-tls-certs\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.466325 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/591e7541-2095-490f-9787-d4551a2e4f9d-logs\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.466364 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-config-data-custom\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.466391 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-combined-ca-bundle\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.466432 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-config-data\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.466919 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/591e7541-2095-490f-9787-d4551a2e4f9d-logs\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.471205 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-config-data-custom\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.472627 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-combined-ca-bundle\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.473228 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-config-data\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.473288 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-internal-tls-certs\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.477304 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-public-tls-certs\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.485763 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf7wn\" (UniqueName: \"kubernetes.io/projected/591e7541-2095-490f-9787-d4551a2e4f9d-kube-api-access-sf7wn\") pod \"barbican-api-d7b4c9946-ljf67\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:21 crc kubenswrapper[4723]: I0309 13:21:21.578181 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:23 crc kubenswrapper[4723]: W0309 13:21:23.310907 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e90760e_0ff0_4195_8bb9_d32fe674feb5.slice/crio-3d4fc48d171500cfc6491ae632987877a3921d836a072fead19695bd557ed8f3 WatchSource:0}: Error finding container 3d4fc48d171500cfc6491ae632987877a3921d836a072fead19695bd557ed8f3: Status 404 returned error can't find the container with id 3d4fc48d171500cfc6491ae632987877a3921d836a072fead19695bd557ed8f3 Mar 09 13:21:23 crc kubenswrapper[4723]: W0309 13:21:23.314787 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4575d39_0084_4f11_980c_6187b318d7fa.slice/crio-9055dfd425f15ff36dfa0eb3bf2c166fe534e7cf4a180b468bf90d94743397a0 WatchSource:0}: Error finding container 9055dfd425f15ff36dfa0eb3bf2c166fe534e7cf4a180b468bf90d94743397a0: Status 404 returned error can't find the container with id 9055dfd425f15ff36dfa0eb3bf2c166fe534e7cf4a180b468bf90d94743397a0 Mar 09 13:21:23 crc kubenswrapper[4723]: W0309 13:21:23.319491 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4313196c_6a31_4615_9ffe_329aed2bfef4.slice/crio-32101265ce9905129a95fb07ceebec339efbfb422691efbd8c51740b1f5c41d7 WatchSource:0}: Error finding container 32101265ce9905129a95fb07ceebec339efbfb422691efbd8c51740b1f5c41d7: Status 404 returned error can't find the container with id 32101265ce9905129a95fb07ceebec339efbfb422691efbd8c51740b1f5c41d7 Mar 09 13:21:23 crc kubenswrapper[4723]: W0309 13:21:23.323164 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9e10194_4646_4afe_8353_146441b874ad.slice/crio-630a167016d975db1a4b9088d5cdf3d70d3516bc4ed7af61a1ad6b2b0db51bd9 WatchSource:0}: Error finding container 630a167016d975db1a4b9088d5cdf3d70d3516bc4ed7af61a1ad6b2b0db51bd9: Status 404 returned error can't find the container with id 630a167016d975db1a4b9088d5cdf3d70d3516bc4ed7af61a1ad6b2b0db51bd9 Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.333030 4723 scope.go:117] "RemoveContainer" containerID="b8a98702c49a7f1d17a5268ee941bb7a7f9ec9bfd72579ab3b3cc36afc05b4db" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.621045 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.630134 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69bb477875-ngpk2" event={"ID":"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d","Type":"ContainerStarted","Data":"58a2c689bd2611389e9750bed2a7ac02068506859f7fe17473061e477d6bea41"} Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.633774 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2e90760e-0ff0-4195-8bb9-d32fe674feb5","Type":"ContainerStarted","Data":"3d4fc48d171500cfc6491ae632987877a3921d836a072fead19695bd557ed8f3"} Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.649712 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ch4lf" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.649736 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ch4lf" event={"ID":"3d74c75e-9665-4723-8dca-9019bd324ccb","Type":"ContainerDied","Data":"920731df8c6b4733c66acc97fd9537a062c06a347e6112cab23494bdc55f0aba"} Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.650091 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="920731df8c6b4733c66acc97fd9537a062c06a347e6112cab23494bdc55f0aba" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.658715 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" event={"ID":"d9e10194-4646-4afe-8353-146441b874ad","Type":"ContainerStarted","Data":"630a167016d975db1a4b9088d5cdf3d70d3516bc4ed7af61a1ad6b2b0db51bd9"} Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.668106 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9" event={"ID":"4313196c-6a31-4615-9ffe-329aed2bfef4","Type":"ContainerStarted","Data":"32101265ce9905129a95fb07ceebec339efbfb422691efbd8c51740b1f5c41d7"} Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.682896 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f68f59b98-9mh25" event={"ID":"a4575d39-0084-4f11-980c-6187b318d7fa","Type":"ContainerStarted","Data":"9055dfd425f15ff36dfa0eb3bf2c166fe534e7cf4a180b468bf90d94743397a0"} Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.726899 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-fernet-keys\") pod \"3d74c75e-9665-4723-8dca-9019bd324ccb\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.727042 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-credential-keys\") pod \"3d74c75e-9665-4723-8dca-9019bd324ccb\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.727133 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-scripts\") pod \"3d74c75e-9665-4723-8dca-9019bd324ccb\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.727281 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-combined-ca-bundle\") pod \"3d74c75e-9665-4723-8dca-9019bd324ccb\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.727312 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djb8p\" (UniqueName: \"kubernetes.io/projected/3d74c75e-9665-4723-8dca-9019bd324ccb-kube-api-access-djb8p\") pod \"3d74c75e-9665-4723-8dca-9019bd324ccb\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.727351 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-config-data\") pod \"3d74c75e-9665-4723-8dca-9019bd324ccb\" (UID: \"3d74c75e-9665-4723-8dca-9019bd324ccb\") " Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.730751 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3d74c75e-9665-4723-8dca-9019bd324ccb" (UID: "3d74c75e-9665-4723-8dca-9019bd324ccb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.731437 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d74c75e-9665-4723-8dca-9019bd324ccb-kube-api-access-djb8p" (OuterVolumeSpecName: "kube-api-access-djb8p") pod "3d74c75e-9665-4723-8dca-9019bd324ccb" (UID: "3d74c75e-9665-4723-8dca-9019bd324ccb"). InnerVolumeSpecName "kube-api-access-djb8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.738924 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3d74c75e-9665-4723-8dca-9019bd324ccb" (UID: "3d74c75e-9665-4723-8dca-9019bd324ccb"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.740991 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-scripts" (OuterVolumeSpecName: "scripts") pod "3d74c75e-9665-4723-8dca-9019bd324ccb" (UID: "3d74c75e-9665-4723-8dca-9019bd324ccb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.831134 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djb8p\" (UniqueName: \"kubernetes.io/projected/3d74c75e-9665-4723-8dca-9019bd324ccb-kube-api-access-djb8p\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.831163 4723 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.831172 4723 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.831180 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.841620 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d74c75e-9665-4723-8dca-9019bd324ccb" (UID: "3d74c75e-9665-4723-8dca-9019bd324ccb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.873734 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-config-data" (OuterVolumeSpecName: "config-data") pod "3d74c75e-9665-4723-8dca-9019bd324ccb" (UID: "3d74c75e-9665-4723-8dca-9019bd324ccb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.933663 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.933712 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d74c75e-9665-4723-8dca-9019bd324ccb-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:23 crc kubenswrapper[4723]: I0309 13:21:23.972913 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d7b4c9946-ljf67"] Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.090996 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-677d745ffb-ng6tr"] Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.185148 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 13:21:24 crc kubenswrapper[4723]: W0309 13:21:24.208766 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f94cb12_90f7_4a5a_9da4_6520946b46be.slice/crio-77b9a9f3676621c7a67c476085a7467e59a8e369c486a3c383f88a71bc9b33e2 WatchSource:0}: Error finding container 77b9a9f3676621c7a67c476085a7467e59a8e369c486a3c383f88a71bc9b33e2: Status 404 returned error can't find the container with id 77b9a9f3676621c7a67c476085a7467e59a8e369c486a3c383f88a71bc9b33e2 Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.798546 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-677d745ffb-ng6tr" event={"ID":"ed96f382-04dd-41ec-b370-832266d07122","Type":"ContainerStarted","Data":"108797fee4bc3e599d56d59fe81b93399c1c794db55cfea0cf31591e9e784560"} Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.804543 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2e90760e-0ff0-4195-8bb9-d32fe674feb5","Type":"ContainerStarted","Data":"dbc687e3211b92eb828ad0207767237f7fe0ccc94f29cfde15316b72d8d5efbf"} Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.815003 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-644bd545d4-m82n9"] Mar 09 13:21:24 crc kubenswrapper[4723]: E0309 13:21:24.815948 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d74c75e-9665-4723-8dca-9019bd324ccb" containerName="keystone-bootstrap" Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.815963 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d74c75e-9665-4723-8dca-9019bd324ccb" containerName="keystone-bootstrap" Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.816303 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d74c75e-9665-4723-8dca-9019bd324ccb" containerName="keystone-bootstrap" Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.817333 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.851481 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.852120 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.852333 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jxn5s" Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.852425 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.852377 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.852633 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.904952 4723 generic.go:334] "Generic (PLEG): container finished" podID="4313196c-6a31-4615-9ffe-329aed2bfef4" containerID="0af3fb47358d96397b65fcaf3629b2b6cf1be6d981d4ec437f2a04134c567377" exitCode=0 Mar 09 13:21:24 crc kubenswrapper[4723]: I0309 13:21:24.905049 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9" event={"ID":"4313196c-6a31-4615-9ffe-329aed2bfef4","Type":"ContainerDied","Data":"0af3fb47358d96397b65fcaf3629b2b6cf1be6d981d4ec437f2a04134c567377"} Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.015066 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-config-data\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.015161 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-public-tls-certs\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.015202 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-credential-keys\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.015243 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-combined-ca-bundle\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.015272 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9ps7\" (UniqueName: \"kubernetes.io/projected/8d4bcf87-b989-4476-b780-23cfb3da4504-kube-api-access-p9ps7\") pod 
\"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.015310 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-scripts\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.015336 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-fernet-keys\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.015355 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-internal-tls-certs\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.020071 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f68f59b98-9mh25" podStartSLOduration=8.020056634 podStartE2EDuration="8.020056634s" podCreationTimestamp="2026-03-09 13:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:24.984409291 +0000 UTC m=+1358.998876821" watchObservedRunningTime="2026-03-09 13:21:25.020056634 +0000 UTC m=+1359.034524174" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.118639 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-scripts\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.118722 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-fernet-keys\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.118752 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-internal-tls-certs\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.119065 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-config-data\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.119173 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-public-tls-certs\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.119232 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-credential-keys\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.119281 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-combined-ca-bundle\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.119306 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9ps7\" (UniqueName: \"kubernetes.io/projected/8d4bcf87-b989-4476-b780-23cfb3da4504-kube-api-access-p9ps7\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.125986 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-public-tls-certs\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.131309 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-scripts\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.132000 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-internal-tls-certs\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.136583 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-fernet-keys\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.137174 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-config-data\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.137447 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-credential-keys\") pod \"keystone-644bd545d4-m82n9\" (UID: 
\"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.137700 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4bcf87-b989-4476-b780-23cfb3da4504-combined-ca-bundle\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.142403 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9ps7\" (UniqueName: \"kubernetes.io/projected/8d4bcf87-b989-4476-b780-23cfb3da4504-kube-api-access-p9ps7\") pod \"keystone-644bd545d4-m82n9\" (UID: \"8d4bcf87-b989-4476-b780-23cfb3da4504\") " pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.377287 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f68f59b98-9mh25" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.377617 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-644bd545d4-m82n9"] Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.377644 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f68f59b98-9mh25" Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.377655 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f68f59b98-9mh25" event={"ID":"a4575d39-0084-4f11-980c-6187b318d7fa","Type":"ContainerStarted","Data":"7abce7725a88fd6062e279f214536937e4c6c3e1cffc36f4289af27afc6c2cc2"} Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.377680 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f68f59b98-9mh25" event={"ID":"a4575d39-0084-4f11-980c-6187b318d7fa","Type":"ContainerStarted","Data":"b62efde1eb2c2d3fcd454c627418a6858b64d17b251d48f2fc5f97cc865d52a4"} Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.377692 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d7b4c9946-ljf67" event={"ID":"591e7541-2095-490f-9787-d4551a2e4f9d","Type":"ContainerStarted","Data":"92dcefaf02293c319df2e4e7a6b68680e3d30054f568a1aaa658e5dedeec64d2"} Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.377709 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d7b4c9946-ljf67" event={"ID":"591e7541-2095-490f-9787-d4551a2e4f9d","Type":"ContainerStarted","Data":"1925aa83e5f062d7fe6ccb9d26ece2e7cde2daef6afd506e9ee19477faa1657f"} Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.377721 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f94cb12-90f7-4a5a-9da4-6520946b46be","Type":"ContainerStarted","Data":"77b9a9f3676621c7a67c476085a7467e59a8e369c486a3c383f88a71bc9b33e2"} Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.377735 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e0bf458-3488-4d3a-80ac-d9cf2f655791","Type":"ContainerStarted","Data":"2161603b5816667f7710ccc90429c6d2fb2b8f1d9495eb572edffaa2c1f175a3"} Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.377747 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lf8bq" event={"ID":"90dea403-5a65-4824-ac5b-5c34c828d616","Type":"ContainerStarted","Data":"6fa51296fe7f6684547eae4f35c35bbf74bb1c2b6a716b61798033760fc08532"} 
Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.417244 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-lf8bq" podStartSLOduration=3.959143033 podStartE2EDuration="46.417222175s" podCreationTimestamp="2026-03-09 13:20:39 +0000 UTC" firstStartedPulling="2026-03-09 13:20:41.096212889 +0000 UTC m=+1315.110680419" lastFinishedPulling="2026-03-09 13:21:23.554292021 +0000 UTC m=+1357.568759561" observedRunningTime="2026-03-09 13:21:25.403301106 +0000 UTC m=+1359.417768646" watchObservedRunningTime="2026-03-09 13:21:25.417222175 +0000 UTC m=+1359.431689715"
Mar 09 13:21:25 crc kubenswrapper[4723]: I0309 13:21:25.438719 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-644bd545d4-m82n9"
Mar 09 13:21:26 crc kubenswrapper[4723]: I0309 13:21:26.078626 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2e90760e-0ff0-4195-8bb9-d32fe674feb5","Type":"ContainerStarted","Data":"f76c6b69b519dcbc0aca84270d607b9fdb4f7fc3927531998d5b7deedd7f3f99"}
Mar 09 13:21:26 crc kubenswrapper[4723]: I0309 13:21:26.086892 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d7b4c9946-ljf67" event={"ID":"591e7541-2095-490f-9787-d4551a2e4f9d","Type":"ContainerStarted","Data":"0d5f2915d0560d32c496d76824067567dd9de60e05e9b2294a4bff612dd09489"}
Mar 09 13:21:26 crc kubenswrapper[4723]: I0309 13:21:26.088080 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d7b4c9946-ljf67"
Mar 09 13:21:26 crc kubenswrapper[4723]: I0309 13:21:26.088108 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d7b4c9946-ljf67"
Mar 09 13:21:26 crc kubenswrapper[4723]: I0309 13:21:26.091828 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f94cb12-90f7-4a5a-9da4-6520946b46be","Type":"ContainerStarted","Data":"0d93448417f193ea01c5258f51e9033e70438056797884f157722a8c20b6b8e2"}
Mar 09 13:21:26 crc kubenswrapper[4723]: I0309 13:21:26.093602 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-677d745ffb-ng6tr" event={"ID":"ed96f382-04dd-41ec-b370-832266d07122","Type":"ContainerStarted","Data":"d04c43cd4468292b687fa6b635dd878c8d44b4defaf6b4c9adff867bb0b84ad2"}
Mar 09 13:21:26 crc kubenswrapper[4723]: I0309 13:21:26.099530 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bm79v" event={"ID":"191baa15-4ac5-4e55-9f87-751eddffb83e","Type":"ContainerStarted","Data":"3089133ad3ac67c8f2b108d7f5e469a3c6333fb68492b8c0cee405c625b8fe8b"}
Mar 09 13:21:26 crc kubenswrapper[4723]: I0309 13:21:26.118311 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.118292379 podStartE2EDuration="9.118292379s" podCreationTimestamp="2026-03-09 13:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:26.107286078 +0000 UTC m=+1360.121753618" watchObservedRunningTime="2026-03-09 13:21:26.118292379 +0000 UTC m=+1360.132759919"
Mar 09 13:21:26 crc kubenswrapper[4723]: I0309 13:21:26.144422 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d7b4c9946-ljf67" podStartSLOduration=5.14439966 podStartE2EDuration="5.14439966s" podCreationTimestamp="2026-03-09 13:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:26.140691252 +0000 UTC m=+1360.155158802" watchObservedRunningTime="2026-03-09 13:21:26.14439966 +0000 UTC m=+1360.158867200"
Mar 09 13:21:26 crc kubenswrapper[4723]: I0309 13:21:26.165500 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-bm79v" podStartSLOduration=4.816284577 podStartE2EDuration="47.165481938s" podCreationTimestamp="2026-03-09 13:20:39 +0000 UTC" firstStartedPulling="2026-03-09 13:20:41.230683528 +0000 UTC m=+1315.245151068" lastFinishedPulling="2026-03-09 13:21:23.579880879 +0000 UTC m=+1357.594348429" observedRunningTime="2026-03-09 13:21:26.159160821 +0000 UTC m=+1360.173628361" watchObservedRunningTime="2026-03-09 13:21:26.165481938 +0000 UTC m=+1360.179949478"
Mar 09 13:21:27 crc kubenswrapper[4723]: I0309 13:21:27.116205 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-644bd545d4-m82n9"]
Mar 09 13:21:27 crc kubenswrapper[4723]: I0309 13:21:27.142346 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" event={"ID":"d9e10194-4646-4afe-8353-146441b874ad","Type":"ContainerStarted","Data":"e332c9b63bffa997ef81f899bc149383525f526826ff9ec42cb036d9eaa39867"}
Mar 09 13:21:27 crc kubenswrapper[4723]: I0309 13:21:27.163620 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-677d745ffb-ng6tr" event={"ID":"ed96f382-04dd-41ec-b370-832266d07122","Type":"ContainerStarted","Data":"8b43efdd93be446949623ace9492232e4959bc04ea236173db991fe5c509e83d"}
Mar 09 13:21:27 crc kubenswrapper[4723]: I0309 13:21:27.164679 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-677d745ffb-ng6tr"
Mar 09 13:21:27 crc kubenswrapper[4723]: I0309 13:21:27.164783 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-677d745ffb-ng6tr"
Mar 09 13:21:27 crc kubenswrapper[4723]: I0309 13:21:27.173382 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9" event={"ID":"4313196c-6a31-4615-9ffe-329aed2bfef4","Type":"ContainerStarted","Data":"0a327eb0475d9cfd15217ad702f0d307341bf391a37281c48d9bfbe427021cec"}
Mar 09 13:21:27 crc kubenswrapper[4723]: I0309 13:21:27.174246 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9"
Mar 09 13:21:27 crc kubenswrapper[4723]: I0309 13:21:27.179694 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69bb477875-ngpk2" event={"ID":"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d","Type":"ContainerStarted","Data":"034d681500ce7f0991b196dedef58677579f082e0ae5cc7808a595fe6f1e8ce3"}
Mar 09 13:21:27 crc kubenswrapper[4723]: W0309 13:21:27.207106 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d4bcf87_b989_4476_b780_23cfb3da4504.slice/crio-555297d0a77a5d536e30665f626f56ca407c65a19ca899250c4c3f38b279dc23 WatchSource:0}: Error finding container 555297d0a77a5d536e30665f626f56ca407c65a19ca899250c4c3f38b279dc23: Status 404 returned error can't find the container with id 555297d0a77a5d536e30665f626f56ca407c65a19ca899250c4c3f38b279dc23
Mar 09 13:21:27 crc kubenswrapper[4723]: I0309 13:21:27.222316 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-677d745ffb-ng6tr" podStartSLOduration=8.222295848 podStartE2EDuration="8.222295848s" podCreationTimestamp="2026-03-09 13:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:27.207937388 +0000 UTC m=+1361.222404948" watchObservedRunningTime="2026-03-09 13:21:27.222295848 +0000 UTC m=+1361.236763388"
Mar 09 13:21:27 crc kubenswrapper[4723]: I0309 13:21:27.289027 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9" podStartSLOduration=10.289009163 podStartE2EDuration="10.289009163s" podCreationTimestamp="2026-03-09 13:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:27.24015095 +0000 UTC m=+1361.254618490" watchObservedRunningTime="2026-03-09 13:21:27.289009163 +0000 UTC m=+1361.303476703"
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.195765 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-644bd545d4-m82n9" event={"ID":"8d4bcf87-b989-4476-b780-23cfb3da4504","Type":"ContainerStarted","Data":"43f1e0cf1fa0e80e9df70190214ba81ca31fe1749adf3baee078eb713048d90d"}
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.196236 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-644bd545d4-m82n9" event={"ID":"8d4bcf87-b989-4476-b780-23cfb3da4504","Type":"ContainerStarted","Data":"555297d0a77a5d536e30665f626f56ca407c65a19ca899250c4c3f38b279dc23"}
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.196251 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-644bd545d4-m82n9"
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.202522 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69bb477875-ngpk2" event={"ID":"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d","Type":"ContainerStarted","Data":"12250f20318d7511f4f8e27e4c414041091b0275179df1c716c0b1aae0826e16"}
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.221143 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f94cb12-90f7-4a5a-9da4-6520946b46be","Type":"ContainerStarted","Data":"fadc14fdef084d5e31ef0114295b47042e605593134aed02c3c1b972b615ceee"}
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.232762 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" event={"ID":"d9e10194-4646-4afe-8353-146441b874ad","Type":"ContainerStarted","Data":"c6d3b9357a786b3b7c5160326d951ac8e90268cc334d013226114a206f3054be"}
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.245023 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-644bd545d4-m82n9" podStartSLOduration=4.245000665 podStartE2EDuration="4.245000665s" podCreationTimestamp="2026-03-09 13:21:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:28.221724009 +0000 UTC m=+1362.236191549" watchObservedRunningTime="2026-03-09 13:21:28.245000665 +0000 UTC m=+1362.259468205"
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.256248 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.256229042 podStartE2EDuration="10.256229042s" podCreationTimestamp="2026-03-09 13:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:28.24446821 +0000 UTC m=+1362.258935740" watchObservedRunningTime="2026-03-09 13:21:28.256229042 +0000 UTC m=+1362.270696582"
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.305211 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-69bb477875-ngpk2" podStartSLOduration=8.005652703 podStartE2EDuration="11.305193228s" podCreationTimestamp="2026-03-09 13:21:17 +0000 UTC" firstStartedPulling="2026-03-09 13:21:23.309058951 +0000 UTC m=+1357.323526491" lastFinishedPulling="2026-03-09 13:21:26.608599486 +0000 UTC m=+1360.623067016" observedRunningTime="2026-03-09 13:21:28.289716788 +0000 UTC m=+1362.304184338" watchObservedRunningTime="2026-03-09 13:21:28.305193228 +0000 UTC m=+1362.319660768"
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.353273 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" podStartSLOduration=8.070311415 podStartE2EDuration="11.352043938s" podCreationTimestamp="2026-03-09 13:21:17 +0000 UTC" firstStartedPulling="2026-03-09 13:21:23.330979051 +0000 UTC m=+1357.345446591" lastFinishedPulling="2026-03-09 13:21:26.612711574 +0000 UTC m=+1360.627179114" observedRunningTime="2026-03-09 13:21:28.322089915 +0000 UTC m=+1362.336557475" watchObservedRunningTime="2026-03-09 13:21:28.352043938 +0000 UTC m=+1362.366511478"
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.383732 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.384441 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.455527 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.457441 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.472708 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6b7867888c-cmz44"]
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.475170 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b7867888c-cmz44"
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.499010 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-64c6fb6dbb-56jkv"]
Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.501272 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv"
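The pod_startup_latency_tracker entries above carry two durations: podStartE2EDuration, measured from podCreationTimestamp to the observed running time, and podStartSLOduration, which for pods that pulled an image is that figure minus the pull window (pods whose pull timestamps are the zero value 0001-01-01 report the two as equal). This can be checked directly against the heat-db-sync-lf8bq entry using the monotonic m=+ offsets it logs; a back-of-the-envelope sketch with numbers copied from the log, not kubelet source:

package main

import "fmt"

// sloduration.go: reproduce podStartSLOduration for heat-db-sync-lf8bq
// from the entry above. All constants are copied from the log; the
// subtraction is the only logic.
func main() {
	firstPull := 1315.110680419 // m=+ offset of firstStartedPulling
	lastPull := 1357.568759561  // m=+ offset of lastFinishedPulling
	e2e := 46.417222175         // podStartE2EDuration in seconds
	slo := e2e - (lastPull - firstPull)
	fmt.Printf("podStartSLOduration = %.9f s\n", slo) // 3.959143033, as logged
}

The same arithmetic reproduces cinder-db-sync-bm79v (47.165481938 s end to end, about 42.35 s of it pulling, 4.816284577 s SLO), so the long E2E times for the db-sync jobs are image-pull time, not slow container starts.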
Need to start a new one" pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.520158 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6048045-801d-4833-87ed-b47076a02338-config-data\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.520222 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7jjq\" (UniqueName: \"kubernetes.io/projected/f6048045-801d-4833-87ed-b47076a02338-kube-api-access-r7jjq\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.520290 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6048045-801d-4833-87ed-b47076a02338-combined-ca-bundle\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.520328 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6048045-801d-4833-87ed-b47076a02338-logs\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.520390 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6048045-801d-4833-87ed-b47076a02338-config-data-custom\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.623231 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf24888-ed0f-4900-963d-c61c23af5bfe-config-data\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" (UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.623369 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6048045-801d-4833-87ed-b47076a02338-config-data\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.623419 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbf24888-ed0f-4900-963d-c61c23af5bfe-config-data-custom\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" (UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.623443 4723 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbf24888-ed0f-4900-963d-c61c23af5bfe-logs\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" (UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.623478 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7jjq\" (UniqueName: \"kubernetes.io/projected/f6048045-801d-4833-87ed-b47076a02338-kube-api-access-r7jjq\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.623522 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf24888-ed0f-4900-963d-c61c23af5bfe-combined-ca-bundle\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" (UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.623597 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6048045-801d-4833-87ed-b47076a02338-combined-ca-bundle\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.623645 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6048045-801d-4833-87ed-b47076a02338-logs\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.623683 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srlfp\" (UniqueName: \"kubernetes.io/projected/cbf24888-ed0f-4900-963d-c61c23af5bfe-kube-api-access-srlfp\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" (UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.623759 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6048045-801d-4833-87ed-b47076a02338-config-data-custom\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.637703 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6048045-801d-4833-87ed-b47076a02338-combined-ca-bundle\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.639051 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6048045-801d-4833-87ed-b47076a02338-logs\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " 
pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.651729 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6048045-801d-4833-87ed-b47076a02338-config-data-custom\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.658618 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b7867888c-cmz44"] Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.665516 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6048045-801d-4833-87ed-b47076a02338-config-data\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.670650 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7jjq\" (UniqueName: \"kubernetes.io/projected/f6048045-801d-4833-87ed-b47076a02338-kube-api-access-r7jjq\") pod \"barbican-worker-6b7867888c-cmz44\" (UID: \"f6048045-801d-4833-87ed-b47076a02338\") " pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.678463 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64c6fb6dbb-56jkv"] Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.729309 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbf24888-ed0f-4900-963d-c61c23af5bfe-logs\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" (UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.729354 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbf24888-ed0f-4900-963d-c61c23af5bfe-config-data-custom\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" (UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.729399 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf24888-ed0f-4900-963d-c61c23af5bfe-combined-ca-bundle\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" (UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.729516 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srlfp\" (UniqueName: \"kubernetes.io/projected/cbf24888-ed0f-4900-963d-c61c23af5bfe-kube-api-access-srlfp\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" (UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.729614 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf24888-ed0f-4900-963d-c61c23af5bfe-config-data\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" 
(UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.729870 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbf24888-ed0f-4900-963d-c61c23af5bfe-logs\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" (UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.733484 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf24888-ed0f-4900-963d-c61c23af5bfe-combined-ca-bundle\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" (UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.747433 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbf24888-ed0f-4900-963d-c61c23af5bfe-config-data-custom\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" (UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.756573 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf24888-ed0f-4900-963d-c61c23af5bfe-config-data\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" (UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.756651 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f68f59b98-9mh25"] Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.756929 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f68f59b98-9mh25" podUID="a4575d39-0084-4f11-980c-6187b318d7fa" containerName="barbican-api-log" containerID="cri-o://b62efde1eb2c2d3fcd454c627418a6858b64d17b251d48f2fc5f97cc865d52a4" gracePeriod=30 Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.757494 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f68f59b98-9mh25" podUID="a4575d39-0084-4f11-980c-6187b318d7fa" containerName="barbican-api" containerID="cri-o://7abce7725a88fd6062e279f214536937e4c6c3e1cffc36f4289af27afc6c2cc2" gracePeriod=30 Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.760435 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srlfp\" (UniqueName: \"kubernetes.io/projected/cbf24888-ed0f-4900-963d-c61c23af5bfe-kube-api-access-srlfp\") pod \"barbican-keystone-listener-64c6fb6dbb-56jkv\" (UID: \"cbf24888-ed0f-4900-963d-c61c23af5bfe\") " pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.766698 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f68f59b98-9mh25" podUID="a4575d39-0084-4f11-980c-6187b318d7fa" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.203:9311/healthcheck\": EOF" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.774951 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f68f59b98-9mh25" 
podUID="a4575d39-0084-4f11-980c-6187b318d7fa" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.203:9311/healthcheck\": EOF" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.785088 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-695767fcfd-4rv9j"] Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.787094 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.799387 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b7867888c-cmz44" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.816748 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-695767fcfd-4rv9j"] Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.831366 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-config-data\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.831421 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-internal-tls-certs\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.831457 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-public-tls-certs\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.831527 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-config-data-custom\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.831602 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-combined-ca-bundle\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.831656 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-947v9\" (UniqueName: \"kubernetes.io/projected/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-kube-api-access-947v9\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.831700 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-logs\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.841673 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.936720 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-config-data-custom\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.937417 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-combined-ca-bundle\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.937820 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-947v9\" (UniqueName: \"kubernetes.io/projected/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-kube-api-access-947v9\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.938021 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-logs\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.938342 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-config-data\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.938720 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-internal-tls-certs\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.939219 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-logs\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.938856 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-public-tls-certs\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 
13:21:28.944767 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-config-data-custom\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.945719 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-internal-tls-certs\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.946290 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-config-data\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.972250 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-947v9\" (UniqueName: \"kubernetes.io/projected/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-kube-api-access-947v9\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.974188 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-combined-ca-bundle\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.978775 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a33c0e4-cc6f-49f1-bdb3-dab35255fd07-public-tls-certs\") pod \"barbican-api-695767fcfd-4rv9j\" (UID: \"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07\") " pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:28 crc kubenswrapper[4723]: I0309 13:21:28.990552 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:29 crc kubenswrapper[4723]: I0309 13:21:29.268207 4723 generic.go:334] "Generic (PLEG): container finished" podID="a4575d39-0084-4f11-980c-6187b318d7fa" containerID="b62efde1eb2c2d3fcd454c627418a6858b64d17b251d48f2fc5f97cc865d52a4" exitCode=143 Mar 09 13:21:29 crc kubenswrapper[4723]: I0309 13:21:29.268697 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f68f59b98-9mh25" event={"ID":"a4575d39-0084-4f11-980c-6187b318d7fa","Type":"ContainerDied","Data":"b62efde1eb2c2d3fcd454c627418a6858b64d17b251d48f2fc5f97cc865d52a4"} Mar 09 13:21:29 crc kubenswrapper[4723]: I0309 13:21:29.269848 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 13:21:29 crc kubenswrapper[4723]: I0309 13:21:29.269930 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 13:21:29 crc kubenswrapper[4723]: I0309 13:21:29.362873 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 13:21:29 crc kubenswrapper[4723]: I0309 13:21:29.362916 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 13:21:29 crc kubenswrapper[4723]: I0309 13:21:29.414491 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 13:21:29 crc kubenswrapper[4723]: I0309 13:21:29.423784 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 13:21:29 crc kubenswrapper[4723]: I0309 13:21:29.507886 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b7867888c-cmz44"] Mar 09 13:21:29 crc kubenswrapper[4723]: W0309 13:21:29.508488 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6048045_801d_4833_87ed_b47076a02338.slice/crio-8d47c5b66615468c41be9e9a64ed8002726525806d9fce46b54b42117ad58e7a WatchSource:0}: Error finding container 8d47c5b66615468c41be9e9a64ed8002726525806d9fce46b54b42117ad58e7a: Status 404 returned error can't find the container with id 8d47c5b66615468c41be9e9a64ed8002726525806d9fce46b54b42117ad58e7a Mar 09 13:21:29 crc kubenswrapper[4723]: I0309 13:21:29.665541 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64c6fb6dbb-56jkv"] Mar 09 13:21:29 crc kubenswrapper[4723]: W0309 13:21:29.687263 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbf24888_ed0f_4900_963d_c61c23af5bfe.slice/crio-8565745b069e4ce37f0c718c1f41271cf865050d5ace7bd3fec1c13fd6b51a73 WatchSource:0}: Error finding container 8565745b069e4ce37f0c718c1f41271cf865050d5ace7bd3fec1c13fd6b51a73: Status 404 returned error can't find the container with id 8565745b069e4ce37f0c718c1f41271cf865050d5ace7bd3fec1c13fd6b51a73 Mar 09 13:21:29 crc kubenswrapper[4723]: I0309 13:21:29.875551 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-695767fcfd-4rv9j"] Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.286199 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-695767fcfd-4rv9j" 
event={"ID":"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07","Type":"ContainerStarted","Data":"26c7645e4bc0c660c3d9ac42caa34f0affdab0bf7025b47087e06864c9a61cad"} Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.286525 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-695767fcfd-4rv9j" event={"ID":"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07","Type":"ContainerStarted","Data":"2abc6b9578db021014db59101af0854ddbf6a4db747b76c8b1c04238b3f23a5b"} Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.288917 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" event={"ID":"cbf24888-ed0f-4900-963d-c61c23af5bfe","Type":"ContainerStarted","Data":"55c01af1c205cd116b49d20393955ddc3d2996d48743eaa4fd07dcf5253aced1"} Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.288950 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" event={"ID":"cbf24888-ed0f-4900-963d-c61c23af5bfe","Type":"ContainerStarted","Data":"b10da71339f7fdd63786ccac50034dd73b2701b065e2c76e80087e4ef20d24a0"} Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.288965 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" event={"ID":"cbf24888-ed0f-4900-963d-c61c23af5bfe","Type":"ContainerStarted","Data":"8565745b069e4ce37f0c718c1f41271cf865050d5ace7bd3fec1c13fd6b51a73"} Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.292973 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b7867888c-cmz44" event={"ID":"f6048045-801d-4833-87ed-b47076a02338","Type":"ContainerStarted","Data":"96f3558ca0273ff54079da28ff69184ca596396ec6b78ef2ac77554bf8a2dc5d"} Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.293077 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b7867888c-cmz44" event={"ID":"f6048045-801d-4833-87ed-b47076a02338","Type":"ContainerStarted","Data":"7d99df4a5dc30395076a378f005a6f05a96ba6f81d3cce2c7decb6f0aeeb3049"} Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.293097 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b7867888c-cmz44" event={"ID":"f6048045-801d-4833-87ed-b47076a02338","Type":"ContainerStarted","Data":"8d47c5b66615468c41be9e9a64ed8002726525806d9fce46b54b42117ad58e7a"} Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.293571 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.293600 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.321822 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-64c6fb6dbb-56jkv" podStartSLOduration=2.321801748 podStartE2EDuration="2.321801748s" podCreationTimestamp="2026-03-09 13:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:30.314101824 +0000 UTC m=+1364.328569364" watchObservedRunningTime="2026-03-09 13:21:30.321801748 +0000 UTC m=+1364.336269288" Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.354025 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-767f59545d-j4rfp"] Mar 09 13:21:30 crc 
kubenswrapper[4723]: I0309 13:21:30.354328 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" podUID="d9e10194-4646-4afe-8353-146441b874ad" containerName="barbican-keystone-listener-log" containerID="cri-o://e332c9b63bffa997ef81f899bc149383525f526826ff9ec42cb036d9eaa39867" gracePeriod=30 Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.354517 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" podUID="d9e10194-4646-4afe-8353-146441b874ad" containerName="barbican-keystone-listener" containerID="cri-o://c6d3b9357a786b3b7c5160326d951ac8e90268cc334d013226114a206f3054be" gracePeriod=30 Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.374517 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6b7867888c-cmz44" podStartSLOduration=2.374490903 podStartE2EDuration="2.374490903s" podCreationTimestamp="2026-03-09 13:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:30.337598916 +0000 UTC m=+1364.352066486" watchObservedRunningTime="2026-03-09 13:21:30.374490903 +0000 UTC m=+1364.388958443" Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.392608 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-69bb477875-ngpk2"] Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.393196 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-69bb477875-ngpk2" podUID="afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" containerName="barbican-worker-log" containerID="cri-o://034d681500ce7f0991b196dedef58677579f082e0ae5cc7808a595fe6f1e8ce3" gracePeriod=30 Mar 09 13:21:30 crc kubenswrapper[4723]: I0309 13:21:30.393494 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-69bb477875-ngpk2" podUID="afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" containerName="barbican-worker" containerID="cri-o://12250f20318d7511f4f8e27e4c414041091b0275179df1c716c0b1aae0826e16" gracePeriod=30 Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.384265 4723 generic.go:334] "Generic (PLEG): container finished" podID="afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" containerID="034d681500ce7f0991b196dedef58677579f082e0ae5cc7808a595fe6f1e8ce3" exitCode=143 Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.384547 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69bb477875-ngpk2" event={"ID":"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d","Type":"ContainerDied","Data":"034d681500ce7f0991b196dedef58677579f082e0ae5cc7808a595fe6f1e8ce3"} Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.434168 4723 generic.go:334] "Generic (PLEG): container finished" podID="d9e10194-4646-4afe-8353-146441b874ad" containerID="c6d3b9357a786b3b7c5160326d951ac8e90268cc334d013226114a206f3054be" exitCode=0 Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.434208 4723 generic.go:334] "Generic (PLEG): container finished" podID="d9e10194-4646-4afe-8353-146441b874ad" containerID="e332c9b63bffa997ef81f899bc149383525f526826ff9ec42cb036d9eaa39867" exitCode=143 Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.434306 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" 
event={"ID":"d9e10194-4646-4afe-8353-146441b874ad","Type":"ContainerDied","Data":"c6d3b9357a786b3b7c5160326d951ac8e90268cc334d013226114a206f3054be"} Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.434341 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" event={"ID":"d9e10194-4646-4afe-8353-146441b874ad","Type":"ContainerDied","Data":"e332c9b63bffa997ef81f899bc149383525f526826ff9ec42cb036d9eaa39867"} Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.486610 4723 generic.go:334] "Generic (PLEG): container finished" podID="90dea403-5a65-4824-ac5b-5c34c828d616" containerID="6fa51296fe7f6684547eae4f35c35bbf74bb1c2b6a716b61798033760fc08532" exitCode=0 Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.486695 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lf8bq" event={"ID":"90dea403-5a65-4824-ac5b-5c34c828d616","Type":"ContainerDied","Data":"6fa51296fe7f6684547eae4f35c35bbf74bb1c2b6a716b61798033760fc08532"} Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.505163 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-695767fcfd-4rv9j" event={"ID":"6a33c0e4-cc6f-49f1-bdb3-dab35255fd07","Type":"ContainerStarted","Data":"04724a8b15784ad3f022dd91f5a15852dfda05d17d6c2e8bd8d83bb7b7294bbd"} Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.505819 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.505918 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.505952 4723 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.564515 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-695767fcfd-4rv9j" podStartSLOduration=3.564495407 podStartE2EDuration="3.564495407s" podCreationTimestamp="2026-03-09 13:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:31.538417547 +0000 UTC m=+1365.552885087" watchObservedRunningTime="2026-03-09 13:21:31.564495407 +0000 UTC m=+1365.578962947" Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.796679 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.965481 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-config-data-custom\") pod \"d9e10194-4646-4afe-8353-146441b874ad\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.965645 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmmlc\" (UniqueName: \"kubernetes.io/projected/d9e10194-4646-4afe-8353-146441b874ad-kube-api-access-bmmlc\") pod \"d9e10194-4646-4afe-8353-146441b874ad\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.965728 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9e10194-4646-4afe-8353-146441b874ad-logs\") pod \"d9e10194-4646-4afe-8353-146441b874ad\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.965831 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-combined-ca-bundle\") pod \"d9e10194-4646-4afe-8353-146441b874ad\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.965974 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-config-data\") pod \"d9e10194-4646-4afe-8353-146441b874ad\" (UID: \"d9e10194-4646-4afe-8353-146441b874ad\") " Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.968388 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9e10194-4646-4afe-8353-146441b874ad-logs" (OuterVolumeSpecName: "logs") pod "d9e10194-4646-4afe-8353-146441b874ad" (UID: "d9e10194-4646-4afe-8353-146441b874ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.978184 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9e10194-4646-4afe-8353-146441b874ad-kube-api-access-bmmlc" (OuterVolumeSpecName: "kube-api-access-bmmlc") pod "d9e10194-4646-4afe-8353-146441b874ad" (UID: "d9e10194-4646-4afe-8353-146441b874ad"). InnerVolumeSpecName "kube-api-access-bmmlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:31 crc kubenswrapper[4723]: I0309 13:21:31.978772 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d9e10194-4646-4afe-8353-146441b874ad" (UID: "d9e10194-4646-4afe-8353-146441b874ad"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.009915 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9e10194-4646-4afe-8353-146441b874ad" (UID: "d9e10194-4646-4afe-8353-146441b874ad"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.053026 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-config-data" (OuterVolumeSpecName: "config-data") pod "d9e10194-4646-4afe-8353-146441b874ad" (UID: "d9e10194-4646-4afe-8353-146441b874ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.081343 4723 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.081383 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmmlc\" (UniqueName: \"kubernetes.io/projected/d9e10194-4646-4afe-8353-146441b874ad-kube-api-access-bmmlc\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.081585 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9e10194-4646-4afe-8353-146441b874ad-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.081598 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.081608 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e10194-4646-4afe-8353-146441b874ad-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.175767 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.287022 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-config-data-custom\") pod \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.287191 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-config-data\") pod \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.287217 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-combined-ca-bundle\") pod \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.287307 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfkwx\" (UniqueName: \"kubernetes.io/projected/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-kube-api-access-vfkwx\") pod \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.287440 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-logs\") pod \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\" (UID: \"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d\") " Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.287842 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-logs" (OuterVolumeSpecName: "logs") pod "afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" (UID: "afb8e2d8-5db3-4447-9c6e-4b8b6249b44d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.288425 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.291848 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" (UID: "afb8e2d8-5db3-4447-9c6e-4b8b6249b44d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.294249 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-kube-api-access-vfkwx" (OuterVolumeSpecName: "kube-api-access-vfkwx") pod "afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" (UID: "afb8e2d8-5db3-4447-9c6e-4b8b6249b44d"). InnerVolumeSpecName "kube-api-access-vfkwx". 
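
The surrounding reconciler entries show the kubelet's three-step volume teardown for each deleted pod: reconciler_common.go:159 logs "operationExecutor.UnmountVolume started", operation_generator.go:803 logs "UnmountVolume.TearDown succeeded" once the volume plugin finishes, and reconciler_common.go:293 finally records "Volume detached". A stdlib-only Go sketch for triaging a capture like this one, pairing starts with detaches per volume name — the input file name and regular expressions are assumptions of the sketch, not kubelet behaviour:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	// Assumes the journal was saved to a file first, e.g.
	// journalctl -u kubelet > kubelet.log (file name is illustrative).
	f, err := os.Open("kubelet.log")
	if err != nil {
		panic(err)
	}
	defer f.Close()

	// Volume names appear as \"name\" in these entries; the escaping
	// backslashes are optional so the patterns also match unescaped output.
	started := regexp.MustCompile(`UnmountVolume started for volume \\?"([^"\\]+)\\?"`)
	detached := regexp.MustCompile(`Volume detached for volume \\?"([^"\\]+)\\?"`)

	// Keyed by volume name for brevity; a real tool would key by the
	// UniqueName so identically named volumes of different pods stay separate.
	pending := map[string]int{}

	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 1<<20), 1<<20) // some entries exceed the 64 KiB default
	for sc.Scan() {
		if m := started.FindStringSubmatch(sc.Text()); m != nil {
			pending[m[1]]++
		} else if m := detached.FindStringSubmatch(sc.Text()); m != nil {
			pending[m[1]]--
		}
	}
	if err := sc.Err(); err != nil {
		panic(err)
	}
	for vol, n := range pending {
		if n > 0 {
			fmt.Printf("unmount started but never detached: %s (x%d)\n", vol, n)
		}
	}
}

Note that the cinder-db-sync teardown later in this log would trip this check: its combined-ca-bundle volume is re-queued and torn down twice before the single "Volume detached" record, so the start count exceeds the detach count by one.
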
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.344301 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" (UID: "afb8e2d8-5db3-4447-9c6e-4b8b6249b44d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.373006 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-config-data" (OuterVolumeSpecName: "config-data") pod "afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" (UID: "afb8e2d8-5db3-4447-9c6e-4b8b6249b44d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.391587 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfkwx\" (UniqueName: \"kubernetes.io/projected/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-kube-api-access-vfkwx\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.391625 4723 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.391634 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.391642 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.527129 4723 generic.go:334] "Generic (PLEG): container finished" podID="afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" containerID="12250f20318d7511f4f8e27e4c414041091b0275179df1c716c0b1aae0826e16" exitCode=0 Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.527213 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69bb477875-ngpk2" event={"ID":"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d","Type":"ContainerDied","Data":"12250f20318d7511f4f8e27e4c414041091b0275179df1c716c0b1aae0826e16"} Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.527251 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69bb477875-ngpk2" event={"ID":"afb8e2d8-5db3-4447-9c6e-4b8b6249b44d","Type":"ContainerDied","Data":"58a2c689bd2611389e9750bed2a7ac02068506859f7fe17473061e477d6bea41"} Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.527277 4723 scope.go:117] "RemoveContainer" containerID="12250f20318d7511f4f8e27e4c414041091b0275179df1c716c0b1aae0826e16" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.527437 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-69bb477875-ngpk2" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.539136 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" event={"ID":"d9e10194-4646-4afe-8353-146441b874ad","Type":"ContainerDied","Data":"630a167016d975db1a4b9088d5cdf3d70d3516bc4ed7af61a1ad6b2b0db51bd9"} Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.539300 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-767f59545d-j4rfp" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.575625 4723 scope.go:117] "RemoveContainer" containerID="034d681500ce7f0991b196dedef58677579f082e0ae5cc7808a595fe6f1e8ce3" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.600047 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-69bb477875-ngpk2"] Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.614366 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-69bb477875-ngpk2"] Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.630534 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-767f59545d-j4rfp"] Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.631461 4723 scope.go:117] "RemoveContainer" containerID="12250f20318d7511f4f8e27e4c414041091b0275179df1c716c0b1aae0826e16" Mar 09 13:21:32 crc kubenswrapper[4723]: E0309 13:21:32.635021 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12250f20318d7511f4f8e27e4c414041091b0275179df1c716c0b1aae0826e16\": container with ID starting with 12250f20318d7511f4f8e27e4c414041091b0275179df1c716c0b1aae0826e16 not found: ID does not exist" containerID="12250f20318d7511f4f8e27e4c414041091b0275179df1c716c0b1aae0826e16" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.635070 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12250f20318d7511f4f8e27e4c414041091b0275179df1c716c0b1aae0826e16"} err="failed to get container status \"12250f20318d7511f4f8e27e4c414041091b0275179df1c716c0b1aae0826e16\": rpc error: code = NotFound desc = could not find container \"12250f20318d7511f4f8e27e4c414041091b0275179df1c716c0b1aae0826e16\": container with ID starting with 12250f20318d7511f4f8e27e4c414041091b0275179df1c716c0b1aae0826e16 not found: ID does not exist" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.635097 4723 scope.go:117] "RemoveContainer" containerID="034d681500ce7f0991b196dedef58677579f082e0ae5cc7808a595fe6f1e8ce3" Mar 09 13:21:32 crc kubenswrapper[4723]: E0309 13:21:32.636573 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"034d681500ce7f0991b196dedef58677579f082e0ae5cc7808a595fe6f1e8ce3\": container with ID starting with 034d681500ce7f0991b196dedef58677579f082e0ae5cc7808a595fe6f1e8ce3 not found: ID does not exist" containerID="034d681500ce7f0991b196dedef58677579f082e0ae5cc7808a595fe6f1e8ce3" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.636618 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"034d681500ce7f0991b196dedef58677579f082e0ae5cc7808a595fe6f1e8ce3"} err="failed to get container status \"034d681500ce7f0991b196dedef58677579f082e0ae5cc7808a595fe6f1e8ce3\": rpc error: code = NotFound desc = 
could not find container \"034d681500ce7f0991b196dedef58677579f082e0ae5cc7808a595fe6f1e8ce3\": container with ID starting with 034d681500ce7f0991b196dedef58677579f082e0ae5cc7808a595fe6f1e8ce3 not found: ID does not exist" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.636668 4723 scope.go:117] "RemoveContainer" containerID="c6d3b9357a786b3b7c5160326d951ac8e90268cc334d013226114a206f3054be" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.645274 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-767f59545d-j4rfp"] Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.674277 4723 scope.go:117] "RemoveContainer" containerID="e332c9b63bffa997ef81f899bc149383525f526826ff9ec42cb036d9eaa39867" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.903519 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" path="/var/lib/kubelet/pods/afb8e2d8-5db3-4447-9c6e-4b8b6249b44d/volumes" Mar 09 13:21:32 crc kubenswrapper[4723]: I0309 13:21:32.904442 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9e10194-4646-4afe-8353-146441b874ad" path="/var/lib/kubelet/pods/d9e10194-4646-4afe-8353-146441b874ad/volumes" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.076141 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-lf8bq" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.209171 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc6n2\" (UniqueName: \"kubernetes.io/projected/90dea403-5a65-4824-ac5b-5c34c828d616-kube-api-access-pc6n2\") pod \"90dea403-5a65-4824-ac5b-5c34c828d616\" (UID: \"90dea403-5a65-4824-ac5b-5c34c828d616\") " Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.209720 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dea403-5a65-4824-ac5b-5c34c828d616-config-data\") pod \"90dea403-5a65-4824-ac5b-5c34c828d616\" (UID: \"90dea403-5a65-4824-ac5b-5c34c828d616\") " Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.209791 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dea403-5a65-4824-ac5b-5c34c828d616-combined-ca-bundle\") pod \"90dea403-5a65-4824-ac5b-5c34c828d616\" (UID: \"90dea403-5a65-4824-ac5b-5c34c828d616\") " Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.216116 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90dea403-5a65-4824-ac5b-5c34c828d616-kube-api-access-pc6n2" (OuterVolumeSpecName: "kube-api-access-pc6n2") pod "90dea403-5a65-4824-ac5b-5c34c828d616" (UID: "90dea403-5a65-4824-ac5b-5c34c828d616"). InnerVolumeSpecName "kube-api-access-pc6n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.261198 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dea403-5a65-4824-ac5b-5c34c828d616-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90dea403-5a65-4824-ac5b-5c34c828d616" (UID: "90dea403-5a65-4824-ac5b-5c34c828d616"). InnerVolumeSpecName "combined-ca-bundle". 
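
The two log.go:32 errors above are benign: by the time the kubelet retries RemoveContainer for 12250f2031... and 034d681500..., CRI-O has already deleted them along with the pod sandbox, so ContainerStatus answers rpc error: code = NotFound and the kubelet moves on. A hedged sketch of that idempotent-delete pattern using the google.golang.org/grpc status package (the remove callback stands in for the CRI RemoveContainer RPC and is an assumption of this sketch; the snippet needs the grpc module in go.mod):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeIfPresent sketches the pattern visible above: NotFound from the
// runtime is treated as success, because the container was already removed
// together with its sandbox. The remove callback is illustrative only.
func removeIfPresent(id string, remove func(string) error) error {
	if err := remove(id); err != nil && status.Code(err) != codes.NotFound {
		return fmt.Errorf("removing container %s: %w", id, err)
	}
	return nil
}

func main() {
	// Simulate the runtime answer recorded in the log: the ID no longer exists.
	gone := func(id string) error {
		return status.Error(codes.NotFound, "could not find container "+id)
	}
	if err := removeIfPresent("12250f20318d", gone); err != nil {
		panic(err)
	}
	fmt.Println("NotFound swallowed: delete is idempotent")
}

Treating NotFound as success is what keeps the repeated SyncLoop DELETE/REMOVE passes above from failing on containers that are already gone.
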
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.282998 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.312404 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dea403-5a65-4824-ac5b-5c34c828d616-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.312443 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc6n2\" (UniqueName: \"kubernetes.io/projected/90dea403-5a65-4824-ac5b-5c34c828d616-kube-api-access-pc6n2\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.367375 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-8z98d"] Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.367657 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" podUID="470074a0-571f-4f50-a275-bb91881d9d85" containerName="dnsmasq-dns" containerID="cri-o://5bca35e86567f4384537db83fcdb2e32879be70033e484b4df29112dbe2ef9e6" gracePeriod=10 Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.372171 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dea403-5a65-4824-ac5b-5c34c828d616-config-data" (OuterVolumeSpecName: "config-data") pod "90dea403-5a65-4824-ac5b-5c34c828d616" (UID: "90dea403-5a65-4824-ac5b-5c34c828d616"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.421156 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dea403-5a65-4824-ac5b-5c34c828d616-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.553936 4723 generic.go:334] "Generic (PLEG): container finished" podID="191baa15-4ac5-4e55-9f87-751eddffb83e" containerID="3089133ad3ac67c8f2b108d7f5e469a3c6333fb68492b8c0cee405c625b8fe8b" exitCode=0 Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.554006 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bm79v" event={"ID":"191baa15-4ac5-4e55-9f87-751eddffb83e","Type":"ContainerDied","Data":"3089133ad3ac67c8f2b108d7f5e469a3c6333fb68492b8c0cee405c625b8fe8b"} Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.561426 4723 generic.go:334] "Generic (PLEG): container finished" podID="470074a0-571f-4f50-a275-bb91881d9d85" containerID="5bca35e86567f4384537db83fcdb2e32879be70033e484b4df29112dbe2ef9e6" exitCode=0 Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.561489 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" event={"ID":"470074a0-571f-4f50-a275-bb91881d9d85","Type":"ContainerDied","Data":"5bca35e86567f4384537db83fcdb2e32879be70033e484b4df29112dbe2ef9e6"} Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.565598 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lf8bq" event={"ID":"90dea403-5a65-4824-ac5b-5c34c828d616","Type":"ContainerDied","Data":"daa7b630797809acdb3025cdda62219a718129183378179cea49defd1f62668b"} Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.565643 4723 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="daa7b630797809acdb3025cdda62219a718129183378179cea49defd1f62668b" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.565694 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-lf8bq" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.591612 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:33 crc kubenswrapper[4723]: E0309 13:21:33.604886 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod470074a0_571f_4f50_a275_bb91881d9d85.slice/crio-5bca35e86567f4384537db83fcdb2e32879be70033e484b4df29112dbe2ef9e6.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.764287 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.946958 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.947024 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.949067 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.950436 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37a6af4ad4a336694755a90bae29c7dad0bac535fc07da2bdf95f50123da1b17"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.950508 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://37a6af4ad4a336694755a90bae29c7dad0bac535fc07da2bdf95f50123da1b17" gracePeriod=600 Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.977090 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.979592 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.979703 4723 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 13:21:33 crc kubenswrapper[4723]: I0309 13:21:33.981181 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 13:21:33 crc 
kubenswrapper[4723]: I0309 13:21:33.989282 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 13:21:34 crc kubenswrapper[4723]: I0309 13:21:34.578901 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="37a6af4ad4a336694755a90bae29c7dad0bac535fc07da2bdf95f50123da1b17" exitCode=0 Mar 09 13:21:34 crc kubenswrapper[4723]: I0309 13:21:34.578988 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"37a6af4ad4a336694755a90bae29c7dad0bac535fc07da2bdf95f50123da1b17"} Mar 09 13:21:34 crc kubenswrapper[4723]: I0309 13:21:34.579352 4723 scope.go:117] "RemoveContainer" containerID="a07909afd95b9a1ee1329ff07b5736e303acd66573a389c66b14a13e53a70f9f" Mar 09 13:21:35 crc kubenswrapper[4723]: I0309 13:21:35.158572 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f68f59b98-9mh25" podUID="a4575d39-0084-4f11-980c-6187b318d7fa" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.203:9311/healthcheck\": read tcp 10.217.0.2:43810->10.217.0.203:9311: read: connection reset by peer" Mar 09 13:21:35 crc kubenswrapper[4723]: I0309 13:21:35.158609 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f68f59b98-9mh25" podUID="a4575d39-0084-4f11-980c-6187b318d7fa" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.203:9311/healthcheck\": read tcp 10.217.0.2:43820->10.217.0.203:9311: read: connection reset by peer" Mar 09 13:21:35 crc kubenswrapper[4723]: I0309 13:21:35.173633 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" podUID="470074a0-571f-4f50-a275-bb91881d9d85" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.190:5353: connect: connection refused" Mar 09 13:21:35 crc kubenswrapper[4723]: I0309 13:21:35.597274 4723 generic.go:334] "Generic (PLEG): container finished" podID="a4575d39-0084-4f11-980c-6187b318d7fa" containerID="7abce7725a88fd6062e279f214536937e4c6c3e1cffc36f4289af27afc6c2cc2" exitCode=0 Mar 09 13:21:35 crc kubenswrapper[4723]: I0309 13:21:35.597317 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f68f59b98-9mh25" event={"ID":"a4575d39-0084-4f11-980c-6187b318d7fa","Type":"ContainerDied","Data":"7abce7725a88fd6062e279f214536937e4c6c3e1cffc36f4289af27afc6c2cc2"} Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.559674 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-bm79v" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.599399 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f68f59b98-9mh25" podUID="a4575d39-0084-4f11-980c-6187b318d7fa" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.203:9311/healthcheck\": dial tcp 10.217.0.203:9311: connect: connection refused" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.599557 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f68f59b98-9mh25" podUID="a4575d39-0084-4f11-980c-6187b318d7fa" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.203:9311/healthcheck\": dial tcp 10.217.0.203:9311: connect: connection refused" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.650127 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bm79v" event={"ID":"191baa15-4ac5-4e55-9f87-751eddffb83e","Type":"ContainerDied","Data":"0bf1331a08a57c068cf7d37f0992feb47295e4c479f2b5345e8a9af85efd2918"} Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.650177 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bf1331a08a57c068cf7d37f0992feb47295e4c479f2b5345e8a9af85efd2918" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.650182 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bm79v" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.685176 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-combined-ca-bundle\") pod \"191baa15-4ac5-4e55-9f87-751eddffb83e\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.685908 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-config-data\") pod \"191baa15-4ac5-4e55-9f87-751eddffb83e\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.686009 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwtpj\" (UniqueName: \"kubernetes.io/projected/191baa15-4ac5-4e55-9f87-751eddffb83e-kube-api-access-vwtpj\") pod \"191baa15-4ac5-4e55-9f87-751eddffb83e\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.686050 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-db-sync-config-data\") pod \"191baa15-4ac5-4e55-9f87-751eddffb83e\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.686128 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-scripts\") pod \"191baa15-4ac5-4e55-9f87-751eddffb83e\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.686399 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/191baa15-4ac5-4e55-9f87-751eddffb83e-etc-machine-id\") pod 
\"191baa15-4ac5-4e55-9f87-751eddffb83e\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.687066 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/191baa15-4ac5-4e55-9f87-751eddffb83e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "191baa15-4ac5-4e55-9f87-751eddffb83e" (UID: "191baa15-4ac5-4e55-9f87-751eddffb83e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.693214 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-scripts" (OuterVolumeSpecName: "scripts") pod "191baa15-4ac5-4e55-9f87-751eddffb83e" (UID: "191baa15-4ac5-4e55-9f87-751eddffb83e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.694962 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "191baa15-4ac5-4e55-9f87-751eddffb83e" (UID: "191baa15-4ac5-4e55-9f87-751eddffb83e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.715069 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/191baa15-4ac5-4e55-9f87-751eddffb83e-kube-api-access-vwtpj" (OuterVolumeSpecName: "kube-api-access-vwtpj") pod "191baa15-4ac5-4e55-9f87-751eddffb83e" (UID: "191baa15-4ac5-4e55-9f87-751eddffb83e"). InnerVolumeSpecName "kube-api-access-vwtpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.787739 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "191baa15-4ac5-4e55-9f87-751eddffb83e" (UID: "191baa15-4ac5-4e55-9f87-751eddffb83e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.788466 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-combined-ca-bundle\") pod \"191baa15-4ac5-4e55-9f87-751eddffb83e\" (UID: \"191baa15-4ac5-4e55-9f87-751eddffb83e\") " Mar 09 13:21:38 crc kubenswrapper[4723]: W0309 13:21:38.788621 4723 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/191baa15-4ac5-4e55-9f87-751eddffb83e/volumes/kubernetes.io~secret/combined-ca-bundle Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.788648 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "191baa15-4ac5-4e55-9f87-751eddffb83e" (UID: "191baa15-4ac5-4e55-9f87-751eddffb83e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.788972 4723 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/191baa15-4ac5-4e55-9f87-751eddffb83e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.788985 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.789019 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwtpj\" (UniqueName: \"kubernetes.io/projected/191baa15-4ac5-4e55-9f87-751eddffb83e-kube-api-access-vwtpj\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.789032 4723 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.789043 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.846570 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-config-data" (OuterVolumeSpecName: "config-data") pod "191baa15-4ac5-4e55-9f87-751eddffb83e" (UID: "191baa15-4ac5-4e55-9f87-751eddffb83e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:38 crc kubenswrapper[4723]: I0309 13:21:38.891944 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/191baa15-4ac5-4e55-9f87-751eddffb83e-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.555744 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f68f59b98-9mh25" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.709457 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-config-data-custom\") pod \"a4575d39-0084-4f11-980c-6187b318d7fa\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.709555 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-config-data\") pod \"a4575d39-0084-4f11-980c-6187b318d7fa\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.709759 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-combined-ca-bundle\") pod \"a4575d39-0084-4f11-980c-6187b318d7fa\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.709882 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4575d39-0084-4f11-980c-6187b318d7fa-logs\") pod \"a4575d39-0084-4f11-980c-6187b318d7fa\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.709918 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4fh9\" (UniqueName: \"kubernetes.io/projected/a4575d39-0084-4f11-980c-6187b318d7fa-kube-api-access-x4fh9\") pod \"a4575d39-0084-4f11-980c-6187b318d7fa\" (UID: \"a4575d39-0084-4f11-980c-6187b318d7fa\") " Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.714589 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4575d39-0084-4f11-980c-6187b318d7fa-kube-api-access-x4fh9" (OuterVolumeSpecName: "kube-api-access-x4fh9") pod "a4575d39-0084-4f11-980c-6187b318d7fa" (UID: "a4575d39-0084-4f11-980c-6187b318d7fa"). InnerVolumeSpecName "kube-api-access-x4fh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.718031 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4575d39-0084-4f11-980c-6187b318d7fa-logs" (OuterVolumeSpecName: "logs") pod "a4575d39-0084-4f11-980c-6187b318d7fa" (UID: "a4575d39-0084-4f11-980c-6187b318d7fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.720079 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a4575d39-0084-4f11-980c-6187b318d7fa" (UID: "a4575d39-0084-4f11-980c-6187b318d7fa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.733158 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.734463 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f68f59b98-9mh25" event={"ID":"a4575d39-0084-4f11-980c-6187b318d7fa","Type":"ContainerDied","Data":"9055dfd425f15ff36dfa0eb3bf2c166fe534e7cf4a180b468bf90d94743397a0"} Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.734502 4723 scope.go:117] "RemoveContainer" containerID="7abce7725a88fd6062e279f214536937e4c6c3e1cffc36f4289af27afc6c2cc2" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.734606 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f68f59b98-9mh25" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.739141 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" event={"ID":"470074a0-571f-4f50-a275-bb91881d9d85","Type":"ContainerDied","Data":"07d007b0ac62365cb9740cc4af23198926c9d5dc1a76112d031e9f8ae04ec1a0"} Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.739259 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-8z98d" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.788236 4723 scope.go:117] "RemoveContainer" containerID="b62efde1eb2c2d3fcd454c627418a6858b64d17b251d48f2fc5f97cc865d52a4" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.812704 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-config\") pod \"470074a0-571f-4f50-a275-bb91881d9d85\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.812967 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-dns-swift-storage-0\") pod \"470074a0-571f-4f50-a275-bb91881d9d85\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.813009 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xr48\" (UniqueName: \"kubernetes.io/projected/470074a0-571f-4f50-a275-bb91881d9d85-kube-api-access-9xr48\") pod \"470074a0-571f-4f50-a275-bb91881d9d85\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.813108 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-ovsdbserver-nb\") pod \"470074a0-571f-4f50-a275-bb91881d9d85\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.813169 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-dns-svc\") pod \"470074a0-571f-4f50-a275-bb91881d9d85\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.813185 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-ovsdbserver-sb\") pod \"470074a0-571f-4f50-a275-bb91881d9d85\" (UID: \"470074a0-571f-4f50-a275-bb91881d9d85\") " 
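
A few entries back, the same sync loop handled the two probe types differently: the failed Liveness probe on openshift-machine-config-operator/machine-config-daemon-cfjq2 led straight to "Killing container with a grace period" (gracePeriod=600) and a restart, while the failed Readiness probes on openstack/barbican-api-6f68f59b98-9mh25 only took the pod out of service endpoints. A toy decision table for the distinction — failureThreshold=3 is the Kubernetes default and an assumption here, since the actual pod specs are not in the log:

package main

import "fmt"

// probeAction mirrors the two outcomes in the entries above: a liveness
// failure past the threshold restarts the container; a readiness failure
// only removes the pod from service endpoints.
func probeAction(probe string, consecutiveFailures, failureThreshold int) string {
	if consecutiveFailures < failureThreshold {
		return "below threshold: no action yet"
	}
	switch probe {
	case "liveness":
		return "kill container with its grace period, then restart it"
	case "readiness":
		return "mark container NotReady and drop pod from endpoints; no restart"
	default:
		return "unknown probe type"
	}
}

func main() {
	fmt.Println("liveness: ", probeAction("liveness", 3, 3))
	fmt.Println("readiness:", probeAction("readiness", 3, 3))
}
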
Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.814177 4723 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.814195 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4575d39-0084-4f11-980c-6187b318d7fa-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.814208 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4fh9\" (UniqueName: \"kubernetes.io/projected/a4575d39-0084-4f11-980c-6187b318d7fa-kube-api-access-x4fh9\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.830387 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/470074a0-571f-4f50-a275-bb91881d9d85-kube-api-access-9xr48" (OuterVolumeSpecName: "kube-api-access-9xr48") pod "470074a0-571f-4f50-a275-bb91881d9d85" (UID: "470074a0-571f-4f50-a275-bb91881d9d85"). InnerVolumeSpecName "kube-api-access-9xr48". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.834443 4723 scope.go:117] "RemoveContainer" containerID="5bca35e86567f4384537db83fcdb2e32879be70033e484b4df29112dbe2ef9e6" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.915978 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xr48\" (UniqueName: \"kubernetes.io/projected/470074a0-571f-4f50-a275-bb91881d9d85-kube-api-access-9xr48\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.931035 4723 scope.go:117] "RemoveContainer" containerID="ceed71e93a0fa8a1083aafc88cae1c30292032a75ab2ba8cefc824b9fd8a852b" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.949253 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:21:39 crc kubenswrapper[4723]: E0309 13:21:39.949723 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4575d39-0084-4f11-980c-6187b318d7fa" containerName="barbican-api-log" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.949747 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4575d39-0084-4f11-980c-6187b318d7fa" containerName="barbican-api-log" Mar 09 13:21:39 crc kubenswrapper[4723]: E0309 13:21:39.949769 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" containerName="barbican-worker-log" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.949777 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" containerName="barbican-worker-log" Mar 09 13:21:39 crc kubenswrapper[4723]: E0309 13:21:39.949792 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e10194-4646-4afe-8353-146441b874ad" containerName="barbican-keystone-listener-log" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.949800 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e10194-4646-4afe-8353-146441b874ad" containerName="barbican-keystone-listener-log" Mar 09 13:21:39 crc kubenswrapper[4723]: E0309 13:21:39.949809 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="191baa15-4ac5-4e55-9f87-751eddffb83e" containerName="cinder-db-sync" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.949817 
4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="191baa15-4ac5-4e55-9f87-751eddffb83e" containerName="cinder-db-sync" Mar 09 13:21:39 crc kubenswrapper[4723]: E0309 13:21:39.949846 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470074a0-571f-4f50-a275-bb91881d9d85" containerName="init" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.949898 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="470074a0-571f-4f50-a275-bb91881d9d85" containerName="init" Mar 09 13:21:39 crc kubenswrapper[4723]: E0309 13:21:39.949927 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e10194-4646-4afe-8353-146441b874ad" containerName="barbican-keystone-listener" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.949935 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e10194-4646-4afe-8353-146441b874ad" containerName="barbican-keystone-listener" Mar 09 13:21:39 crc kubenswrapper[4723]: E0309 13:21:39.949954 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4575d39-0084-4f11-980c-6187b318d7fa" containerName="barbican-api" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.949960 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4575d39-0084-4f11-980c-6187b318d7fa" containerName="barbican-api" Mar 09 13:21:39 crc kubenswrapper[4723]: E0309 13:21:39.949968 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470074a0-571f-4f50-a275-bb91881d9d85" containerName="dnsmasq-dns" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.949974 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="470074a0-571f-4f50-a275-bb91881d9d85" containerName="dnsmasq-dns" Mar 09 13:21:39 crc kubenswrapper[4723]: E0309 13:21:39.949987 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" containerName="barbican-worker" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.949993 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" containerName="barbican-worker" Mar 09 13:21:39 crc kubenswrapper[4723]: E0309 13:21:39.950002 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90dea403-5a65-4824-ac5b-5c34c828d616" containerName="heat-db-sync" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.950008 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="90dea403-5a65-4824-ac5b-5c34c828d616" containerName="heat-db-sync" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.950208 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="90dea403-5a65-4824-ac5b-5c34c828d616" containerName="heat-db-sync" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.950223 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="470074a0-571f-4f50-a275-bb91881d9d85" containerName="dnsmasq-dns" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.950235 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e10194-4646-4afe-8353-146441b874ad" containerName="barbican-keystone-listener-log" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.950243 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" containerName="barbican-worker-log" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.950255 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb8e2d8-5db3-4447-9c6e-4b8b6249b44d" containerName="barbican-worker" Mar 09 13:21:39 crc 
kubenswrapper[4723]: I0309 13:21:39.950271 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e10194-4646-4afe-8353-146441b874ad" containerName="barbican-keystone-listener" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.950283 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4575d39-0084-4f11-980c-6187b318d7fa" containerName="barbican-api" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.950294 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="191baa15-4ac5-4e55-9f87-751eddffb83e" containerName="cinder-db-sync" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.950302 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4575d39-0084-4f11-980c-6187b318d7fa" containerName="barbican-api-log" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.951669 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.955189 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.956719 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-cd95p" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.956919 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.957068 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.963539 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4575d39-0084-4f11-980c-6187b318d7fa" (UID: "a4575d39-0084-4f11-980c-6187b318d7fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.977442 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tfrlw"] Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.979386 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:39 crc kubenswrapper[4723]: I0309 13:21:39.995924 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tfrlw"] Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.017897 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.018312 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-scripts\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.018525 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.018629 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4czr8\" (UniqueName: \"kubernetes.io/projected/231ff024-2c9c-479e-b734-4cfd1e69ac91-kube-api-access-4czr8\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.018736 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/231ff024-2c9c-479e-b734-4cfd1e69ac91-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.018885 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-config-data\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.019042 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.023257 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.122461 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-config-data\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.122542 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.122585 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.122622 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.122665 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-scripts\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.122687 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-dns-svc\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.122710 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.122736 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4czr8\" (UniqueName: \"kubernetes.io/projected/231ff024-2c9c-479e-b734-4cfd1e69ac91-kube-api-access-4czr8\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.122768 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.122813 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/231ff024-2c9c-479e-b734-4cfd1e69ac91-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.122850 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-config\") pod 
\"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.122895 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlpmw\" (UniqueName: \"kubernetes.io/projected/4b35cba0-637b-481c-a44f-854ba4c3f86e-kube-api-access-jlpmw\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.124025 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/231ff024-2c9c-479e-b734-4cfd1e69ac91-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.156636 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.157794 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.157907 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-scripts\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.170147 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-config-data\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.222714 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4czr8\" (UniqueName: \"kubernetes.io/projected/231ff024-2c9c-479e-b734-4cfd1e69ac91-kube-api-access-4czr8\") pod \"cinder-scheduler-0\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.232434 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-config\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.234182 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlpmw\" (UniqueName: \"kubernetes.io/projected/4b35cba0-637b-481c-a44f-854ba4c3f86e-kube-api-access-jlpmw\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.234544 
4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.235015 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.235284 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-dns-svc\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.236435 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.235834 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.237928 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.235306 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-config\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.239485 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-dns-svc\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.240029 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.252620 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:21:40 crc 
kubenswrapper[4723]: I0309 13:21:40.258365 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.262415 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.283581 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.285202 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlpmw\" (UniqueName: \"kubernetes.io/projected/4b35cba0-637b-481c-a44f-854ba4c3f86e-kube-api-access-jlpmw\") pod \"dnsmasq-dns-5784cf869f-tfrlw\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.310023 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.328469 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.441101 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-scripts\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.441551 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-config-data-custom\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.441697 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-config-data\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.441834 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b97g4\" (UniqueName: \"kubernetes.io/projected/85e029d9-b399-4b73-a5b7-458ce3a459d6-kube-api-access-b97g4\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.441986 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.442160 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85e029d9-b399-4b73-a5b7-458ce3a459d6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.442255 4723 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e029d9-b399-4b73-a5b7-458ce3a459d6-logs\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.547823 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-config-data-custom\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.548160 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-config-data\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.548196 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b97g4\" (UniqueName: \"kubernetes.io/projected/85e029d9-b399-4b73-a5b7-458ce3a459d6-kube-api-access-b97g4\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.548220 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.548265 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85e029d9-b399-4b73-a5b7-458ce3a459d6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.548279 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e029d9-b399-4b73-a5b7-458ce3a459d6-logs\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.548378 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-scripts\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.555698 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e029d9-b399-4b73-a5b7-458ce3a459d6-logs\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.555784 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85e029d9-b399-4b73-a5b7-458ce3a459d6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.805200 4723 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"6ac6d2c984403d03e4d4370dd6ca12328beaf68b063a60d758d836e9ab8d0176"} Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.907254 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-config-data\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.908175 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.908383 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-config-data-custom\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.909699 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-scripts\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.917523 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b97g4\" (UniqueName: \"kubernetes.io/projected/85e029d9-b399-4b73-a5b7-458ce3a459d6-kube-api-access-b97g4\") pod \"cinder-api-0\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " pod="openstack/cinder-api-0" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.919392 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "470074a0-571f-4f50-a275-bb91881d9d85" (UID: "470074a0-571f-4f50-a275-bb91881d9d85"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.982565 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "470074a0-571f-4f50-a275-bb91881d9d85" (UID: "470074a0-571f-4f50-a275-bb91881d9d85"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:40 crc kubenswrapper[4723]: I0309 13:21:40.982639 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-config-data" (OuterVolumeSpecName: "config-data") pod "a4575d39-0084-4f11-980c-6187b318d7fa" (UID: "a4575d39-0084-4f11-980c-6187b318d7fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:40.995841 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4575d39-0084-4f11-980c-6187b318d7fa-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.007835 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.008012 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.015332 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "470074a0-571f-4f50-a275-bb91881d9d85" (UID: "470074a0-571f-4f50-a275-bb91881d9d85"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.016119 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "470074a0-571f-4f50-a275-bb91881d9d85" (UID: "470074a0-571f-4f50-a275-bb91881d9d85"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.077652 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-config" (OuterVolumeSpecName: "config") pod "470074a0-571f-4f50-a275-bb91881d9d85" (UID: "470074a0-571f-4f50-a275-bb91881d9d85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.110822 4723 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.110881 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.110895 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/470074a0-571f-4f50-a275-bb91881d9d85-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.144554 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.186813 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.192057 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tfrlw"] Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.307695 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f68f59b98-9mh25"] Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.328997 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6f68f59b98-9mh25"] Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.348529 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-8z98d"] Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.357136 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-8z98d"] Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.502815 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.751070 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.830586 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.909382 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-646c887bd9-qzqxk"] Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.909674 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-646c887bd9-qzqxk" podUID="28297498-43e9-457c-a90d-0c3f49907491" containerName="neutron-api" containerID="cri-o://7a4d205903b0350689d246a7dd833d69ec9079f318670bb9c54d41aef2c07dd5" gracePeriod=30 Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.910097 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-646c887bd9-qzqxk" podUID="28297498-43e9-457c-a90d-0c3f49907491" containerName="neutron-httpd" containerID="cri-o://fc77008e1f53a9fd9fd0fee21c759ad16773862210f0ceaae4c1cad04c78e7c6" gracePeriod=30 Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.942936 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6556fbf64c-254qf"] Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.944896 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.948370 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-646c887bd9-qzqxk" podUID="28297498-43e9-457c-a90d-0c3f49907491" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.198:9696/\": EOF" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.951790 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85e029d9-b399-4b73-a5b7-458ce3a459d6","Type":"ContainerStarted","Data":"6676d89b1e7ccb376014f57fbe1cd0225498bc73b2496d8a32b78112c2997a81"} Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.966966 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e0bf458-3488-4d3a-80ac-d9cf2f655791","Type":"ContainerStarted","Data":"ab6e3117ea831cb4c6d8708856c263cc0cecf1d1ab632551d91a9070e61dc344"} Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.967421 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="ceilometer-central-agent" containerID="cri-o://38abaf8dc7e932d42d539dc5445545a95ed746821765a01b2eb815a51321876c" gracePeriod=30 Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.967759 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.967843 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="sg-core" containerID="cri-o://2161603b5816667f7710ccc90429c6d2fb2b8f1d9495eb572edffaa2c1f175a3" gracePeriod=30 Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.967888 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="ceilometer-notification-agent" containerID="cri-o://08856bed4f33e117dd10c871cba09579935b75e1730d76f9185afe1104c1a29f" gracePeriod=30 Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.967891 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="proxy-httpd" containerID="cri-o://ab6e3117ea831cb4c6d8708856c263cc0cecf1d1ab632551d91a9070e61dc344" gracePeriod=30 Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.978674 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"231ff024-2c9c-479e-b734-4cfd1e69ac91","Type":"ContainerStarted","Data":"24d40f1032fb30023dc6a40a58db4cbcaa95ba891829c2ea5bf8d3e89fe662ed"} Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.985371 4723 generic.go:334] "Generic (PLEG): container finished" podID="4b35cba0-637b-481c-a44f-854ba4c3f86e" containerID="3becec47c87fe6f0880331e7522b6d5a33482764cfb17be205a1568f7e3eab21" exitCode=0 Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.985680 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" event={"ID":"4b35cba0-637b-481c-a44f-854ba4c3f86e","Type":"ContainerDied","Data":"3becec47c87fe6f0880331e7522b6d5a33482764cfb17be205a1568f7e3eab21"} Mar 09 13:21:41 crc kubenswrapper[4723]: I0309 13:21:41.985931 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" event={"ID":"4b35cba0-637b-481c-a44f-854ba4c3f86e","Type":"ContainerStarted","Data":"07f20a88afb2cc0d902f2d35535087a280d283443326b12f79be1f06cc1f5354"} Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.019410 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6556fbf64c-254qf"] Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.041992 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-config\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.042050 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2pqv\" (UniqueName: \"kubernetes.io/projected/57d35c3a-e724-4070-b5b4-e6a979b0c09a-kube-api-access-d2pqv\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.042311 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-combined-ca-bundle\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.042398 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-public-tls-certs\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.042458 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-httpd-config\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.042493 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-ovndb-tls-certs\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.042527 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-internal-tls-certs\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.074590 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.368994636 podStartE2EDuration="1m2.074564933s" podCreationTimestamp="2026-03-09 13:20:40 +0000 UTC" firstStartedPulling="2026-03-09 13:20:41.83416688 +0000 UTC m=+1315.848634430" lastFinishedPulling="2026-03-09 
13:21:39.539737187 +0000 UTC m=+1373.554204727" observedRunningTime="2026-03-09 13:21:42.006073951 +0000 UTC m=+1376.020541491" watchObservedRunningTime="2026-03-09 13:21:42.074564933 +0000 UTC m=+1376.089032473" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.130957 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-695767fcfd-4rv9j" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.145138 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-internal-tls-certs\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.145241 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-config\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.145272 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2pqv\" (UniqueName: \"kubernetes.io/projected/57d35c3a-e724-4070-b5b4-e6a979b0c09a-kube-api-access-d2pqv\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.145424 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-combined-ca-bundle\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.145475 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-public-tls-certs\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.145517 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-httpd-config\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.145549 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-ovndb-tls-certs\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.153562 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-config\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.155268 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-internal-tls-certs\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.157362 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-ovndb-tls-certs\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.165059 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-public-tls-certs\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.169640 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-combined-ca-bundle\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.173522 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/57d35c3a-e724-4070-b5b4-e6a979b0c09a-httpd-config\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.178374 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2pqv\" (UniqueName: \"kubernetes.io/projected/57d35c3a-e724-4070-b5b4-e6a979b0c09a-kube-api-access-d2pqv\") pod \"neutron-6556fbf64c-254qf\" (UID: \"57d35c3a-e724-4070-b5b4-e6a979b0c09a\") " pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.283563 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d7b4c9946-ljf67"] Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.284185 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d7b4c9946-ljf67" podUID="591e7541-2095-490f-9787-d4551a2e4f9d" containerName="barbican-api-log" containerID="cri-o://92dcefaf02293c319df2e4e7a6b68680e3d30054f568a1aaa658e5dedeec64d2" gracePeriod=30 Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.300935 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d7b4c9946-ljf67" podUID="591e7541-2095-490f-9787-d4551a2e4f9d" containerName="barbican-api" containerID="cri-o://0d5f2915d0560d32c496d76824067567dd9de60e05e9b2294a4bff612dd09489" gracePeriod=30 Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.311299 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.919822 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="470074a0-571f-4f50-a275-bb91881d9d85" path="/var/lib/kubelet/pods/470074a0-571f-4f50-a275-bb91881d9d85/volumes" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.924659 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4575d39-0084-4f11-980c-6187b318d7fa" path="/var/lib/kubelet/pods/a4575d39-0084-4f11-980c-6187b318d7fa/volumes" Mar 09 13:21:42 crc kubenswrapper[4723]: I0309 13:21:42.943179 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:21:43 crc kubenswrapper[4723]: I0309 13:21:43.058113 4723 generic.go:334] "Generic (PLEG): container finished" podID="28297498-43e9-457c-a90d-0c3f49907491" containerID="fc77008e1f53a9fd9fd0fee21c759ad16773862210f0ceaae4c1cad04c78e7c6" exitCode=0 Mar 09 13:21:43 crc kubenswrapper[4723]: I0309 13:21:43.058208 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-646c887bd9-qzqxk" event={"ID":"28297498-43e9-457c-a90d-0c3f49907491","Type":"ContainerDied","Data":"fc77008e1f53a9fd9fd0fee21c759ad16773862210f0ceaae4c1cad04c78e7c6"} Mar 09 13:21:43 crc kubenswrapper[4723]: I0309 13:21:43.143299 4723 generic.go:334] "Generic (PLEG): container finished" podID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerID="ab6e3117ea831cb4c6d8708856c263cc0cecf1d1ab632551d91a9070e61dc344" exitCode=0 Mar 09 13:21:43 crc kubenswrapper[4723]: I0309 13:21:43.143331 4723 generic.go:334] "Generic (PLEG): container finished" podID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerID="2161603b5816667f7710ccc90429c6d2fb2b8f1d9495eb572edffaa2c1f175a3" exitCode=2 Mar 09 13:21:43 crc kubenswrapper[4723]: I0309 13:21:43.143341 4723 generic.go:334] "Generic (PLEG): container finished" podID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerID="38abaf8dc7e932d42d539dc5445545a95ed746821765a01b2eb815a51321876c" exitCode=0 Mar 09 13:21:43 crc kubenswrapper[4723]: I0309 13:21:43.143423 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e0bf458-3488-4d3a-80ac-d9cf2f655791","Type":"ContainerDied","Data":"ab6e3117ea831cb4c6d8708856c263cc0cecf1d1ab632551d91a9070e61dc344"} Mar 09 13:21:43 crc kubenswrapper[4723]: I0309 13:21:43.143448 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e0bf458-3488-4d3a-80ac-d9cf2f655791","Type":"ContainerDied","Data":"2161603b5816667f7710ccc90429c6d2fb2b8f1d9495eb572edffaa2c1f175a3"} Mar 09 13:21:43 crc kubenswrapper[4723]: I0309 13:21:43.143457 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e0bf458-3488-4d3a-80ac-d9cf2f655791","Type":"ContainerDied","Data":"38abaf8dc7e932d42d539dc5445545a95ed746821765a01b2eb815a51321876c"} Mar 09 13:21:43 crc kubenswrapper[4723]: I0309 13:21:43.158982 4723 generic.go:334] "Generic (PLEG): container finished" podID="591e7541-2095-490f-9787-d4551a2e4f9d" containerID="92dcefaf02293c319df2e4e7a6b68680e3d30054f568a1aaa658e5dedeec64d2" exitCode=143 Mar 09 13:21:43 crc kubenswrapper[4723]: I0309 13:21:43.159028 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d7b4c9946-ljf67" event={"ID":"591e7541-2095-490f-9787-d4551a2e4f9d","Type":"ContainerDied","Data":"92dcefaf02293c319df2e4e7a6b68680e3d30054f568a1aaa658e5dedeec64d2"} Mar 09 13:21:43 crc kubenswrapper[4723]: I0309 
13:21:43.236394 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6556fbf64c-254qf"] Mar 09 13:21:43 crc kubenswrapper[4723]: I0309 13:21:43.915879 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-646c887bd9-qzqxk" podUID="28297498-43e9-457c-a90d-0c3f49907491" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.198:9696/\": dial tcp 10.217.0.198:9696: connect: connection refused" Mar 09 13:21:44 crc kubenswrapper[4723]: I0309 13:21:44.192486 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6556fbf64c-254qf" event={"ID":"57d35c3a-e724-4070-b5b4-e6a979b0c09a","Type":"ContainerStarted","Data":"f6922ef58ccabf57951cbf533ea5f518acf5b71b41b14b83bb02676230eb8ce4"} Mar 09 13:21:44 crc kubenswrapper[4723]: I0309 13:21:44.192757 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6556fbf64c-254qf" event={"ID":"57d35c3a-e724-4070-b5b4-e6a979b0c09a","Type":"ContainerStarted","Data":"e67302ef1fe757d5f672b6787c37d5640d5f506eba2be129bd189f260a0eded7"} Mar 09 13:21:44 crc kubenswrapper[4723]: I0309 13:21:44.199946 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" event={"ID":"4b35cba0-637b-481c-a44f-854ba4c3f86e","Type":"ContainerStarted","Data":"9e3da25c95a32f77f29584cfcf43fa1688fa84629704fc08f032bef49a403eff"} Mar 09 13:21:44 crc kubenswrapper[4723]: I0309 13:21:44.200092 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:44 crc kubenswrapper[4723]: I0309 13:21:44.202329 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85e029d9-b399-4b73-a5b7-458ce3a459d6","Type":"ContainerStarted","Data":"f2ca1676e90d1c871dcdaff4231d3ec0d4c9c081386001474bb6c5a2c8ba0916"} Mar 09 13:21:44 crc kubenswrapper[4723]: I0309 13:21:44.209922 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"231ff024-2c9c-479e-b734-4cfd1e69ac91","Type":"ContainerStarted","Data":"726d8ac4d64caa2c94d96ce2787c4b32183803e5054b4daa79acc270575e859e"} Mar 09 13:21:44 crc kubenswrapper[4723]: I0309 13:21:44.230824 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" podStartSLOduration=5.230801179 podStartE2EDuration="5.230801179s" podCreationTimestamp="2026-03-09 13:21:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:44.221342309 +0000 UTC m=+1378.235809869" watchObservedRunningTime="2026-03-09 13:21:44.230801179 +0000 UTC m=+1378.245268719" Mar 09 13:21:45 crc kubenswrapper[4723]: I0309 13:21:45.222675 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85e029d9-b399-4b73-a5b7-458ce3a459d6","Type":"ContainerStarted","Data":"536b540a1c2fa2c8933a710b773d68d8f5d423f462382be9cc4aba588555274d"} Mar 09 13:21:45 crc kubenswrapper[4723]: I0309 13:21:45.222772 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="85e029d9-b399-4b73-a5b7-458ce3a459d6" containerName="cinder-api-log" containerID="cri-o://f2ca1676e90d1c871dcdaff4231d3ec0d4c9c081386001474bb6c5a2c8ba0916" gracePeriod=30 Mar 09 13:21:45 crc kubenswrapper[4723]: I0309 13:21:45.223223 4723 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-api-0" podUID="85e029d9-b399-4b73-a5b7-458ce3a459d6" containerName="cinder-api" containerID="cri-o://536b540a1c2fa2c8933a710b773d68d8f5d423f462382be9cc4aba588555274d" gracePeriod=30 Mar 09 13:21:45 crc kubenswrapper[4723]: I0309 13:21:45.225106 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 09 13:21:45 crc kubenswrapper[4723]: I0309 13:21:45.227091 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"231ff024-2c9c-479e-b734-4cfd1e69ac91","Type":"ContainerStarted","Data":"c08169c322d1c84997d310c42a57a10b911c1a97c2dd12f578eef4048001fbe9"} Mar 09 13:21:45 crc kubenswrapper[4723]: I0309 13:21:45.237958 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6556fbf64c-254qf" event={"ID":"57d35c3a-e724-4070-b5b4-e6a979b0c09a","Type":"ContainerStarted","Data":"b9d800489892a48428c3091bb89a19f2820e2c6a3febc36ec776bc36096a25b9"} Mar 09 13:21:45 crc kubenswrapper[4723]: I0309 13:21:45.238000 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6556fbf64c-254qf" Mar 09 13:21:45 crc kubenswrapper[4723]: I0309 13:21:45.256410 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.256396373 podStartE2EDuration="5.256396373s" podCreationTimestamp="2026-03-09 13:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:45.252415307 +0000 UTC m=+1379.266882847" watchObservedRunningTime="2026-03-09 13:21:45.256396373 +0000 UTC m=+1379.270863913" Mar 09 13:21:45 crc kubenswrapper[4723]: I0309 13:21:45.281219 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.132251221 podStartE2EDuration="6.281198049s" podCreationTimestamp="2026-03-09 13:21:39 +0000 UTC" firstStartedPulling="2026-03-09 13:21:41.164733774 +0000 UTC m=+1375.179201314" lastFinishedPulling="2026-03-09 13:21:42.313680602 +0000 UTC m=+1376.328148142" observedRunningTime="2026-03-09 13:21:45.27139425 +0000 UTC m=+1379.285861820" watchObservedRunningTime="2026-03-09 13:21:45.281198049 +0000 UTC m=+1379.295665589" Mar 09 13:21:45 crc kubenswrapper[4723]: I0309 13:21:45.298374 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6556fbf64c-254qf" podStartSLOduration=4.298350213 podStartE2EDuration="4.298350213s" podCreationTimestamp="2026-03-09 13:21:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:45.291592804 +0000 UTC m=+1379.306060344" watchObservedRunningTime="2026-03-09 13:21:45.298350213 +0000 UTC m=+1379.312817773" Mar 09 13:21:45 crc kubenswrapper[4723]: I0309 13:21:45.311010 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.261650 4723 generic.go:334] "Generic (PLEG): container finished" podID="85e029d9-b399-4b73-a5b7-458ce3a459d6" containerID="536b540a1c2fa2c8933a710b773d68d8f5d423f462382be9cc4aba588555274d" exitCode=0 Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.262128 4723 generic.go:334] "Generic (PLEG): container finished" podID="85e029d9-b399-4b73-a5b7-458ce3a459d6" 
containerID="f2ca1676e90d1c871dcdaff4231d3ec0d4c9c081386001474bb6c5a2c8ba0916" exitCode=143 Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.261751 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85e029d9-b399-4b73-a5b7-458ce3a459d6","Type":"ContainerDied","Data":"536b540a1c2fa2c8933a710b773d68d8f5d423f462382be9cc4aba588555274d"} Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.262237 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85e029d9-b399-4b73-a5b7-458ce3a459d6","Type":"ContainerDied","Data":"f2ca1676e90d1c871dcdaff4231d3ec0d4c9c081386001474bb6c5a2c8ba0916"} Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.264262 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85e029d9-b399-4b73-a5b7-458ce3a459d6","Type":"ContainerDied","Data":"6676d89b1e7ccb376014f57fbe1cd0225498bc73b2496d8a32b78112c2997a81"} Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.264292 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6676d89b1e7ccb376014f57fbe1cd0225498bc73b2496d8a32b78112c2997a81" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.286072 4723 generic.go:334] "Generic (PLEG): container finished" podID="591e7541-2095-490f-9787-d4551a2e4f9d" containerID="0d5f2915d0560d32c496d76824067567dd9de60e05e9b2294a4bff612dd09489" exitCode=0 Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.286267 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d7b4c9946-ljf67" event={"ID":"591e7541-2095-490f-9787-d4551a2e4f9d","Type":"ContainerDied","Data":"0d5f2915d0560d32c496d76824067567dd9de60e05e9b2294a4bff612dd09489"} Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.301246 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.308445 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.459460 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-config-data\") pod \"85e029d9-b399-4b73-a5b7-458ce3a459d6\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.459919 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-config-data-custom\") pod \"591e7541-2095-490f-9787-d4551a2e4f9d\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.459960 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-config-data\") pod \"591e7541-2095-490f-9787-d4551a2e4f9d\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.459979 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e029d9-b399-4b73-a5b7-458ce3a459d6-logs\") pod \"85e029d9-b399-4b73-a5b7-458ce3a459d6\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.459999 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-config-data-custom\") pod \"85e029d9-b399-4b73-a5b7-458ce3a459d6\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.460032 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-public-tls-certs\") pod \"591e7541-2095-490f-9787-d4551a2e4f9d\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.460122 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b97g4\" (UniqueName: \"kubernetes.io/projected/85e029d9-b399-4b73-a5b7-458ce3a459d6-kube-api-access-b97g4\") pod \"85e029d9-b399-4b73-a5b7-458ce3a459d6\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.460139 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-combined-ca-bundle\") pod \"85e029d9-b399-4b73-a5b7-458ce3a459d6\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.460185 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf7wn\" (UniqueName: \"kubernetes.io/projected/591e7541-2095-490f-9787-d4551a2e4f9d-kube-api-access-sf7wn\") pod \"591e7541-2095-490f-9787-d4551a2e4f9d\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.460214 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-internal-tls-certs\") pod 
\"591e7541-2095-490f-9787-d4551a2e4f9d\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.460240 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85e029d9-b399-4b73-a5b7-458ce3a459d6-etc-machine-id\") pod \"85e029d9-b399-4b73-a5b7-458ce3a459d6\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.460334 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/591e7541-2095-490f-9787-d4551a2e4f9d-logs\") pod \"591e7541-2095-490f-9787-d4551a2e4f9d\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.460372 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-scripts\") pod \"85e029d9-b399-4b73-a5b7-458ce3a459d6\" (UID: \"85e029d9-b399-4b73-a5b7-458ce3a459d6\") " Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.460581 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e029d9-b399-4b73-a5b7-458ce3a459d6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "85e029d9-b399-4b73-a5b7-458ce3a459d6" (UID: "85e029d9-b399-4b73-a5b7-458ce3a459d6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.461229 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-combined-ca-bundle\") pod \"591e7541-2095-490f-9787-d4551a2e4f9d\" (UID: \"591e7541-2095-490f-9787-d4551a2e4f9d\") " Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.461934 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e029d9-b399-4b73-a5b7-458ce3a459d6-logs" (OuterVolumeSpecName: "logs") pod "85e029d9-b399-4b73-a5b7-458ce3a459d6" (UID: "85e029d9-b399-4b73-a5b7-458ce3a459d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.462346 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/591e7541-2095-490f-9787-d4551a2e4f9d-logs" (OuterVolumeSpecName: "logs") pod "591e7541-2095-490f-9787-d4551a2e4f9d" (UID: "591e7541-2095-490f-9787-d4551a2e4f9d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.468108 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e029d9-b399-4b73-a5b7-458ce3a459d6-kube-api-access-b97g4" (OuterVolumeSpecName: "kube-api-access-b97g4") pod "85e029d9-b399-4b73-a5b7-458ce3a459d6" (UID: "85e029d9-b399-4b73-a5b7-458ce3a459d6"). InnerVolumeSpecName "kube-api-access-b97g4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.469060 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/591e7541-2095-490f-9787-d4551a2e4f9d-kube-api-access-sf7wn" (OuterVolumeSpecName: "kube-api-access-sf7wn") pod "591e7541-2095-490f-9787-d4551a2e4f9d" (UID: "591e7541-2095-490f-9787-d4551a2e4f9d"). InnerVolumeSpecName "kube-api-access-sf7wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.469475 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-scripts" (OuterVolumeSpecName: "scripts") pod "85e029d9-b399-4b73-a5b7-458ce3a459d6" (UID: "85e029d9-b399-4b73-a5b7-458ce3a459d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.469820 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "85e029d9-b399-4b73-a5b7-458ce3a459d6" (UID: "85e029d9-b399-4b73-a5b7-458ce3a459d6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.473268 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "591e7541-2095-490f-9787-d4551a2e4f9d" (UID: "591e7541-2095-490f-9787-d4551a2e4f9d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.478178 4723 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.481505 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e029d9-b399-4b73-a5b7-458ce3a459d6-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.481934 4723 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.482060 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b97g4\" (UniqueName: \"kubernetes.io/projected/85e029d9-b399-4b73-a5b7-458ce3a459d6-kube-api-access-b97g4\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.482171 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf7wn\" (UniqueName: \"kubernetes.io/projected/591e7541-2095-490f-9787-d4551a2e4f9d-kube-api-access-sf7wn\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.482309 4723 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85e029d9-b399-4b73-a5b7-458ce3a459d6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.482474 4723 reconciler_common.go:293] 
"Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/591e7541-2095-490f-9787-d4551a2e4f9d-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.482970 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.528027 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85e029d9-b399-4b73-a5b7-458ce3a459d6" (UID: "85e029d9-b399-4b73-a5b7-458ce3a459d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.532568 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "591e7541-2095-490f-9787-d4551a2e4f9d" (UID: "591e7541-2095-490f-9787-d4551a2e4f9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.551328 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-config-data" (OuterVolumeSpecName: "config-data") pod "591e7541-2095-490f-9787-d4551a2e4f9d" (UID: "591e7541-2095-490f-9787-d4551a2e4f9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.554883 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "591e7541-2095-490f-9787-d4551a2e4f9d" (UID: "591e7541-2095-490f-9787-d4551a2e4f9d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.557938 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "591e7541-2095-490f-9787-d4551a2e4f9d" (UID: "591e7541-2095-490f-9787-d4551a2e4f9d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.585470 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.585676 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.585690 4723 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.585702 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.585714 4723 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/591e7541-2095-490f-9787-d4551a2e4f9d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.591600 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-config-data" (OuterVolumeSpecName: "config-data") pod "85e029d9-b399-4b73-a5b7-458ce3a459d6" (UID: "85e029d9-b399-4b73-a5b7-458ce3a459d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:46 crc kubenswrapper[4723]: I0309 13:21:46.688699 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e029d9-b399-4b73-a5b7-458ce3a459d6-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.312377 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d7b4c9946-ljf67" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.312375 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d7b4c9946-ljf67" event={"ID":"591e7541-2095-490f-9787-d4551a2e4f9d","Type":"ContainerDied","Data":"1925aa83e5f062d7fe6ccb9d26ece2e7cde2daef6afd506e9ee19477faa1657f"} Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.312988 4723 scope.go:117] "RemoveContainer" containerID="0d5f2915d0560d32c496d76824067567dd9de60e05e9b2294a4bff612dd09489" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.320504 4723 generic.go:334] "Generic (PLEG): container finished" podID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerID="08856bed4f33e117dd10c871cba09579935b75e1730d76f9185afe1104c1a29f" exitCode=0 Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.320579 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.321515 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e0bf458-3488-4d3a-80ac-d9cf2f655791","Type":"ContainerDied","Data":"08856bed4f33e117dd10c871cba09579935b75e1730d76f9185afe1104c1a29f"} Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.499784 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.518611 4723 scope.go:117] "RemoveContainer" containerID="92dcefaf02293c319df2e4e7a6b68680e3d30054f568a1aaa658e5dedeec64d2" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.521170 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-config-data\") pod \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.521224 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e0bf458-3488-4d3a-80ac-d9cf2f655791-run-httpd\") pod \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.521290 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e0bf458-3488-4d3a-80ac-d9cf2f655791-log-httpd\") pod \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.521390 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-combined-ca-bundle\") pod \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.521420 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fgl4\" (UniqueName: \"kubernetes.io/projected/1e0bf458-3488-4d3a-80ac-d9cf2f655791-kube-api-access-5fgl4\") pod \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.521451 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-scripts\") pod \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.521611 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-sg-core-conf-yaml\") pod \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\" (UID: \"1e0bf458-3488-4d3a-80ac-d9cf2f655791\") " Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.523234 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e0bf458-3488-4d3a-80ac-d9cf2f655791-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1e0bf458-3488-4d3a-80ac-d9cf2f655791" (UID: "1e0bf458-3488-4d3a-80ac-d9cf2f655791"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.523242 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e0bf458-3488-4d3a-80ac-d9cf2f655791-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1e0bf458-3488-4d3a-80ac-d9cf2f655791" (UID: "1e0bf458-3488-4d3a-80ac-d9cf2f655791"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.531929 4723 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e0bf458-3488-4d3a-80ac-d9cf2f655791-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.531968 4723 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e0bf458-3488-4d3a-80ac-d9cf2f655791-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.533579 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e0bf458-3488-4d3a-80ac-d9cf2f655791-kube-api-access-5fgl4" (OuterVolumeSpecName: "kube-api-access-5fgl4") pod "1e0bf458-3488-4d3a-80ac-d9cf2f655791" (UID: "1e0bf458-3488-4d3a-80ac-d9cf2f655791"). InnerVolumeSpecName "kube-api-access-5fgl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.566273 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-scripts" (OuterVolumeSpecName: "scripts") pod "1e0bf458-3488-4d3a-80ac-d9cf2f655791" (UID: "1e0bf458-3488-4d3a-80ac-d9cf2f655791"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.633480 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fgl4\" (UniqueName: \"kubernetes.io/projected/1e0bf458-3488-4d3a-80ac-d9cf2f655791-kube-api-access-5fgl4\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.633505 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.649481 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d7b4c9946-ljf67"] Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.675135 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1e0bf458-3488-4d3a-80ac-d9cf2f655791" (UID: "1e0bf458-3488-4d3a-80ac-d9cf2f655791"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.678013 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-d7b4c9946-ljf67"] Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.734946 4723 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.742286 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-config-data" (OuterVolumeSpecName: "config-data") pod "1e0bf458-3488-4d3a-80ac-d9cf2f655791" (UID: "1e0bf458-3488-4d3a-80ac-d9cf2f655791"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.744753 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.748195 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e0bf458-3488-4d3a-80ac-d9cf2f655791" (UID: "1e0bf458-3488-4d3a-80ac-d9cf2f655791"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.761549 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.772799 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:21:47 crc kubenswrapper[4723]: E0309 13:21:47.773347 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="proxy-httpd" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773375 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="proxy-httpd" Mar 09 13:21:47 crc kubenswrapper[4723]: E0309 13:21:47.773408 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591e7541-2095-490f-9787-d4551a2e4f9d" containerName="barbican-api-log" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773418 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="591e7541-2095-490f-9787-d4551a2e4f9d" containerName="barbican-api-log" Mar 09 13:21:47 crc kubenswrapper[4723]: E0309 13:21:47.773430 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="sg-core" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773437 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="sg-core" Mar 09 13:21:47 crc kubenswrapper[4723]: E0309 13:21:47.773456 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="ceilometer-central-agent" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773464 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="ceilometer-central-agent" Mar 09 13:21:47 crc kubenswrapper[4723]: E0309 13:21:47.773494 4723 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="591e7541-2095-490f-9787-d4551a2e4f9d" containerName="barbican-api" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773502 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="591e7541-2095-490f-9787-d4551a2e4f9d" containerName="barbican-api" Mar 09 13:21:47 crc kubenswrapper[4723]: E0309 13:21:47.773512 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e029d9-b399-4b73-a5b7-458ce3a459d6" containerName="cinder-api-log" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773519 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e029d9-b399-4b73-a5b7-458ce3a459d6" containerName="cinder-api-log" Mar 09 13:21:47 crc kubenswrapper[4723]: E0309 13:21:47.773533 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e029d9-b399-4b73-a5b7-458ce3a459d6" containerName="cinder-api" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773540 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e029d9-b399-4b73-a5b7-458ce3a459d6" containerName="cinder-api" Mar 09 13:21:47 crc kubenswrapper[4723]: E0309 13:21:47.773551 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="ceilometer-notification-agent" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773560 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="ceilometer-notification-agent" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773835 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="proxy-httpd" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773850 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="ceilometer-notification-agent" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773917 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="ceilometer-central-agent" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773934 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e029d9-b399-4b73-a5b7-458ce3a459d6" containerName="cinder-api-log" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773955 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e029d9-b399-4b73-a5b7-458ce3a459d6" containerName="cinder-api" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773971 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" containerName="sg-core" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.773992 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="591e7541-2095-490f-9787-d4551a2e4f9d" containerName="barbican-api" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.774005 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="591e7541-2095-490f-9787-d4551a2e4f9d" containerName="barbican-api-log" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.775530 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.778473 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.778709 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.782028 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.801717 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.836767 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.836821 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj6v2\" (UniqueName: \"kubernetes.io/projected/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-kube-api-access-jj6v2\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.836883 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.836923 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.836991 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.837062 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-config-data\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.837192 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-logs\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.837614 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-scripts\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.837666 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.837830 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.837849 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0bf458-3488-4d3a-80ac-d9cf2f655791-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.939517 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-config-data\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.939589 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-logs\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.939741 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-scripts\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.939772 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.939832 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.939880 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj6v2\" (UniqueName: \"kubernetes.io/projected/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-kube-api-access-jj6v2\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.939909 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " 
pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.939944 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.939990 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.941562 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.942108 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-logs\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.945284 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.945868 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.947366 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-scripts\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.947696 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.947854 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-config-data\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.948496 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: 
I0309 13:21:47.961464 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj6v2\" (UniqueName: \"kubernetes.io/projected/08390bdf-47f0-43a2-89ac-1a7240dcd2dd-kube-api-access-jj6v2\") pod \"cinder-api-0\" (UID: \"08390bdf-47f0-43a2-89ac-1a7240dcd2dd\") " pod="openstack/cinder-api-0" Mar 09 13:21:47 crc kubenswrapper[4723]: I0309 13:21:47.982187 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.041188 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-config\") pod \"28297498-43e9-457c-a90d-0c3f49907491\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.041286 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-combined-ca-bundle\") pod \"28297498-43e9-457c-a90d-0c3f49907491\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.041330 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8swz8\" (UniqueName: \"kubernetes.io/projected/28297498-43e9-457c-a90d-0c3f49907491-kube-api-access-8swz8\") pod \"28297498-43e9-457c-a90d-0c3f49907491\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.041489 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-public-tls-certs\") pod \"28297498-43e9-457c-a90d-0c3f49907491\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.041535 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-internal-tls-certs\") pod \"28297498-43e9-457c-a90d-0c3f49907491\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.041554 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-ovndb-tls-certs\") pod \"28297498-43e9-457c-a90d-0c3f49907491\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.041610 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-httpd-config\") pod \"28297498-43e9-457c-a90d-0c3f49907491\" (UID: \"28297498-43e9-457c-a90d-0c3f49907491\") " Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.045637 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "28297498-43e9-457c-a90d-0c3f49907491" (UID: "28297498-43e9-457c-a90d-0c3f49907491"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.045927 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28297498-43e9-457c-a90d-0c3f49907491-kube-api-access-8swz8" (OuterVolumeSpecName: "kube-api-access-8swz8") pod "28297498-43e9-457c-a90d-0c3f49907491" (UID: "28297498-43e9-457c-a90d-0c3f49907491"). InnerVolumeSpecName "kube-api-access-8swz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.097553 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "28297498-43e9-457c-a90d-0c3f49907491" (UID: "28297498-43e9-457c-a90d-0c3f49907491"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.098965 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28297498-43e9-457c-a90d-0c3f49907491" (UID: "28297498-43e9-457c-a90d-0c3f49907491"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.114146 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-config" (OuterVolumeSpecName: "config") pod "28297498-43e9-457c-a90d-0c3f49907491" (UID: "28297498-43e9-457c-a90d-0c3f49907491"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.115615 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "28297498-43e9-457c-a90d-0c3f49907491" (UID: "28297498-43e9-457c-a90d-0c3f49907491"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.132816 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "28297498-43e9-457c-a90d-0c3f49907491" (UID: "28297498-43e9-457c-a90d-0c3f49907491"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.144153 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.144187 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.144200 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8swz8\" (UniqueName: \"kubernetes.io/projected/28297498-43e9-457c-a90d-0c3f49907491-kube-api-access-8swz8\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.144209 4723 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.144218 4723 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.144226 4723 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.144234 4723 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/28297498-43e9-457c-a90d-0c3f49907491-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.172940 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.346772 4723 generic.go:334] "Generic (PLEG): container finished" podID="28297498-43e9-457c-a90d-0c3f49907491" containerID="7a4d205903b0350689d246a7dd833d69ec9079f318670bb9c54d41aef2c07dd5" exitCode=0 Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.346874 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-646c887bd9-qzqxk" event={"ID":"28297498-43e9-457c-a90d-0c3f49907491","Type":"ContainerDied","Data":"7a4d205903b0350689d246a7dd833d69ec9079f318670bb9c54d41aef2c07dd5"} Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.347166 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-646c887bd9-qzqxk" event={"ID":"28297498-43e9-457c-a90d-0c3f49907491","Type":"ContainerDied","Data":"427e7ad9e57e7112049b7ea9355f3ab9e3247fee84a0f1773512064007dc24b6"} Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.347193 4723 scope.go:117] "RemoveContainer" containerID="fc77008e1f53a9fd9fd0fee21c759ad16773862210f0ceaae4c1cad04c78e7c6" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.348705 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-646c887bd9-qzqxk" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.351934 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e0bf458-3488-4d3a-80ac-d9cf2f655791","Type":"ContainerDied","Data":"a211a03dcf1458ff9b9cf538906f5fcdc0b232c428b25f2b3c043bac4351bb66"} Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.352081 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.372588 4723 scope.go:117] "RemoveContainer" containerID="7a4d205903b0350689d246a7dd833d69ec9079f318670bb9c54d41aef2c07dd5" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.395252 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.406541 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.432966 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:21:48 crc kubenswrapper[4723]: E0309 13:21:48.433527 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28297498-43e9-457c-a90d-0c3f49907491" containerName="neutron-api" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.433554 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="28297498-43e9-457c-a90d-0c3f49907491" containerName="neutron-api" Mar 09 13:21:48 crc kubenswrapper[4723]: E0309 13:21:48.433566 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28297498-43e9-457c-a90d-0c3f49907491" containerName="neutron-httpd" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.433574 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="28297498-43e9-457c-a90d-0c3f49907491" containerName="neutron-httpd" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.434682 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="28297498-43e9-457c-a90d-0c3f49907491" containerName="neutron-api" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.434720 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="28297498-43e9-457c-a90d-0c3f49907491" containerName="neutron-httpd" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.435435 4723 scope.go:117] "RemoveContainer" containerID="fc77008e1f53a9fd9fd0fee21c759ad16773862210f0ceaae4c1cad04c78e7c6" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.459254 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-646c887bd9-qzqxk"] Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.459350 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: E0309 13:21:48.461067 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc77008e1f53a9fd9fd0fee21c759ad16773862210f0ceaae4c1cad04c78e7c6\": container with ID starting with fc77008e1f53a9fd9fd0fee21c759ad16773862210f0ceaae4c1cad04c78e7c6 not found: ID does not exist" containerID="fc77008e1f53a9fd9fd0fee21c759ad16773862210f0ceaae4c1cad04c78e7c6" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.461176 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc77008e1f53a9fd9fd0fee21c759ad16773862210f0ceaae4c1cad04c78e7c6"} err="failed to get container status \"fc77008e1f53a9fd9fd0fee21c759ad16773862210f0ceaae4c1cad04c78e7c6\": rpc error: code = NotFound desc = could not find container \"fc77008e1f53a9fd9fd0fee21c759ad16773862210f0ceaae4c1cad04c78e7c6\": container with ID starting with fc77008e1f53a9fd9fd0fee21c759ad16773862210f0ceaae4c1cad04c78e7c6 not found: ID does not exist" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.461265 4723 scope.go:117] "RemoveContainer" containerID="7a4d205903b0350689d246a7dd833d69ec9079f318670bb9c54d41aef2c07dd5" Mar 09 13:21:48 crc kubenswrapper[4723]: E0309 13:21:48.461646 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a4d205903b0350689d246a7dd833d69ec9079f318670bb9c54d41aef2c07dd5\": container with ID starting with 7a4d205903b0350689d246a7dd833d69ec9079f318670bb9c54d41aef2c07dd5 not found: ID does not exist" containerID="7a4d205903b0350689d246a7dd833d69ec9079f318670bb9c54d41aef2c07dd5" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.461697 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a4d205903b0350689d246a7dd833d69ec9079f318670bb9c54d41aef2c07dd5"} err="failed to get container status \"7a4d205903b0350689d246a7dd833d69ec9079f318670bb9c54d41aef2c07dd5\": rpc error: code = NotFound desc = could not find container \"7a4d205903b0350689d246a7dd833d69ec9079f318670bb9c54d41aef2c07dd5\": container with ID starting with 7a4d205903b0350689d246a7dd833d69ec9079f318670bb9c54d41aef2c07dd5 not found: ID does not exist" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.461728 4723 scope.go:117] "RemoveContainer" containerID="ab6e3117ea831cb4c6d8708856c263cc0cecf1d1ab632551d91a9070e61dc344" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.462227 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.464098 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.474842 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-646c887bd9-qzqxk"] Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.488094 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.496351 4723 scope.go:117] "RemoveContainer" containerID="2161603b5816667f7710ccc90429c6d2fb2b8f1d9495eb572edffaa2c1f175a3" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.536294 4723 scope.go:117] "RemoveContainer" containerID="08856bed4f33e117dd10c871cba09579935b75e1730d76f9185afe1104c1a29f" Mar 09 13:21:48 crc 
kubenswrapper[4723]: I0309 13:21:48.552887 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/868faedb-4c6c-479b-b519-45c8a252dee8-run-httpd\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.553008 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpp2c\" (UniqueName: \"kubernetes.io/projected/868faedb-4c6c-479b-b519-45c8a252dee8-kube-api-access-cpp2c\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.553042 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-config-data\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.553114 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.553140 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/868faedb-4c6c-479b-b519-45c8a252dee8-log-httpd\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.553222 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-scripts\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.553254 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.653290 4723 scope.go:117] "RemoveContainer" containerID="38abaf8dc7e932d42d539dc5445545a95ed746821765a01b2eb815a51321876c" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.654643 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpp2c\" (UniqueName: \"kubernetes.io/projected/868faedb-4c6c-479b-b519-45c8a252dee8-kube-api-access-cpp2c\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.654687 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-config-data\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.654746 4723 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.654772 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/868faedb-4c6c-479b-b519-45c8a252dee8-log-httpd\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.654829 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-scripts\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.654880 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.654910 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/868faedb-4c6c-479b-b519-45c8a252dee8-run-httpd\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.655821 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/868faedb-4c6c-479b-b519-45c8a252dee8-run-httpd\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.656146 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/868faedb-4c6c-479b-b519-45c8a252dee8-log-httpd\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.664695 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-config-data\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.666802 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.669776 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-scripts\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.673335 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.674441 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpp2c\" (UniqueName: \"kubernetes.io/projected/868faedb-4c6c-479b-b519-45c8a252dee8-kube-api-access-cpp2c\") pod \"ceilometer-0\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.726449 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.800383 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.939551 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e0bf458-3488-4d3a-80ac-d9cf2f655791" path="/var/lib/kubelet/pods/1e0bf458-3488-4d3a-80ac-d9cf2f655791/volumes" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.940915 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28297498-43e9-457c-a90d-0c3f49907491" path="/var/lib/kubelet/pods/28297498-43e9-457c-a90d-0c3f49907491/volumes" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.941683 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="591e7541-2095-490f-9787-d4551a2e4f9d" path="/var/lib/kubelet/pods/591e7541-2095-490f-9787-d4551a2e4f9d/volumes" Mar 09 13:21:48 crc kubenswrapper[4723]: I0309 13:21:48.953073 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e029d9-b399-4b73-a5b7-458ce3a459d6" path="/var/lib/kubelet/pods/85e029d9-b399-4b73-a5b7-458ce3a459d6/volumes" Mar 09 13:21:49 crc kubenswrapper[4723]: I0309 13:21:49.405779 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"08390bdf-47f0-43a2-89ac-1a7240dcd2dd","Type":"ContainerStarted","Data":"6e55f8081913da561b161cb676df506c50488b7639dc0a3b6b1d4e9f27659332"} Mar 09 13:21:49 crc kubenswrapper[4723]: I0309 13:21:49.413036 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:21:50 crc kubenswrapper[4723]: I0309 13:21:50.330214 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:21:50 crc kubenswrapper[4723]: I0309 13:21:50.409905 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-9jjw9"] Mar 09 13:21:50 crc kubenswrapper[4723]: I0309 13:21:50.410162 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9" podUID="4313196c-6a31-4615-9ffe-329aed2bfef4" containerName="dnsmasq-dns" containerID="cri-o://0a327eb0475d9cfd15217ad702f0d307341bf391a37281c48d9bfbe427021cec" gracePeriod=10 Mar 09 13:21:50 crc kubenswrapper[4723]: I0309 13:21:50.456764 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"08390bdf-47f0-43a2-89ac-1a7240dcd2dd","Type":"ContainerStarted","Data":"7e7d7dfc526920ce2c3ce129679bf73e208e1640f292a01d6de537838addad19"} Mar 09 13:21:50 crc kubenswrapper[4723]: I0309 13:21:50.464059 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"868faedb-4c6c-479b-b519-45c8a252dee8","Type":"ContainerStarted","Data":"dd5b500095dd0c60df2c62d705c3589e5648c8728a83ccd454d9f0ec119c1858"} Mar 09 13:21:50 crc kubenswrapper[4723]: I0309 13:21:50.464098 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"868faedb-4c6c-479b-b519-45c8a252dee8","Type":"ContainerStarted","Data":"0c16e11258916c615bb17126e405eb645bd52a2ffb4eddd7c6ab3b1d7afd40f6"} Mar 09 13:21:50 crc kubenswrapper[4723]: I0309 13:21:50.994012 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.092589 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.112126 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.208961 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-config\") pod \"4313196c-6a31-4615-9ffe-329aed2bfef4\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.209079 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-dns-swift-storage-0\") pod \"4313196c-6a31-4615-9ffe-329aed2bfef4\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.209146 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prqmc\" (UniqueName: \"kubernetes.io/projected/4313196c-6a31-4615-9ffe-329aed2bfef4-kube-api-access-prqmc\") pod \"4313196c-6a31-4615-9ffe-329aed2bfef4\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.209296 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-ovsdbserver-nb\") pod \"4313196c-6a31-4615-9ffe-329aed2bfef4\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.209454 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-dns-svc\") pod \"4313196c-6a31-4615-9ffe-329aed2bfef4\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.209511 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-ovsdbserver-sb\") pod \"4313196c-6a31-4615-9ffe-329aed2bfef4\" (UID: \"4313196c-6a31-4615-9ffe-329aed2bfef4\") " Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.245623 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4313196c-6a31-4615-9ffe-329aed2bfef4-kube-api-access-prqmc" (OuterVolumeSpecName: "kube-api-access-prqmc") pod "4313196c-6a31-4615-9ffe-329aed2bfef4" (UID: "4313196c-6a31-4615-9ffe-329aed2bfef4"). InnerVolumeSpecName "kube-api-access-prqmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.314082 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prqmc\" (UniqueName: \"kubernetes.io/projected/4313196c-6a31-4615-9ffe-329aed2bfef4-kube-api-access-prqmc\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.338910 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4313196c-6a31-4615-9ffe-329aed2bfef4" (UID: "4313196c-6a31-4615-9ffe-329aed2bfef4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.338943 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4313196c-6a31-4615-9ffe-329aed2bfef4" (UID: "4313196c-6a31-4615-9ffe-329aed2bfef4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.351441 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4313196c-6a31-4615-9ffe-329aed2bfef4" (UID: "4313196c-6a31-4615-9ffe-329aed2bfef4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.387359 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4313196c-6a31-4615-9ffe-329aed2bfef4" (UID: "4313196c-6a31-4615-9ffe-329aed2bfef4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.397416 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-config" (OuterVolumeSpecName: "config") pod "4313196c-6a31-4615-9ffe-329aed2bfef4" (UID: "4313196c-6a31-4615-9ffe-329aed2bfef4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.417885 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.417929 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.417940 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.417952 4723 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.417961 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4313196c-6a31-4615-9ffe-329aed2bfef4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.476769 4723 generic.go:334] "Generic (PLEG): container finished" podID="4313196c-6a31-4615-9ffe-329aed2bfef4" containerID="0a327eb0475d9cfd15217ad702f0d307341bf391a37281c48d9bfbe427021cec" exitCode=0 Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.476901 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9" event={"ID":"4313196c-6a31-4615-9ffe-329aed2bfef4","Type":"ContainerDied","Data":"0a327eb0475d9cfd15217ad702f0d307341bf391a37281c48d9bfbe427021cec"} Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.476936 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9" event={"ID":"4313196c-6a31-4615-9ffe-329aed2bfef4","Type":"ContainerDied","Data":"32101265ce9905129a95fb07ceebec339efbfb422691efbd8c51740b1f5c41d7"} Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.476955 4723 scope.go:117] "RemoveContainer" containerID="0a327eb0475d9cfd15217ad702f0d307341bf391a37281c48d9bfbe427021cec" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.477087 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-9jjw9" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.482873 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"08390bdf-47f0-43a2-89ac-1a7240dcd2dd","Type":"ContainerStarted","Data":"0032a0b92b6c37f5140ca93dd02871656f1c9f9504ab66ec426acfa66ceda492"} Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.483009 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.487604 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="231ff024-2c9c-479e-b734-4cfd1e69ac91" containerName="cinder-scheduler" containerID="cri-o://726d8ac4d64caa2c94d96ce2787c4b32183803e5054b4daa79acc270575e859e" gracePeriod=30 Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.487948 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"868faedb-4c6c-479b-b519-45c8a252dee8","Type":"ContainerStarted","Data":"dd5c68fc27791b5cb65c4cab5cff1b3860a58331b107eedd79abd83bc841ce93"} Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.487988 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="231ff024-2c9c-479e-b734-4cfd1e69ac91" containerName="probe" containerID="cri-o://c08169c322d1c84997d310c42a57a10b911c1a97c2dd12f578eef4048001fbe9" gracePeriod=30 Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.520123 4723 scope.go:117] "RemoveContainer" containerID="0af3fb47358d96397b65fcaf3629b2b6cf1be6d981d4ec437f2a04134c567377" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.520141 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.520120526 podStartE2EDuration="4.520120526s" podCreationTimestamp="2026-03-09 13:21:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:51.515136434 +0000 UTC m=+1385.529603974" watchObservedRunningTime="2026-03-09 13:21:51.520120526 +0000 UTC m=+1385.534588066" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.582001 4723 scope.go:117] "RemoveContainer" containerID="0a327eb0475d9cfd15217ad702f0d307341bf391a37281c48d9bfbe427021cec" Mar 09 13:21:51 crc kubenswrapper[4723]: E0309 13:21:51.587784 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a327eb0475d9cfd15217ad702f0d307341bf391a37281c48d9bfbe427021cec\": container with ID starting with 0a327eb0475d9cfd15217ad702f0d307341bf391a37281c48d9bfbe427021cec not found: ID does not exist" containerID="0a327eb0475d9cfd15217ad702f0d307341bf391a37281c48d9bfbe427021cec" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.587839 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a327eb0475d9cfd15217ad702f0d307341bf391a37281c48d9bfbe427021cec"} err="failed to get container status \"0a327eb0475d9cfd15217ad702f0d307341bf391a37281c48d9bfbe427021cec\": rpc error: code = NotFound desc = could not find container \"0a327eb0475d9cfd15217ad702f0d307341bf391a37281c48d9bfbe427021cec\": container with ID starting with 0a327eb0475d9cfd15217ad702f0d307341bf391a37281c48d9bfbe427021cec not found: ID does not exist" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 
13:21:51.587885 4723 scope.go:117] "RemoveContainer" containerID="0af3fb47358d96397b65fcaf3629b2b6cf1be6d981d4ec437f2a04134c567377" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.592004 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-9jjw9"] Mar 09 13:21:51 crc kubenswrapper[4723]: E0309 13:21:51.592046 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0af3fb47358d96397b65fcaf3629b2b6cf1be6d981d4ec437f2a04134c567377\": container with ID starting with 0af3fb47358d96397b65fcaf3629b2b6cf1be6d981d4ec437f2a04134c567377 not found: ID does not exist" containerID="0af3fb47358d96397b65fcaf3629b2b6cf1be6d981d4ec437f2a04134c567377" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.592093 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af3fb47358d96397b65fcaf3629b2b6cf1be6d981d4ec437f2a04134c567377"} err="failed to get container status \"0af3fb47358d96397b65fcaf3629b2b6cf1be6d981d4ec437f2a04134c567377\": rpc error: code = NotFound desc = could not find container \"0af3fb47358d96397b65fcaf3629b2b6cf1be6d981d4ec437f2a04134c567377\": container with ID starting with 0af3fb47358d96397b65fcaf3629b2b6cf1be6d981d4ec437f2a04134c567377 not found: ID does not exist" Mar 09 13:21:51 crc kubenswrapper[4723]: I0309 13:21:51.605915 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-9jjw9"] Mar 09 13:21:52 crc kubenswrapper[4723]: I0309 13:21:52.502754 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"868faedb-4c6c-479b-b519-45c8a252dee8","Type":"ContainerStarted","Data":"276aafd0a7cfae040a0d105fce63aa7cb3f85af7e501922b266342496107a3f1"} Mar 09 13:21:52 crc kubenswrapper[4723]: I0309 13:21:52.690900 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:52 crc kubenswrapper[4723]: I0309 13:21:52.693574 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:21:52 crc kubenswrapper[4723]: I0309 13:21:52.923345 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4313196c-6a31-4615-9ffe-329aed2bfef4" path="/var/lib/kubelet/pods/4313196c-6a31-4615-9ffe-329aed2bfef4/volumes" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.049433 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7d4b578b96-kksd4"] Mar 09 13:21:53 crc kubenswrapper[4723]: E0309 13:21:53.051478 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4313196c-6a31-4615-9ffe-329aed2bfef4" containerName="dnsmasq-dns" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.051627 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="4313196c-6a31-4615-9ffe-329aed2bfef4" containerName="dnsmasq-dns" Mar 09 13:21:53 crc kubenswrapper[4723]: E0309 13:21:53.051777 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4313196c-6a31-4615-9ffe-329aed2bfef4" containerName="init" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.051890 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="4313196c-6a31-4615-9ffe-329aed2bfef4" containerName="init" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.052292 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="4313196c-6a31-4615-9ffe-329aed2bfef4" containerName="dnsmasq-dns" Mar 
09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.053957 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.066737 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7d4b578b96-kksd4"] Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.161450 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-public-tls-certs\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.161544 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-combined-ca-bundle\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.161683 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njz8s\" (UniqueName: \"kubernetes.io/projected/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-kube-api-access-njz8s\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.161987 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-scripts\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.162101 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-config-data\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.162204 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-internal-tls-certs\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.162327 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-logs\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.265459 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-public-tls-certs\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc 
kubenswrapper[4723]: I0309 13:21:53.265949 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-combined-ca-bundle\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.266064 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njz8s\" (UniqueName: \"kubernetes.io/projected/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-kube-api-access-njz8s\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.266246 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-scripts\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.266414 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-config-data\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.266589 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-internal-tls-certs\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.266721 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-logs\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.267181 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-logs\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.271667 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-internal-tls-certs\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.272449 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-combined-ca-bundle\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.272582 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-scripts\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.273331 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-config-data\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.273972 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-public-tls-certs\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.294551 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njz8s\" (UniqueName: \"kubernetes.io/projected/764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf-kube-api-access-njz8s\") pod \"placement-7d4b578b96-kksd4\" (UID: \"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf\") " pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.391798 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.524418 4723 generic.go:334] "Generic (PLEG): container finished" podID="231ff024-2c9c-479e-b734-4cfd1e69ac91" containerID="c08169c322d1c84997d310c42a57a10b911c1a97c2dd12f578eef4048001fbe9" exitCode=0 Mar 09 13:21:53 crc kubenswrapper[4723]: I0309 13:21:53.524482 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"231ff024-2c9c-479e-b734-4cfd1e69ac91","Type":"ContainerDied","Data":"c08169c322d1c84997d310c42a57a10b911c1a97c2dd12f578eef4048001fbe9"} Mar 09 13:21:54 crc kubenswrapper[4723]: W0309 13:21:53.877302 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod764f6f4a_4b34_41d3_b8b2_e7cd8cc754bf.slice/crio-e7c92d2caca81832cbde255dfd864dbaf973fc14334fdc889596cf2b9da7984b WatchSource:0}: Error finding container e7c92d2caca81832cbde255dfd864dbaf973fc14334fdc889596cf2b9da7984b: Status 404 returned error can't find the container with id e7c92d2caca81832cbde255dfd864dbaf973fc14334fdc889596cf2b9da7984b Mar 09 13:21:54 crc kubenswrapper[4723]: I0309 13:21:53.877613 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7d4b578b96-kksd4"] Mar 09 13:21:54 crc kubenswrapper[4723]: I0309 13:21:54.543118 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7d4b578b96-kksd4" event={"ID":"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf","Type":"ContainerStarted","Data":"ac59893d076225af6e4f6f631274df3a4416d56a037d5b501f330596f4372e4d"} Mar 09 13:21:54 crc kubenswrapper[4723]: I0309 13:21:54.543466 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7d4b578b96-kksd4" event={"ID":"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf","Type":"ContainerStarted","Data":"60fa3a2e1fd2ca96cd41ca9cb4f9c709882d340ad9cecc71d2ef36bc40ed7dc4"} Mar 09 13:21:54 crc kubenswrapper[4723]: I0309 13:21:54.543484 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-7d4b578b96-kksd4" event={"ID":"764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf","Type":"ContainerStarted","Data":"e7c92d2caca81832cbde255dfd864dbaf973fc14334fdc889596cf2b9da7984b"} Mar 09 13:21:54 crc kubenswrapper[4723]: I0309 13:21:54.543512 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:54 crc kubenswrapper[4723]: I0309 13:21:54.543531 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:21:54 crc kubenswrapper[4723]: I0309 13:21:54.549324 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"868faedb-4c6c-479b-b519-45c8a252dee8","Type":"ContainerStarted","Data":"e642cbfe23c56063d2e6dc0b223e408313d261acad06028681a662711dd1b182"} Mar 09 13:21:54 crc kubenswrapper[4723]: I0309 13:21:54.549705 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:21:54 crc kubenswrapper[4723]: I0309 13:21:54.578288 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7d4b578b96-kksd4" podStartSLOduration=1.578265412 podStartE2EDuration="1.578265412s" podCreationTimestamp="2026-03-09 13:21:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:21:54.569852999 +0000 UTC m=+1388.584320559" watchObservedRunningTime="2026-03-09 13:21:54.578265412 +0000 UTC m=+1388.592732952" Mar 09 13:21:54 crc kubenswrapper[4723]: I0309 13:21:54.609452 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8485461380000001 podStartE2EDuration="6.609428977s" podCreationTimestamp="2026-03-09 13:21:48 +0000 UTC" firstStartedPulling="2026-03-09 13:21:49.438261409 +0000 UTC m=+1383.452728949" lastFinishedPulling="2026-03-09 13:21:54.199144248 +0000 UTC m=+1388.213611788" observedRunningTime="2026-03-09 13:21:54.598408625 +0000 UTC m=+1388.612876165" watchObservedRunningTime="2026-03-09 13:21:54.609428977 +0000 UTC m=+1388.623896527" Mar 09 13:21:56 crc kubenswrapper[4723]: I0309 13:21:56.579387 4723 generic.go:334] "Generic (PLEG): container finished" podID="231ff024-2c9c-479e-b734-4cfd1e69ac91" containerID="726d8ac4d64caa2c94d96ce2787c4b32183803e5054b4daa79acc270575e859e" exitCode=0 Mar 09 13:21:56 crc kubenswrapper[4723]: I0309 13:21:56.579781 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"231ff024-2c9c-479e-b734-4cfd1e69ac91","Type":"ContainerDied","Data":"726d8ac4d64caa2c94d96ce2787c4b32183803e5054b4daa79acc270575e859e"} Mar 09 13:21:56 crc kubenswrapper[4723]: I0309 13:21:56.855352 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:21:56 crc kubenswrapper[4723]: I0309 13:21:56.968773 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-combined-ca-bundle\") pod \"231ff024-2c9c-479e-b734-4cfd1e69ac91\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " Mar 09 13:21:56 crc kubenswrapper[4723]: I0309 13:21:56.969013 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-scripts\") pod \"231ff024-2c9c-479e-b734-4cfd1e69ac91\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " Mar 09 13:21:56 crc kubenswrapper[4723]: I0309 13:21:56.969078 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4czr8\" (UniqueName: \"kubernetes.io/projected/231ff024-2c9c-479e-b734-4cfd1e69ac91-kube-api-access-4czr8\") pod \"231ff024-2c9c-479e-b734-4cfd1e69ac91\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " Mar 09 13:21:56 crc kubenswrapper[4723]: I0309 13:21:56.969153 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-config-data\") pod \"231ff024-2c9c-479e-b734-4cfd1e69ac91\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " Mar 09 13:21:56 crc kubenswrapper[4723]: I0309 13:21:56.969243 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-config-data-custom\") pod \"231ff024-2c9c-479e-b734-4cfd1e69ac91\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " Mar 09 13:21:56 crc kubenswrapper[4723]: I0309 13:21:56.969272 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/231ff024-2c9c-479e-b734-4cfd1e69ac91-etc-machine-id\") pod \"231ff024-2c9c-479e-b734-4cfd1e69ac91\" (UID: \"231ff024-2c9c-479e-b734-4cfd1e69ac91\") " Mar 09 13:21:56 crc kubenswrapper[4723]: I0309 13:21:56.972550 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/231ff024-2c9c-479e-b734-4cfd1e69ac91-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "231ff024-2c9c-479e-b734-4cfd1e69ac91" (UID: "231ff024-2c9c-479e-b734-4cfd1e69ac91"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:21:56 crc kubenswrapper[4723]: I0309 13:21:56.995046 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231ff024-2c9c-479e-b734-4cfd1e69ac91-kube-api-access-4czr8" (OuterVolumeSpecName: "kube-api-access-4czr8") pod "231ff024-2c9c-479e-b734-4cfd1e69ac91" (UID: "231ff024-2c9c-479e-b734-4cfd1e69ac91"). InnerVolumeSpecName "kube-api-access-4czr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:56 crc kubenswrapper[4723]: I0309 13:21:56.998055 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-scripts" (OuterVolumeSpecName: "scripts") pod "231ff024-2c9c-479e-b734-4cfd1e69ac91" (UID: "231ff024-2c9c-479e-b734-4cfd1e69ac91"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:56 crc kubenswrapper[4723]: I0309 13:21:56.998159 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "231ff024-2c9c-479e-b734-4cfd1e69ac91" (UID: "231ff024-2c9c-479e-b734-4cfd1e69ac91"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.054812 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "231ff024-2c9c-479e-b734-4cfd1e69ac91" (UID: "231ff024-2c9c-479e-b734-4cfd1e69ac91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.072328 4723 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.072374 4723 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/231ff024-2c9c-479e-b734-4cfd1e69ac91-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.072389 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.072402 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.072414 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4czr8\" (UniqueName: \"kubernetes.io/projected/231ff024-2c9c-479e-b734-4cfd1e69ac91-kube-api-access-4czr8\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.143784 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-config-data" (OuterVolumeSpecName: "config-data") pod "231ff024-2c9c-479e-b734-4cfd1e69ac91" (UID: "231ff024-2c9c-479e-b734-4cfd1e69ac91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.174821 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231ff024-2c9c-479e-b734-4cfd1e69ac91-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.432508 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-644bd545d4-m82n9" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.591068 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"231ff024-2c9c-479e-b734-4cfd1e69ac91","Type":"ContainerDied","Data":"24d40f1032fb30023dc6a40a58db4cbcaa95ba891829c2ea5bf8d3e89fe662ed"} Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.591119 4723 scope.go:117] "RemoveContainer" containerID="c08169c322d1c84997d310c42a57a10b911c1a97c2dd12f578eef4048001fbe9" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.591202 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.641560 4723 scope.go:117] "RemoveContainer" containerID="726d8ac4d64caa2c94d96ce2787c4b32183803e5054b4daa79acc270575e859e" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.647882 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.682021 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.693960 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:21:57 crc kubenswrapper[4723]: E0309 13:21:57.694412 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231ff024-2c9c-479e-b734-4cfd1e69ac91" containerName="probe" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.694430 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="231ff024-2c9c-479e-b734-4cfd1e69ac91" containerName="probe" Mar 09 13:21:57 crc kubenswrapper[4723]: E0309 13:21:57.694447 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231ff024-2c9c-479e-b734-4cfd1e69ac91" containerName="cinder-scheduler" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.694453 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="231ff024-2c9c-479e-b734-4cfd1e69ac91" containerName="cinder-scheduler" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.694780 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="231ff024-2c9c-479e-b734-4cfd1e69ac91" containerName="probe" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.694811 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="231ff024-2c9c-479e-b734-4cfd1e69ac91" containerName="cinder-scheduler" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.695960 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.702224 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.714760 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.793100 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb802a89-59e3-4b45-bb49-20b980e06a57-scripts\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.793192 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb802a89-59e3-4b45-bb49-20b980e06a57-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.793244 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb802a89-59e3-4b45-bb49-20b980e06a57-config-data\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.793335 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb802a89-59e3-4b45-bb49-20b980e06a57-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.793381 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4bvt\" (UniqueName: \"kubernetes.io/projected/cb802a89-59e3-4b45-bb49-20b980e06a57-kube-api-access-f4bvt\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.793400 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb802a89-59e3-4b45-bb49-20b980e06a57-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.895584 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb802a89-59e3-4b45-bb49-20b980e06a57-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.895658 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb802a89-59e3-4b45-bb49-20b980e06a57-config-data\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.895763 4723 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb802a89-59e3-4b45-bb49-20b980e06a57-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.895839 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4bvt\" (UniqueName: \"kubernetes.io/projected/cb802a89-59e3-4b45-bb49-20b980e06a57-kube-api-access-f4bvt\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.895889 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb802a89-59e3-4b45-bb49-20b980e06a57-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.895938 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb802a89-59e3-4b45-bb49-20b980e06a57-scripts\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.896783 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb802a89-59e3-4b45-bb49-20b980e06a57-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.907538 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb802a89-59e3-4b45-bb49-20b980e06a57-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.907562 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb802a89-59e3-4b45-bb49-20b980e06a57-scripts\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.907759 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb802a89-59e3-4b45-bb49-20b980e06a57-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.908359 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb802a89-59e3-4b45-bb49-20b980e06a57-config-data\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 13:21:57 crc kubenswrapper[4723]: I0309 13:21:57.917526 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4bvt\" (UniqueName: \"kubernetes.io/projected/cb802a89-59e3-4b45-bb49-20b980e06a57-kube-api-access-f4bvt\") pod \"cinder-scheduler-0\" (UID: \"cb802a89-59e3-4b45-bb49-20b980e06a57\") " pod="openstack/cinder-scheduler-0" Mar 09 
13:21:58 crc kubenswrapper[4723]: I0309 13:21:58.026219 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:21:58 crc kubenswrapper[4723]: I0309 13:21:58.630598 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:21:58 crc kubenswrapper[4723]: I0309 13:21:58.937140 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231ff024-2c9c-479e-b734-4cfd1e69ac91" path="/var/lib/kubelet/pods/231ff024-2c9c-479e-b734-4cfd1e69ac91/volumes" Mar 09 13:21:58 crc kubenswrapper[4723]: I0309 13:21:58.938005 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 09 13:21:58 crc kubenswrapper[4723]: I0309 13:21:58.939614 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 09 13:21:58 crc kubenswrapper[4723]: I0309 13:21:58.944148 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-h8rq2" Mar 09 13:21:58 crc kubenswrapper[4723]: I0309 13:21:58.944295 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 09 13:21:58 crc kubenswrapper[4723]: I0309 13:21:58.944408 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 09 13:21:58 crc kubenswrapper[4723]: I0309 13:21:58.947491 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 13:21:59.029353 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2c7e526b-cb25-4469-a6bd-b19fa44ca499-openstack-config\") pod \"openstackclient\" (UID: \"2c7e526b-cb25-4469-a6bd-b19fa44ca499\") " pod="openstack/openstackclient" Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 13:21:59.029619 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7e526b-cb25-4469-a6bd-b19fa44ca499-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2c7e526b-cb25-4469-a6bd-b19fa44ca499\") " pod="openstack/openstackclient" Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 13:21:59.029648 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2c7e526b-cb25-4469-a6bd-b19fa44ca499-openstack-config-secret\") pod \"openstackclient\" (UID: \"2c7e526b-cb25-4469-a6bd-b19fa44ca499\") " pod="openstack/openstackclient" Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 13:21:59.029668 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcp5z\" (UniqueName: \"kubernetes.io/projected/2c7e526b-cb25-4469-a6bd-b19fa44ca499-kube-api-access-kcp5z\") pod \"openstackclient\" (UID: \"2c7e526b-cb25-4469-a6bd-b19fa44ca499\") " pod="openstack/openstackclient" Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 13:21:59.132837 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7e526b-cb25-4469-a6bd-b19fa44ca499-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2c7e526b-cb25-4469-a6bd-b19fa44ca499\") " pod="openstack/openstackclient" Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 
13:21:59.132908 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2c7e526b-cb25-4469-a6bd-b19fa44ca499-openstack-config-secret\") pod \"openstackclient\" (UID: \"2c7e526b-cb25-4469-a6bd-b19fa44ca499\") " pod="openstack/openstackclient" Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 13:21:59.132932 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcp5z\" (UniqueName: \"kubernetes.io/projected/2c7e526b-cb25-4469-a6bd-b19fa44ca499-kube-api-access-kcp5z\") pod \"openstackclient\" (UID: \"2c7e526b-cb25-4469-a6bd-b19fa44ca499\") " pod="openstack/openstackclient" Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 13:21:59.133002 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2c7e526b-cb25-4469-a6bd-b19fa44ca499-openstack-config\") pod \"openstackclient\" (UID: \"2c7e526b-cb25-4469-a6bd-b19fa44ca499\") " pod="openstack/openstackclient" Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 13:21:59.133987 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2c7e526b-cb25-4469-a6bd-b19fa44ca499-openstack-config\") pod \"openstackclient\" (UID: \"2c7e526b-cb25-4469-a6bd-b19fa44ca499\") " pod="openstack/openstackclient" Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 13:21:59.137113 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7e526b-cb25-4469-a6bd-b19fa44ca499-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2c7e526b-cb25-4469-a6bd-b19fa44ca499\") " pod="openstack/openstackclient" Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 13:21:59.145850 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2c7e526b-cb25-4469-a6bd-b19fa44ca499-openstack-config-secret\") pod \"openstackclient\" (UID: \"2c7e526b-cb25-4469-a6bd-b19fa44ca499\") " pod="openstack/openstackclient" Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 13:21:59.153670 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcp5z\" (UniqueName: \"kubernetes.io/projected/2c7e526b-cb25-4469-a6bd-b19fa44ca499-kube-api-access-kcp5z\") pod \"openstackclient\" (UID: \"2c7e526b-cb25-4469-a6bd-b19fa44ca499\") " pod="openstack/openstackclient" Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 13:21:59.278183 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 13:21:59.650658 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb802a89-59e3-4b45-bb49-20b980e06a57","Type":"ContainerStarted","Data":"f293aebdd6945999452439de2837f319ff8a2c8274c1112a5350754b0b9c8f7e"} Mar 09 13:21:59 crc kubenswrapper[4723]: I0309 13:21:59.651077 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb802a89-59e3-4b45-bb49-20b980e06a57","Type":"ContainerStarted","Data":"c9649d566c26002041736bbdcc08d3c6e9c073a1175a2224bfca57b959b5a9a6"} Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.011186 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.142639 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551042-bkmb7"] Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.144017 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551042-bkmb7" Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.147583 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.147906 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.148038 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.164423 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551042-bkmb7"] Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.176159 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5r9w\" (UniqueName: \"kubernetes.io/projected/54bb3205-95cb-4772-977a-3f33fcfe1ab3-kube-api-access-x5r9w\") pod \"auto-csr-approver-29551042-bkmb7\" (UID: \"54bb3205-95cb-4772-977a-3f33fcfe1ab3\") " pod="openshift-infra/auto-csr-approver-29551042-bkmb7" Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.278047 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5r9w\" (UniqueName: \"kubernetes.io/projected/54bb3205-95cb-4772-977a-3f33fcfe1ab3-kube-api-access-x5r9w\") pod \"auto-csr-approver-29551042-bkmb7\" (UID: \"54bb3205-95cb-4772-977a-3f33fcfe1ab3\") " pod="openshift-infra/auto-csr-approver-29551042-bkmb7" Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.298637 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5r9w\" (UniqueName: \"kubernetes.io/projected/54bb3205-95cb-4772-977a-3f33fcfe1ab3-kube-api-access-x5r9w\") pod \"auto-csr-approver-29551042-bkmb7\" (UID: \"54bb3205-95cb-4772-977a-3f33fcfe1ab3\") " pod="openshift-infra/auto-csr-approver-29551042-bkmb7" Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.462108 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551042-bkmb7" Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.671158 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2c7e526b-cb25-4469-a6bd-b19fa44ca499","Type":"ContainerStarted","Data":"cbb18624690dd06ef8f2eb5d3dfee30da78816bbae17e037777fc3e8152cc29a"} Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.695530 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb802a89-59e3-4b45-bb49-20b980e06a57","Type":"ContainerStarted","Data":"cbdf059cc57ed0e56c4c4026dc4303e1c36f10b16a3211b8748bc590da93f74d"} Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.721877 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.7218458869999997 podStartE2EDuration="3.721845887s" podCreationTimestamp="2026-03-09 13:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:00.717354888 +0000 UTC m=+1394.731822428" watchObservedRunningTime="2026-03-09 13:22:00.721845887 +0000 UTC m=+1394.736313427" Mar 09 13:22:00 crc kubenswrapper[4723]: I0309 13:22:00.961954 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551042-bkmb7"] Mar 09 13:22:01 crc kubenswrapper[4723]: I0309 13:22:01.094035 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 09 13:22:01 crc kubenswrapper[4723]: I0309 13:22:01.727602 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551042-bkmb7" event={"ID":"54bb3205-95cb-4772-977a-3f33fcfe1ab3","Type":"ContainerStarted","Data":"289c01eeec60e9640dc6bc2ad144d78be52c7a09ca8e914aac38335379180ea8"} Mar 09 13:22:02 crc kubenswrapper[4723]: I0309 13:22:02.739567 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551042-bkmb7" event={"ID":"54bb3205-95cb-4772-977a-3f33fcfe1ab3","Type":"ContainerStarted","Data":"0809588e36ace37a3be426468c6babbc9559b368029ab389c1fa467da8beb254"} Mar 09 13:22:02 crc kubenswrapper[4723]: I0309 13:22:02.763123 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551042-bkmb7" podStartSLOduration=1.736654604 podStartE2EDuration="2.76310618s" podCreationTimestamp="2026-03-09 13:22:00 +0000 UTC" firstStartedPulling="2026-03-09 13:22:00.969354806 +0000 UTC m=+1394.983822346" lastFinishedPulling="2026-03-09 13:22:01.995806372 +0000 UTC m=+1396.010273922" observedRunningTime="2026-03-09 13:22:02.75595604 +0000 UTC m=+1396.770423580" watchObservedRunningTime="2026-03-09 13:22:02.76310618 +0000 UTC m=+1396.777573720" Mar 09 13:22:03 crc kubenswrapper[4723]: I0309 13:22:03.026929 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.738907 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-575fd88557-l2fxr"] Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.741749 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.747434 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.747682 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.753425 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.765978 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-575fd88557-l2fxr"] Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.788523 4723 generic.go:334] "Generic (PLEG): container finished" podID="54bb3205-95cb-4772-977a-3f33fcfe1ab3" containerID="0809588e36ace37a3be426468c6babbc9559b368029ab389c1fa467da8beb254" exitCode=0 Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.788602 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551042-bkmb7" event={"ID":"54bb3205-95cb-4772-977a-3f33fcfe1ab3","Type":"ContainerDied","Data":"0809588e36ace37a3be426468c6babbc9559b368029ab389c1fa467da8beb254"} Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.838546 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/818af82f-16fa-47eb-a868-2272b915b99c-log-httpd\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.838622 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/818af82f-16fa-47eb-a868-2272b915b99c-run-httpd\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.838665 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/818af82f-16fa-47eb-a868-2272b915b99c-public-tls-certs\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.838784 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818af82f-16fa-47eb-a868-2272b915b99c-combined-ca-bundle\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.838820 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818af82f-16fa-47eb-a868-2272b915b99c-config-data\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.838849 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffmf9\" (UniqueName: 
\"kubernetes.io/projected/818af82f-16fa-47eb-a868-2272b915b99c-kube-api-access-ffmf9\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.838901 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/818af82f-16fa-47eb-a868-2272b915b99c-internal-tls-certs\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.838929 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/818af82f-16fa-47eb-a868-2272b915b99c-etc-swift\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.940595 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818af82f-16fa-47eb-a868-2272b915b99c-combined-ca-bundle\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.940842 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818af82f-16fa-47eb-a868-2272b915b99c-config-data\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.940940 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffmf9\" (UniqueName: \"kubernetes.io/projected/818af82f-16fa-47eb-a868-2272b915b99c-kube-api-access-ffmf9\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.941027 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/818af82f-16fa-47eb-a868-2272b915b99c-internal-tls-certs\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.941095 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/818af82f-16fa-47eb-a868-2272b915b99c-etc-swift\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.941248 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/818af82f-16fa-47eb-a868-2272b915b99c-log-httpd\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.941334 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/818af82f-16fa-47eb-a868-2272b915b99c-run-httpd\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.941415 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/818af82f-16fa-47eb-a868-2272b915b99c-public-tls-certs\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.948856 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/818af82f-16fa-47eb-a868-2272b915b99c-log-httpd\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.951406 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/818af82f-16fa-47eb-a868-2272b915b99c-run-httpd\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.952619 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818af82f-16fa-47eb-a868-2272b915b99c-config-data\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.954927 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/818af82f-16fa-47eb-a868-2272b915b99c-etc-swift\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.955431 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/818af82f-16fa-47eb-a868-2272b915b99c-internal-tls-certs\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.955765 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/818af82f-16fa-47eb-a868-2272b915b99c-public-tls-certs\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.968984 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818af82f-16fa-47eb-a868-2272b915b99c-combined-ca-bundle\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: \"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:04 crc kubenswrapper[4723]: I0309 13:22:04.970688 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffmf9\" (UniqueName: \"kubernetes.io/projected/818af82f-16fa-47eb-a868-2272b915b99c-kube-api-access-ffmf9\") pod \"swift-proxy-575fd88557-l2fxr\" (UID: 
\"818af82f-16fa-47eb-a868-2272b915b99c\") " pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.037284 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-88895"] Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.039042 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-88895" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.048543 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-88895"] Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.069069 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.142915 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-4cgtr"] Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.144754 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4cgtr" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.149191 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c035e2e5-1d3a-4254-8950-b6893fc60ff3-operator-scripts\") pod \"nova-api-db-create-88895\" (UID: \"c035e2e5-1d3a-4254-8950-b6893fc60ff3\") " pod="openstack/nova-api-db-create-88895" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.149346 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbcx7\" (UniqueName: \"kubernetes.io/projected/c035e2e5-1d3a-4254-8950-b6893fc60ff3-kube-api-access-fbcx7\") pod \"nova-api-db-create-88895\" (UID: \"c035e2e5-1d3a-4254-8950-b6893fc60ff3\") " pod="openstack/nova-api-db-create-88895" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.191788 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4cgtr"] Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.235634 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-64d0-account-create-update-92s6n"] Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.237612 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-64d0-account-create-update-92s6n" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.241679 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.252925 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbcx7\" (UniqueName: \"kubernetes.io/projected/c035e2e5-1d3a-4254-8950-b6893fc60ff3-kube-api-access-fbcx7\") pod \"nova-api-db-create-88895\" (UID: \"c035e2e5-1d3a-4254-8950-b6893fc60ff3\") " pod="openstack/nova-api-db-create-88895" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.253056 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f-operator-scripts\") pod \"nova-cell0-db-create-4cgtr\" (UID: \"f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f\") " pod="openstack/nova-cell0-db-create-4cgtr" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.253148 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c035e2e5-1d3a-4254-8950-b6893fc60ff3-operator-scripts\") pod \"nova-api-db-create-88895\" (UID: \"c035e2e5-1d3a-4254-8950-b6893fc60ff3\") " pod="openstack/nova-api-db-create-88895" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.253193 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8vjh\" (UniqueName: \"kubernetes.io/projected/f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f-kube-api-access-k8vjh\") pod \"nova-cell0-db-create-4cgtr\" (UID: \"f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f\") " pod="openstack/nova-cell0-db-create-4cgtr" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.255059 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-64d0-account-create-update-92s6n"] Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.258211 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c035e2e5-1d3a-4254-8950-b6893fc60ff3-operator-scripts\") pod \"nova-api-db-create-88895\" (UID: \"c035e2e5-1d3a-4254-8950-b6893fc60ff3\") " pod="openstack/nova-api-db-create-88895" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.285571 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbcx7\" (UniqueName: \"kubernetes.io/projected/c035e2e5-1d3a-4254-8950-b6893fc60ff3-kube-api-access-fbcx7\") pod \"nova-api-db-create-88895\" (UID: \"c035e2e5-1d3a-4254-8950-b6893fc60ff3\") " pod="openstack/nova-api-db-create-88895" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.354136 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nlv65"] Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.355596 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nlv65" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.360242 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f-operator-scripts\") pod \"nova-cell0-db-create-4cgtr\" (UID: \"f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f\") " pod="openstack/nova-cell0-db-create-4cgtr" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.360401 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573e22bf-ce84-4ce1-bd2d-45f52b8cd30a-operator-scripts\") pod \"nova-api-64d0-account-create-update-92s6n\" (UID: \"573e22bf-ce84-4ce1-bd2d-45f52b8cd30a\") " pod="openstack/nova-api-64d0-account-create-update-92s6n" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.360484 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8vjh\" (UniqueName: \"kubernetes.io/projected/f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f-kube-api-access-k8vjh\") pod \"nova-cell0-db-create-4cgtr\" (UID: \"f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f\") " pod="openstack/nova-cell0-db-create-4cgtr" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.360555 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2hll\" (UniqueName: \"kubernetes.io/projected/573e22bf-ce84-4ce1-bd2d-45f52b8cd30a-kube-api-access-t2hll\") pod \"nova-api-64d0-account-create-update-92s6n\" (UID: \"573e22bf-ce84-4ce1-bd2d-45f52b8cd30a\") " pod="openstack/nova-api-64d0-account-create-update-92s6n" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.366981 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nlv65"] Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.378774 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f-operator-scripts\") pod \"nova-cell0-db-create-4cgtr\" (UID: \"f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f\") " pod="openstack/nova-cell0-db-create-4cgtr" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.378841 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8b2e-account-create-update-jp7ld"] Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.380601 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8b2e-account-create-update-jp7ld" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.385164 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.389581 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8b2e-account-create-update-jp7ld"] Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.397588 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-88895" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.397682 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8vjh\" (UniqueName: \"kubernetes.io/projected/f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f-kube-api-access-k8vjh\") pod \"nova-cell0-db-create-4cgtr\" (UID: \"f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f\") " pod="openstack/nova-cell0-db-create-4cgtr" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.464258 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j48n2\" (UniqueName: \"kubernetes.io/projected/a08da534-32c5-4d52-ba3f-2bc7a8f491c4-kube-api-access-j48n2\") pod \"nova-cell0-8b2e-account-create-update-jp7ld\" (UID: \"a08da534-32c5-4d52-ba3f-2bc7a8f491c4\") " pod="openstack/nova-cell0-8b2e-account-create-update-jp7ld" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.464353 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573e22bf-ce84-4ce1-bd2d-45f52b8cd30a-operator-scripts\") pod \"nova-api-64d0-account-create-update-92s6n\" (UID: \"573e22bf-ce84-4ce1-bd2d-45f52b8cd30a\") " pod="openstack/nova-api-64d0-account-create-update-92s6n" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.464396 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l69c\" (UniqueName: \"kubernetes.io/projected/f82b3a10-19c5-4071-9ab5-5356f38bf35e-kube-api-access-4l69c\") pod \"nova-cell1-db-create-nlv65\" (UID: \"f82b3a10-19c5-4071-9ab5-5356f38bf35e\") " pod="openstack/nova-cell1-db-create-nlv65" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.464476 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2hll\" (UniqueName: \"kubernetes.io/projected/573e22bf-ce84-4ce1-bd2d-45f52b8cd30a-kube-api-access-t2hll\") pod \"nova-api-64d0-account-create-update-92s6n\" (UID: \"573e22bf-ce84-4ce1-bd2d-45f52b8cd30a\") " pod="openstack/nova-api-64d0-account-create-update-92s6n" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.464541 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f82b3a10-19c5-4071-9ab5-5356f38bf35e-operator-scripts\") pod \"nova-cell1-db-create-nlv65\" (UID: \"f82b3a10-19c5-4071-9ab5-5356f38bf35e\") " pod="openstack/nova-cell1-db-create-nlv65" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.464565 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08da534-32c5-4d52-ba3f-2bc7a8f491c4-operator-scripts\") pod \"nova-cell0-8b2e-account-create-update-jp7ld\" (UID: \"a08da534-32c5-4d52-ba3f-2bc7a8f491c4\") " pod="openstack/nova-cell0-8b2e-account-create-update-jp7ld" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.465413 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573e22bf-ce84-4ce1-bd2d-45f52b8cd30a-operator-scripts\") pod \"nova-api-64d0-account-create-update-92s6n\" (UID: \"573e22bf-ce84-4ce1-bd2d-45f52b8cd30a\") " pod="openstack/nova-api-64d0-account-create-update-92s6n" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.485013 4723 operation_generator.go:637] "MountVolume.SetUp 
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.491911 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.492273 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="ceilometer-central-agent" containerID="cri-o://dd5b500095dd0c60df2c62d705c3589e5648c8728a83ccd454d9f0ec119c1858" gracePeriod=30
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.494298 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="sg-core" containerID="cri-o://276aafd0a7cfae040a0d105fce63aa7cb3f85af7e501922b266342496107a3f1" gracePeriod=30
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.494427 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="proxy-httpd" containerID="cri-o://e642cbfe23c56063d2e6dc0b223e408313d261acad06028681a662711dd1b182" gracePeriod=30
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.494478 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="ceilometer-notification-agent" containerID="cri-o://dd5c68fc27791b5cb65c4cab5cff1b3860a58331b107eedd79abd83bc841ce93" gracePeriod=30
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.501142 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.520017 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4cgtr"
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.558833 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-64d0-account-create-update-92s6n"
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.567398 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j48n2\" (UniqueName: \"kubernetes.io/projected/a08da534-32c5-4d52-ba3f-2bc7a8f491c4-kube-api-access-j48n2\") pod \"nova-cell0-8b2e-account-create-update-jp7ld\" (UID: \"a08da534-32c5-4d52-ba3f-2bc7a8f491c4\") " pod="openstack/nova-cell0-8b2e-account-create-update-jp7ld"
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.567479 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l69c\" (UniqueName: \"kubernetes.io/projected/f82b3a10-19c5-4071-9ab5-5356f38bf35e-kube-api-access-4l69c\") pod \"nova-cell1-db-create-nlv65\" (UID: \"f82b3a10-19c5-4071-9ab5-5356f38bf35e\") " pod="openstack/nova-cell1-db-create-nlv65"
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.567586 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f82b3a10-19c5-4071-9ab5-5356f38bf35e-operator-scripts\") pod \"nova-cell1-db-create-nlv65\" (UID: \"f82b3a10-19c5-4071-9ab5-5356f38bf35e\") " pod="openstack/nova-cell1-db-create-nlv65"
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.567603 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08da534-32c5-4d52-ba3f-2bc7a8f491c4-operator-scripts\") pod \"nova-cell0-8b2e-account-create-update-jp7ld\" (UID: \"a08da534-32c5-4d52-ba3f-2bc7a8f491c4\") " pod="openstack/nova-cell0-8b2e-account-create-update-jp7ld"
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.568401 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08da534-32c5-4d52-ba3f-2bc7a8f491c4-operator-scripts\") pod \"nova-cell0-8b2e-account-create-update-jp7ld\" (UID: \"a08da534-32c5-4d52-ba3f-2bc7a8f491c4\") " pod="openstack/nova-cell0-8b2e-account-create-update-jp7ld"
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.568404 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f82b3a10-19c5-4071-9ab5-5356f38bf35e-operator-scripts\") pod \"nova-cell1-db-create-nlv65\" (UID: \"f82b3a10-19c5-4071-9ab5-5356f38bf35e\") " pod="openstack/nova-cell1-db-create-nlv65"
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.573595 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-948f-account-create-update-2t9dn"]
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.585153 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l69c\" (UniqueName: \"kubernetes.io/projected/f82b3a10-19c5-4071-9ab5-5356f38bf35e-kube-api-access-4l69c\") pod \"nova-cell1-db-create-nlv65\" (UID: \"f82b3a10-19c5-4071-9ab5-5356f38bf35e\") " pod="openstack/nova-cell1-db-create-nlv65"
Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.587143 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-948f-account-create-update-2t9dn"
Need to start a new one" pod="openstack/nova-cell1-948f-account-create-update-2t9dn" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.593044 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.593754 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j48n2\" (UniqueName: \"kubernetes.io/projected/a08da534-32c5-4d52-ba3f-2bc7a8f491c4-kube-api-access-j48n2\") pod \"nova-cell0-8b2e-account-create-update-jp7ld\" (UID: \"a08da534-32c5-4d52-ba3f-2bc7a8f491c4\") " pod="openstack/nova-cell0-8b2e-account-create-update-jp7ld" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.605469 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-948f-account-create-update-2t9dn"] Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.670791 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36f5570d-569f-4871-9be2-bc1650c32fb8-operator-scripts\") pod \"nova-cell1-948f-account-create-update-2t9dn\" (UID: \"36f5570d-569f-4871-9be2-bc1650c32fb8\") " pod="openstack/nova-cell1-948f-account-create-update-2t9dn" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.671010 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc2b2\" (UniqueName: \"kubernetes.io/projected/36f5570d-569f-4871-9be2-bc1650c32fb8-kube-api-access-tc2b2\") pod \"nova-cell1-948f-account-create-update-2t9dn\" (UID: \"36f5570d-569f-4871-9be2-bc1650c32fb8\") " pod="openstack/nova-cell1-948f-account-create-update-2t9dn" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.691106 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nlv65" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.716881 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8b2e-account-create-update-jp7ld" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.775306 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc2b2\" (UniqueName: \"kubernetes.io/projected/36f5570d-569f-4871-9be2-bc1650c32fb8-kube-api-access-tc2b2\") pod \"nova-cell1-948f-account-create-update-2t9dn\" (UID: \"36f5570d-569f-4871-9be2-bc1650c32fb8\") " pod="openstack/nova-cell1-948f-account-create-update-2t9dn" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.775621 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36f5570d-569f-4871-9be2-bc1650c32fb8-operator-scripts\") pod \"nova-cell1-948f-account-create-update-2t9dn\" (UID: \"36f5570d-569f-4871-9be2-bc1650c32fb8\") " pod="openstack/nova-cell1-948f-account-create-update-2t9dn" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.777395 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36f5570d-569f-4871-9be2-bc1650c32fb8-operator-scripts\") pod \"nova-cell1-948f-account-create-update-2t9dn\" (UID: \"36f5570d-569f-4871-9be2-bc1650c32fb8\") " pod="openstack/nova-cell1-948f-account-create-update-2t9dn" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.801311 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc2b2\" (UniqueName: \"kubernetes.io/projected/36f5570d-569f-4871-9be2-bc1650c32fb8-kube-api-access-tc2b2\") pod \"nova-cell1-948f-account-create-update-2t9dn\" (UID: \"36f5570d-569f-4871-9be2-bc1650c32fb8\") " pod="openstack/nova-cell1-948f-account-create-update-2t9dn" Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.827850 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-575fd88557-l2fxr"] Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.857822 4723 generic.go:334] "Generic (PLEG): container finished" podID="868faedb-4c6c-479b-b519-45c8a252dee8" containerID="e642cbfe23c56063d2e6dc0b223e408313d261acad06028681a662711dd1b182" exitCode=0 Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.857881 4723 generic.go:334] "Generic (PLEG): container finished" podID="868faedb-4c6c-479b-b519-45c8a252dee8" containerID="276aafd0a7cfae040a0d105fce63aa7cb3f85af7e501922b266342496107a3f1" exitCode=2 Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.858057 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"868faedb-4c6c-479b-b519-45c8a252dee8","Type":"ContainerDied","Data":"e642cbfe23c56063d2e6dc0b223e408313d261acad06028681a662711dd1b182"} Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.858085 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"868faedb-4c6c-479b-b519-45c8a252dee8","Type":"ContainerDied","Data":"276aafd0a7cfae040a0d105fce63aa7cb3f85af7e501922b266342496107a3f1"} Mar 09 13:22:05 crc kubenswrapper[4723]: I0309 13:22:05.917563 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-948f-account-create-update-2t9dn" Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.230391 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-88895"] Mar 09 13:22:06 crc kubenswrapper[4723]: W0309 13:22:06.254947 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc035e2e5_1d3a_4254_8950_b6893fc60ff3.slice/crio-0d5053ebb9369a70282ff7dfc1e6ee19d2981c0fa86c324de49373a048872945 WatchSource:0}: Error finding container 0d5053ebb9369a70282ff7dfc1e6ee19d2981c0fa86c324de49373a048872945: Status 404 returned error can't find the container with id 0d5053ebb9369a70282ff7dfc1e6ee19d2981c0fa86c324de49373a048872945 Mar 09 13:22:06 crc kubenswrapper[4723]: W0309 13:22:06.313466 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod573e22bf_ce84_4ce1_bd2d_45f52b8cd30a.slice/crio-df9463658692947dbaa6646095a797d8c75c0689e76bf4cf416a225ab48f4052 WatchSource:0}: Error finding container df9463658692947dbaa6646095a797d8c75c0689e76bf4cf416a225ab48f4052: Status 404 returned error can't find the container with id df9463658692947dbaa6646095a797d8c75c0689e76bf4cf416a225ab48f4052 Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.330521 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-64d0-account-create-update-92s6n"] Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.681046 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551042-bkmb7" Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.789875 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8b2e-account-create-update-jp7ld"] Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.803769 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4cgtr"] Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.844203 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5r9w\" (UniqueName: \"kubernetes.io/projected/54bb3205-95cb-4772-977a-3f33fcfe1ab3-kube-api-access-x5r9w\") pod \"54bb3205-95cb-4772-977a-3f33fcfe1ab3\" (UID: \"54bb3205-95cb-4772-977a-3f33fcfe1ab3\") " Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.852133 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54bb3205-95cb-4772-977a-3f33fcfe1ab3-kube-api-access-x5r9w" (OuterVolumeSpecName: "kube-api-access-x5r9w") pod "54bb3205-95cb-4772-977a-3f33fcfe1ab3" (UID: "54bb3205-95cb-4772-977a-3f33fcfe1ab3"). InnerVolumeSpecName "kube-api-access-x5r9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.949090 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5r9w\" (UniqueName: \"kubernetes.io/projected/54bb3205-95cb-4772-977a-3f33fcfe1ab3-kube-api-access-x5r9w\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.953317 4723 generic.go:334] "Generic (PLEG): container finished" podID="868faedb-4c6c-479b-b519-45c8a252dee8" containerID="dd5c68fc27791b5cb65c4cab5cff1b3860a58331b107eedd79abd83bc841ce93" exitCode=0 Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.953345 4723 generic.go:334] "Generic (PLEG): container finished" podID="868faedb-4c6c-479b-b519-45c8a252dee8" containerID="dd5b500095dd0c60df2c62d705c3589e5648c8728a83ccd454d9f0ec119c1858" exitCode=0 Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.959047 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551042-bkmb7" Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.967323 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8b2e-account-create-update-jp7ld" event={"ID":"a08da534-32c5-4d52-ba3f-2bc7a8f491c4","Type":"ContainerStarted","Data":"bf1a79e51e16c0a1295352b174e4e70a53c788c00e617aa448c75faec3687708"} Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.967366 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551042-bkmb7" event={"ID":"54bb3205-95cb-4772-977a-3f33fcfe1ab3","Type":"ContainerDied","Data":"289c01eeec60e9640dc6bc2ad144d78be52c7a09ca8e914aac38335379180ea8"} Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.967387 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="289c01eeec60e9640dc6bc2ad144d78be52c7a09ca8e914aac38335379180ea8" Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.967401 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551036-5g57t"] Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.967421 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551036-5g57t"] Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.967440 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4cgtr" event={"ID":"f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f","Type":"ContainerStarted","Data":"50fcce62ceaa3749d0287f66112ae4853c8624aadc970fce8bdbce0ce8b97007"} Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.967453 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-575fd88557-l2fxr" event={"ID":"818af82f-16fa-47eb-a868-2272b915b99c","Type":"ContainerStarted","Data":"4ee26bcccfbb546f21b7107d8e523344c9daf88cbb30aa26e4469fd943ea5d2c"} Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.967466 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-575fd88557-l2fxr" event={"ID":"818af82f-16fa-47eb-a868-2272b915b99c","Type":"ContainerStarted","Data":"9fc17ffece694e7387f8ffe7822f472a4a1c95502a915ce53a41599fa3ac79de"} Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.967479 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-575fd88557-l2fxr" event={"ID":"818af82f-16fa-47eb-a868-2272b915b99c","Type":"ContainerStarted","Data":"bf292f718cd2636898fa235cddd8c540f99e7972a423f5055fece665328282fd"} Mar 09 13:22:06 crc 
Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.967504 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"868faedb-4c6c-479b-b519-45c8a252dee8","Type":"ContainerDied","Data":"dd5b500095dd0c60df2c62d705c3589e5648c8728a83ccd454d9f0ec119c1858"}
Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.967516 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-88895" event={"ID":"c035e2e5-1d3a-4254-8950-b6893fc60ff3","Type":"ContainerStarted","Data":"dad61ce22585c2f465388c8d8e66e33317bad4e22ce62cb653d79435cc799155"}
Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.967529 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-88895" event={"ID":"c035e2e5-1d3a-4254-8950-b6893fc60ff3","Type":"ContainerStarted","Data":"0d5053ebb9369a70282ff7dfc1e6ee19d2981c0fa86c324de49373a048872945"}
Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.967684 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-64d0-account-create-update-92s6n" event={"ID":"573e22bf-ce84-4ce1-bd2d-45f52b8cd30a","Type":"ContainerStarted","Data":"e727cb285e0827dde49c8e97f0f4423d19579378350dc70ca73915fd30d39bdc"}
Mar 09 13:22:06 crc kubenswrapper[4723]: I0309 13:22:06.967703 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-64d0-account-create-update-92s6n" event={"ID":"573e22bf-ce84-4ce1-bd2d-45f52b8cd30a","Type":"ContainerStarted","Data":"df9463658692947dbaa6646095a797d8c75c0689e76bf4cf416a225ab48f4052"}
Mar 09 13:22:07 crc kubenswrapper[4723]: I0309 13:22:07.148753 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nlv65"]
Mar 09 13:22:07 crc kubenswrapper[4723]: I0309 13:22:07.169540 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-948f-account-create-update-2t9dn"]
Mar 09 13:22:07 crc kubenswrapper[4723]: I0309 13:22:07.180552 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-575fd88557-l2fxr" podStartSLOduration=3.18053169 podStartE2EDuration="3.18053169s" podCreationTimestamp="2026-03-09 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:07.118881849 +0000 UTC m=+1401.133349409" watchObservedRunningTime="2026-03-09 13:22:07.18053169 +0000 UTC m=+1401.194999230"
Mar 09 13:22:07 crc kubenswrapper[4723]: I0309 13:22:07.186398 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-64d0-account-create-update-92s6n" podStartSLOduration=2.186381145 podStartE2EDuration="2.186381145s" podCreationTimestamp="2026-03-09 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:07.133028213 +0000 UTC m=+1401.147495753" watchObservedRunningTime="2026-03-09 13:22:07.186381145 +0000 UTC m=+1401.200848685"
Mar 09 13:22:07 crc kubenswrapper[4723]: I0309 13:22:07.195803 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-88895" podStartSLOduration=2.195784394 podStartE2EDuration="2.195784394s" podCreationTimestamp="2026-03-09 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:07.146041897 +0000 UTC m=+1401.160509427" watchObservedRunningTime="2026-03-09 13:22:07.195784394 +0000 UTC m=+1401.210251934"
Mar 09 13:22:07 crc kubenswrapper[4723]: I0309 13:22:07.985198 4723 generic.go:334] "Generic (PLEG): container finished" podID="573e22bf-ce84-4ce1-bd2d-45f52b8cd30a" containerID="e727cb285e0827dde49c8e97f0f4423d19579378350dc70ca73915fd30d39bdc" exitCode=0
Mar 09 13:22:07 crc kubenswrapper[4723]: I0309 13:22:07.985591 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-64d0-account-create-update-92s6n" event={"ID":"573e22bf-ce84-4ce1-bd2d-45f52b8cd30a","Type":"ContainerDied","Data":"e727cb285e0827dde49c8e97f0f4423d19579378350dc70ca73915fd30d39bdc"}
Mar 09 13:22:07 crc kubenswrapper[4723]: I0309 13:22:07.988828 4723 generic.go:334] "Generic (PLEG): container finished" podID="c035e2e5-1d3a-4254-8950-b6893fc60ff3" containerID="dad61ce22585c2f465388c8d8e66e33317bad4e22ce62cb653d79435cc799155" exitCode=0
Mar 09 13:22:07 crc kubenswrapper[4723]: I0309 13:22:07.988912 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-88895" event={"ID":"c035e2e5-1d3a-4254-8950-b6893fc60ff3","Type":"ContainerDied","Data":"dad61ce22585c2f465388c8d8e66e33317bad4e22ce62cb653d79435cc799155"}
Mar 09 13:22:07 crc kubenswrapper[4723]: I0309 13:22:07.989248 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-575fd88557-l2fxr"
Mar 09 13:22:07 crc kubenswrapper[4723]: I0309 13:22:07.989389 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-575fd88557-l2fxr"
Mar 09 13:22:08 crc kubenswrapper[4723]: I0309 13:22:08.267962 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 09 13:22:08 crc kubenswrapper[4723]: I0309 13:22:08.898219 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ad9c8b7-3845-41fa-b73a-88cb02635900" path="/var/lib/kubelet/pods/5ad9c8b7-3845-41fa-b73a-88cb02635900/volumes"
Mar 09 13:22:11 crc kubenswrapper[4723]: I0309 13:22:11.201675 4723 scope.go:117] "RemoveContainer" containerID="b70a1ab9ed74a98e75c02efbf4800c8b4082bf380fe425243c91da6c6eee89da"
Mar 09 13:22:12 crc kubenswrapper[4723]: I0309 13:22:12.332398 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6556fbf64c-254qf"
Mar 09 13:22:12 crc kubenswrapper[4723]: I0309 13:22:12.416070 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-597569c5dd-vxwdd"]
Mar 09 13:22:12 crc kubenswrapper[4723]: I0309 13:22:12.416530 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-597569c5dd-vxwdd" podUID="de4e8079-9f44-44ce-937d-0364b3ff7a9e" containerName="neutron-httpd" containerID="cri-o://396ebe17fa795e7aaeb9ee1d3e1c6982be72aa9650342f05d98b29e567725f39" gracePeriod=30
Mar 09 13:22:12 crc kubenswrapper[4723]: I0309 13:22:12.416882 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-597569c5dd-vxwdd" podUID="de4e8079-9f44-44ce-937d-0364b3ff7a9e" containerName="neutron-api" containerID="cri-o://21d2fe4aec92c09368a83bf195eb1d1edb63abd53ee959d548c5ef4582b18375" gracePeriod=30
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.047641 4723 generic.go:334] "Generic (PLEG): container finished" podID="de4e8079-9f44-44ce-937d-0364b3ff7a9e" containerID="396ebe17fa795e7aaeb9ee1d3e1c6982be72aa9650342f05d98b29e567725f39" exitCode=0
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.047725 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-597569c5dd-vxwdd" event={"ID":"de4e8079-9f44-44ce-937d-0364b3ff7a9e","Type":"ContainerDied","Data":"396ebe17fa795e7aaeb9ee1d3e1c6982be72aa9650342f05d98b29e567725f39"}
Mar 09 13:22:13 crc kubenswrapper[4723]: W0309 13:22:13.098319 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36f5570d_569f_4871_9be2_bc1650c32fb8.slice/crio-38668a9e26c7a437151e357921f018dc5083e6bb62d0f98753243cb032ad931b WatchSource:0}: Error finding container 38668a9e26c7a437151e357921f018dc5083e6bb62d0f98753243cb032ad931b: Status 404 returned error can't find the container with id 38668a9e26c7a437151e357921f018dc5083e6bb62d0f98753243cb032ad931b
Mar 09 13:22:13 crc kubenswrapper[4723]: W0309 13:22:13.119077 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf82b3a10_19c5_4071_9ab5_5356f38bf35e.slice/crio-8ec079cc5a7c7ceb292504b2e4d1d871f3b15bc0d135d472ee4249788787f227 WatchSource:0}: Error finding container 8ec079cc5a7c7ceb292504b2e4d1d871f3b15bc0d135d472ee4249788787f227: Status 404 returned error can't find the container with id 8ec079cc5a7c7ceb292504b2e4d1d871f3b15bc0d135d472ee4249788787f227
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.265556 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-88895"
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.294929 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.310303 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbcx7\" (UniqueName: \"kubernetes.io/projected/c035e2e5-1d3a-4254-8950-b6893fc60ff3-kube-api-access-fbcx7\") pod \"c035e2e5-1d3a-4254-8950-b6893fc60ff3\" (UID: \"c035e2e5-1d3a-4254-8950-b6893fc60ff3\") "
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.310481 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c035e2e5-1d3a-4254-8950-b6893fc60ff3-operator-scripts\") pod \"c035e2e5-1d3a-4254-8950-b6893fc60ff3\" (UID: \"c035e2e5-1d3a-4254-8950-b6893fc60ff3\") "
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.314126 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c035e2e5-1d3a-4254-8950-b6893fc60ff3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c035e2e5-1d3a-4254-8950-b6893fc60ff3" (UID: "c035e2e5-1d3a-4254-8950-b6893fc60ff3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.316641 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-64d0-account-create-update-92s6n"
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.323386 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c035e2e5-1d3a-4254-8950-b6893fc60ff3-kube-api-access-fbcx7" (OuterVolumeSpecName: "kube-api-access-fbcx7") pod "c035e2e5-1d3a-4254-8950-b6893fc60ff3" (UID: "c035e2e5-1d3a-4254-8950-b6893fc60ff3"). InnerVolumeSpecName "kube-api-access-fbcx7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.417611 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/868faedb-4c6c-479b-b519-45c8a252dee8-run-httpd\") pod \"868faedb-4c6c-479b-b519-45c8a252dee8\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") "
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.418776 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpp2c\" (UniqueName: \"kubernetes.io/projected/868faedb-4c6c-479b-b519-45c8a252dee8-kube-api-access-cpp2c\") pod \"868faedb-4c6c-479b-b519-45c8a252dee8\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") "
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.419009 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2hll\" (UniqueName: \"kubernetes.io/projected/573e22bf-ce84-4ce1-bd2d-45f52b8cd30a-kube-api-access-t2hll\") pod \"573e22bf-ce84-4ce1-bd2d-45f52b8cd30a\" (UID: \"573e22bf-ce84-4ce1-bd2d-45f52b8cd30a\") "
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.419240 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-sg-core-conf-yaml\") pod \"868faedb-4c6c-479b-b519-45c8a252dee8\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") "
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.419344 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-scripts\") pod \"868faedb-4c6c-479b-b519-45c8a252dee8\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") "
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.419453 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573e22bf-ce84-4ce1-bd2d-45f52b8cd30a-operator-scripts\") pod \"573e22bf-ce84-4ce1-bd2d-45f52b8cd30a\" (UID: \"573e22bf-ce84-4ce1-bd2d-45f52b8cd30a\") "
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.419553 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-config-data\") pod \"868faedb-4c6c-479b-b519-45c8a252dee8\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") "
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.419654 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/868faedb-4c6c-479b-b519-45c8a252dee8-log-httpd\") pod \"868faedb-4c6c-479b-b519-45c8a252dee8\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") "
Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.419753 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-combined-ca-bundle\") pod \"868faedb-4c6c-479b-b519-45c8a252dee8\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") "
\"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-combined-ca-bundle\") pod \"868faedb-4c6c-479b-b519-45c8a252dee8\" (UID: \"868faedb-4c6c-479b-b519-45c8a252dee8\") " Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.418574 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/868faedb-4c6c-479b-b519-45c8a252dee8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "868faedb-4c6c-479b-b519-45c8a252dee8" (UID: "868faedb-4c6c-479b-b519-45c8a252dee8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.420603 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c035e2e5-1d3a-4254-8950-b6893fc60ff3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.420705 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbcx7\" (UniqueName: \"kubernetes.io/projected/c035e2e5-1d3a-4254-8950-b6893fc60ff3-kube-api-access-fbcx7\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.422049 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/573e22bf-ce84-4ce1-bd2d-45f52b8cd30a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "573e22bf-ce84-4ce1-bd2d-45f52b8cd30a" (UID: "573e22bf-ce84-4ce1-bd2d-45f52b8cd30a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.424549 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868faedb-4c6c-479b-b519-45c8a252dee8-kube-api-access-cpp2c" (OuterVolumeSpecName: "kube-api-access-cpp2c") pod "868faedb-4c6c-479b-b519-45c8a252dee8" (UID: "868faedb-4c6c-479b-b519-45c8a252dee8"). InnerVolumeSpecName "kube-api-access-cpp2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.425203 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/868faedb-4c6c-479b-b519-45c8a252dee8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "868faedb-4c6c-479b-b519-45c8a252dee8" (UID: "868faedb-4c6c-479b-b519-45c8a252dee8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.425267 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-scripts" (OuterVolumeSpecName: "scripts") pod "868faedb-4c6c-479b-b519-45c8a252dee8" (UID: "868faedb-4c6c-479b-b519-45c8a252dee8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.437814 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/573e22bf-ce84-4ce1-bd2d-45f52b8cd30a-kube-api-access-t2hll" (OuterVolumeSpecName: "kube-api-access-t2hll") pod "573e22bf-ce84-4ce1-bd2d-45f52b8cd30a" (UID: "573e22bf-ce84-4ce1-bd2d-45f52b8cd30a"). InnerVolumeSpecName "kube-api-access-t2hll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.522490 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.522522 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573e22bf-ce84-4ce1-bd2d-45f52b8cd30a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.522531 4723 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/868faedb-4c6c-479b-b519-45c8a252dee8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.522538 4723 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/868faedb-4c6c-479b-b519-45c8a252dee8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.522546 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpp2c\" (UniqueName: \"kubernetes.io/projected/868faedb-4c6c-479b-b519-45c8a252dee8-kube-api-access-cpp2c\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.522556 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2hll\" (UniqueName: \"kubernetes.io/projected/573e22bf-ce84-4ce1-bd2d-45f52b8cd30a-kube-api-access-t2hll\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.529369 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "868faedb-4c6c-479b-b519-45c8a252dee8" (UID: "868faedb-4c6c-479b-b519-45c8a252dee8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.599278 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "868faedb-4c6c-479b-b519-45c8a252dee8" (UID: "868faedb-4c6c-479b-b519-45c8a252dee8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.624892 4723 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.625086 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.734445 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-config-data" (OuterVolumeSpecName: "config-data") pod "868faedb-4c6c-479b-b519-45c8a252dee8" (UID: "868faedb-4c6c-479b-b519-45c8a252dee8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:13 crc kubenswrapper[4723]: I0309 13:22:13.828801 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868faedb-4c6c-479b-b519-45c8a252dee8-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.059332 4723 generic.go:334] "Generic (PLEG): container finished" podID="f82b3a10-19c5-4071-9ab5-5356f38bf35e" containerID="de82fa43626083c3dc28fdea40eaa87275fab33290dd422d7772b03e696b0e08" exitCode=0 Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.059396 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nlv65" event={"ID":"f82b3a10-19c5-4071-9ab5-5356f38bf35e","Type":"ContainerDied","Data":"de82fa43626083c3dc28fdea40eaa87275fab33290dd422d7772b03e696b0e08"} Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.059532 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nlv65" event={"ID":"f82b3a10-19c5-4071-9ab5-5356f38bf35e","Type":"ContainerStarted","Data":"8ec079cc5a7c7ceb292504b2e4d1d871f3b15bc0d135d472ee4249788787f227"} Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.061286 4723 generic.go:334] "Generic (PLEG): container finished" podID="36f5570d-569f-4871-9be2-bc1650c32fb8" containerID="ac9edbf387a821529550a9f1e53cb67936be28fcc2d665ab4009384ed4c0bb14" exitCode=0 Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.061402 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-948f-account-create-update-2t9dn" event={"ID":"36f5570d-569f-4871-9be2-bc1650c32fb8","Type":"ContainerDied","Data":"ac9edbf387a821529550a9f1e53cb67936be28fcc2d665ab4009384ed4c0bb14"} Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.061419 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-948f-account-create-update-2t9dn" event={"ID":"36f5570d-569f-4871-9be2-bc1650c32fb8","Type":"ContainerStarted","Data":"38668a9e26c7a437151e357921f018dc5083e6bb62d0f98753243cb032ad931b"} Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.064048 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"868faedb-4c6c-479b-b519-45c8a252dee8","Type":"ContainerDied","Data":"0c16e11258916c615bb17126e405eb645bd52a2ffb4eddd7c6ab3b1d7afd40f6"} Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.064072 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.064101 4723 scope.go:117] "RemoveContainer" containerID="e642cbfe23c56063d2e6dc0b223e408313d261acad06028681a662711dd1b182" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.065735 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2c7e526b-cb25-4469-a6bd-b19fa44ca499","Type":"ContainerStarted","Data":"99660348fd3e76366b8db6ff05fbb40de586135af960fa837d5d1c878b42232d"} Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.073493 4723 generic.go:334] "Generic (PLEG): container finished" podID="f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f" containerID="de912b3b49280290a9e08e997c841df747a2b308cc9462fab569c29911e39df9" exitCode=0 Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.073715 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4cgtr" event={"ID":"f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f","Type":"ContainerDied","Data":"de912b3b49280290a9e08e997c841df747a2b308cc9462fab569c29911e39df9"} Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.076119 4723 generic.go:334] "Generic (PLEG): container finished" podID="a08da534-32c5-4d52-ba3f-2bc7a8f491c4" containerID="5e62c40bdb2d3eed33c692b0cbc3d05d43ae4c06d524032082bf044dd17a6229" exitCode=0 Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.076179 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8b2e-account-create-update-jp7ld" event={"ID":"a08da534-32c5-4d52-ba3f-2bc7a8f491c4","Type":"ContainerDied","Data":"5e62c40bdb2d3eed33c692b0cbc3d05d43ae4c06d524032082bf044dd17a6229"} Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.078886 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-88895" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.078977 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-88895" event={"ID":"c035e2e5-1d3a-4254-8950-b6893fc60ff3","Type":"ContainerDied","Data":"0d5053ebb9369a70282ff7dfc1e6ee19d2981c0fa86c324de49373a048872945"} Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.079018 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d5053ebb9369a70282ff7dfc1e6ee19d2981c0fa86c324de49373a048872945" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.084149 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-64d0-account-create-update-92s6n" event={"ID":"573e22bf-ce84-4ce1-bd2d-45f52b8cd30a","Type":"ContainerDied","Data":"df9463658692947dbaa6646095a797d8c75c0689e76bf4cf416a225ab48f4052"} Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.084184 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df9463658692947dbaa6646095a797d8c75c0689e76bf4cf416a225ab48f4052" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.084189 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-64d0-account-create-update-92s6n" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.166165 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.8842852949999997 podStartE2EDuration="16.166119508s" podCreationTimestamp="2026-03-09 13:21:58 +0000 UTC" firstStartedPulling="2026-03-09 13:22:00.061586372 +0000 UTC m=+1394.076053912" lastFinishedPulling="2026-03-09 13:22:13.343420585 +0000 UTC m=+1407.357888125" observedRunningTime="2026-03-09 13:22:14.105778561 +0000 UTC m=+1408.120246101" watchObservedRunningTime="2026-03-09 13:22:14.166119508 +0000 UTC m=+1408.180587048" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.216171 4723 scope.go:117] "RemoveContainer" containerID="276aafd0a7cfae040a0d105fce63aa7cb3f85af7e501922b266342496107a3f1" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.246023 4723 scope.go:117] "RemoveContainer" containerID="dd5c68fc27791b5cb65c4cab5cff1b3860a58331b107eedd79abd83bc841ce93" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.249595 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.269894 4723 scope.go:117] "RemoveContainer" containerID="dd5b500095dd0c60df2c62d705c3589e5648c8728a83ccd454d9f0ec119c1858" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.272062 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.301600 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:14 crc kubenswrapper[4723]: E0309 13:22:14.302210 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="ceilometer-notification-agent" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.302240 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="ceilometer-notification-agent" Mar 09 13:22:14 crc kubenswrapper[4723]: E0309 13:22:14.302255 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c035e2e5-1d3a-4254-8950-b6893fc60ff3" containerName="mariadb-database-create" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.302262 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="c035e2e5-1d3a-4254-8950-b6893fc60ff3" containerName="mariadb-database-create" Mar 09 13:22:14 crc kubenswrapper[4723]: E0309 13:22:14.302292 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="ceilometer-central-agent" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.302300 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="ceilometer-central-agent" Mar 09 13:22:14 crc kubenswrapper[4723]: E0309 13:22:14.302308 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="proxy-httpd" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.302316 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="proxy-httpd" Mar 09 13:22:14 crc kubenswrapper[4723]: E0309 13:22:14.302324 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="573e22bf-ce84-4ce1-bd2d-45f52b8cd30a" containerName="mariadb-account-create-update" Mar 09 
13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.302332 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="573e22bf-ce84-4ce1-bd2d-45f52b8cd30a" containerName="mariadb-account-create-update" Mar 09 13:22:14 crc kubenswrapper[4723]: E0309 13:22:14.302340 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="sg-core" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.302347 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="sg-core" Mar 09 13:22:14 crc kubenswrapper[4723]: E0309 13:22:14.302379 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54bb3205-95cb-4772-977a-3f33fcfe1ab3" containerName="oc" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.302387 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="54bb3205-95cb-4772-977a-3f33fcfe1ab3" containerName="oc" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.302652 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="c035e2e5-1d3a-4254-8950-b6893fc60ff3" containerName="mariadb-database-create" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.302683 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="proxy-httpd" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.302702 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="573e22bf-ce84-4ce1-bd2d-45f52b8cd30a" containerName="mariadb-account-create-update" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.302716 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="ceilometer-notification-agent" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.302734 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="sg-core" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.302753 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" containerName="ceilometer-central-agent" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.302769 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="54bb3205-95cb-4772-977a-3f33fcfe1ab3" containerName="oc" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.305413 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.307812 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.308065 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.314148 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.343093 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.343156 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9bn8\" (UniqueName: \"kubernetes.io/projected/f180e14d-e014-49e0-8177-619c97476f71-kube-api-access-b9bn8\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.343245 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f180e14d-e014-49e0-8177-619c97476f71-log-httpd\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.343313 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f180e14d-e014-49e0-8177-619c97476f71-run-httpd\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.343332 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-config-data\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.343371 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-scripts\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.343387 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.445471 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-scripts\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.445520 4723 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.446321 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.446393 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9bn8\" (UniqueName: \"kubernetes.io/projected/f180e14d-e014-49e0-8177-619c97476f71-kube-api-access-b9bn8\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.447033 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f180e14d-e014-49e0-8177-619c97476f71-log-httpd\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.447256 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f180e14d-e014-49e0-8177-619c97476f71-run-httpd\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.447289 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-config-data\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.447501 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f180e14d-e014-49e0-8177-619c97476f71-log-httpd\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.447740 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f180e14d-e014-49e0-8177-619c97476f71-run-httpd\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.449736 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.459768 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.459976 4723 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-scripts\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.460249 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-config-data\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.468054 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9bn8\" (UniqueName: \"kubernetes.io/projected/f180e14d-e014-49e0-8177-619c97476f71-kube-api-access-b9bn8\") pod \"ceilometer-0\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.640124 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:22:14 crc kubenswrapper[4723]: I0309 13:22:14.899466 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="868faedb-4c6c-479b-b519-45c8a252dee8" path="/var/lib/kubelet/pods/868faedb-4c6c-479b-b519-45c8a252dee8/volumes" Mar 09 13:22:15 crc kubenswrapper[4723]: I0309 13:22:15.077057 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:15 crc kubenswrapper[4723]: I0309 13:22:15.099725 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-575fd88557-l2fxr" Mar 09 13:22:15 crc kubenswrapper[4723]: I0309 13:22:15.175302 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:15 crc kubenswrapper[4723]: I0309 13:22:15.721924 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:15 crc kubenswrapper[4723]: I0309 13:22:15.749309 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-948f-account-create-update-2t9dn" Mar 09 13:22:15 crc kubenswrapper[4723]: I0309 13:22:15.887552 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc2b2\" (UniqueName: \"kubernetes.io/projected/36f5570d-569f-4871-9be2-bc1650c32fb8-kube-api-access-tc2b2\") pod \"36f5570d-569f-4871-9be2-bc1650c32fb8\" (UID: \"36f5570d-569f-4871-9be2-bc1650c32fb8\") " Mar 09 13:22:15 crc kubenswrapper[4723]: I0309 13:22:15.887958 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36f5570d-569f-4871-9be2-bc1650c32fb8-operator-scripts\") pod \"36f5570d-569f-4871-9be2-bc1650c32fb8\" (UID: \"36f5570d-569f-4871-9be2-bc1650c32fb8\") " Mar 09 13:22:15 crc kubenswrapper[4723]: I0309 13:22:15.889943 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36f5570d-569f-4871-9be2-bc1650c32fb8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36f5570d-569f-4871-9be2-bc1650c32fb8" (UID: "36f5570d-569f-4871-9be2-bc1650c32fb8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:15 crc kubenswrapper[4723]: I0309 13:22:15.893035 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36f5570d-569f-4871-9be2-bc1650c32fb8-kube-api-access-tc2b2" (OuterVolumeSpecName: "kube-api-access-tc2b2") pod "36f5570d-569f-4871-9be2-bc1650c32fb8" (UID: "36f5570d-569f-4871-9be2-bc1650c32fb8"). InnerVolumeSpecName "kube-api-access-tc2b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:15 crc kubenswrapper[4723]: I0309 13:22:15.955640 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4cgtr" Mar 09 13:22:15 crc kubenswrapper[4723]: I0309 13:22:15.964096 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nlv65" Mar 09 13:22:15 crc kubenswrapper[4723]: I0309 13:22:15.972429 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8b2e-account-create-update-jp7ld" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.040704 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36f5570d-569f-4871-9be2-bc1650c32fb8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.040744 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc2b2\" (UniqueName: \"kubernetes.io/projected/36f5570d-569f-4871-9be2-bc1650c32fb8-kube-api-access-tc2b2\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.119508 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4cgtr" event={"ID":"f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f","Type":"ContainerDied","Data":"50fcce62ceaa3749d0287f66112ae4853c8624aadc970fce8bdbce0ce8b97007"} Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.120008 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50fcce62ceaa3749d0287f66112ae4853c8624aadc970fce8bdbce0ce8b97007" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.119731 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-4cgtr" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.123578 4723 generic.go:334] "Generic (PLEG): container finished" podID="de4e8079-9f44-44ce-937d-0364b3ff7a9e" containerID="21d2fe4aec92c09368a83bf195eb1d1edb63abd53ee959d548c5ef4582b18375" exitCode=0 Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.123649 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-597569c5dd-vxwdd" event={"ID":"de4e8079-9f44-44ce-937d-0364b3ff7a9e","Type":"ContainerDied","Data":"21d2fe4aec92c09368a83bf195eb1d1edb63abd53ee959d548c5ef4582b18375"} Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.126111 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f180e14d-e014-49e0-8177-619c97476f71","Type":"ContainerStarted","Data":"0272fd11ac7fc956be7b2bcb860e5bf6408a5fd85e06c7da6adddcd36f4e70c4"} Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.126157 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f180e14d-e014-49e0-8177-619c97476f71","Type":"ContainerStarted","Data":"1b2ec9758db1896f3384b5dead357cce3c85b04be6a57559fe0b79bee18d74e1"} Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.130635 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8b2e-account-create-update-jp7ld" event={"ID":"a08da534-32c5-4d52-ba3f-2bc7a8f491c4","Type":"ContainerDied","Data":"bf1a79e51e16c0a1295352b174e4e70a53c788c00e617aa448c75faec3687708"} Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.130679 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf1a79e51e16c0a1295352b174e4e70a53c788c00e617aa448c75faec3687708" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.130790 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8b2e-account-create-update-jp7ld" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.135174 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nlv65" event={"ID":"f82b3a10-19c5-4071-9ab5-5356f38bf35e","Type":"ContainerDied","Data":"8ec079cc5a7c7ceb292504b2e4d1d871f3b15bc0d135d472ee4249788787f227"} Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.135393 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ec079cc5a7c7ceb292504b2e4d1d871f3b15bc0d135d472ee4249788787f227" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.135227 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nlv65" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.140566 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-948f-account-create-update-2t9dn" event={"ID":"36f5570d-569f-4871-9be2-bc1650c32fb8","Type":"ContainerDied","Data":"38668a9e26c7a437151e357921f018dc5083e6bb62d0f98753243cb032ad931b"} Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.140611 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38668a9e26c7a437151e357921f018dc5083e6bb62d0f98753243cb032ad931b" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.140656 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-948f-account-create-update-2t9dn" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.143347 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8vjh\" (UniqueName: \"kubernetes.io/projected/f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f-kube-api-access-k8vjh\") pod \"f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f\" (UID: \"f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f\") " Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.143538 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j48n2\" (UniqueName: \"kubernetes.io/projected/a08da534-32c5-4d52-ba3f-2bc7a8f491c4-kube-api-access-j48n2\") pod \"a08da534-32c5-4d52-ba3f-2bc7a8f491c4\" (UID: \"a08da534-32c5-4d52-ba3f-2bc7a8f491c4\") " Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.143598 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f82b3a10-19c5-4071-9ab5-5356f38bf35e-operator-scripts\") pod \"f82b3a10-19c5-4071-9ab5-5356f38bf35e\" (UID: \"f82b3a10-19c5-4071-9ab5-5356f38bf35e\") " Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.143666 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08da534-32c5-4d52-ba3f-2bc7a8f491c4-operator-scripts\") pod \"a08da534-32c5-4d52-ba3f-2bc7a8f491c4\" (UID: \"a08da534-32c5-4d52-ba3f-2bc7a8f491c4\") " Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.143770 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f-operator-scripts\") pod \"f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f\" (UID: \"f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f\") " Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.143791 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l69c\" (UniqueName: \"kubernetes.io/projected/f82b3a10-19c5-4071-9ab5-5356f38bf35e-kube-api-access-4l69c\") pod \"f82b3a10-19c5-4071-9ab5-5356f38bf35e\" (UID: \"f82b3a10-19c5-4071-9ab5-5356f38bf35e\") " Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.144573 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f82b3a10-19c5-4071-9ab5-5356f38bf35e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f82b3a10-19c5-4071-9ab5-5356f38bf35e" (UID: "f82b3a10-19c5-4071-9ab5-5356f38bf35e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.145175 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a08da534-32c5-4d52-ba3f-2bc7a8f491c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a08da534-32c5-4d52-ba3f-2bc7a8f491c4" (UID: "a08da534-32c5-4d52-ba3f-2bc7a8f491c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.145802 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f" (UID: "f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.151263 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f82b3a10-19c5-4071-9ab5-5356f38bf35e-kube-api-access-4l69c" (OuterVolumeSpecName: "kube-api-access-4l69c") pod "f82b3a10-19c5-4071-9ab5-5356f38bf35e" (UID: "f82b3a10-19c5-4071-9ab5-5356f38bf35e"). InnerVolumeSpecName "kube-api-access-4l69c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.152123 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f-kube-api-access-k8vjh" (OuterVolumeSpecName: "kube-api-access-k8vjh") pod "f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f" (UID: "f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f"). InnerVolumeSpecName "kube-api-access-k8vjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.157027 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08da534-32c5-4d52-ba3f-2bc7a8f491c4-kube-api-access-j48n2" (OuterVolumeSpecName: "kube-api-access-j48n2") pod "a08da534-32c5-4d52-ba3f-2bc7a8f491c4" (UID: "a08da534-32c5-4d52-ba3f-2bc7a8f491c4"). InnerVolumeSpecName "kube-api-access-j48n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.247014 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08da534-32c5-4d52-ba3f-2bc7a8f491c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.247038 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.247049 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l69c\" (UniqueName: \"kubernetes.io/projected/f82b3a10-19c5-4071-9ab5-5356f38bf35e-kube-api-access-4l69c\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.247058 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8vjh\" (UniqueName: \"kubernetes.io/projected/f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f-kube-api-access-k8vjh\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.247066 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j48n2\" (UniqueName: \"kubernetes.io/projected/a08da534-32c5-4d52-ba3f-2bc7a8f491c4-kube-api-access-j48n2\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.247074 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f82b3a10-19c5-4071-9ab5-5356f38bf35e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.569164 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.658603 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-config\") pod \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.658894 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-combined-ca-bundle\") pod \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.661538 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-httpd-config\") pod \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.661718 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-ovndb-tls-certs\") pod \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.661835 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25ssx\" (UniqueName: \"kubernetes.io/projected/de4e8079-9f44-44ce-937d-0364b3ff7a9e-kube-api-access-25ssx\") pod \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\" (UID: \"de4e8079-9f44-44ce-937d-0364b3ff7a9e\") " Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.670658 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de4e8079-9f44-44ce-937d-0364b3ff7a9e-kube-api-access-25ssx" (OuterVolumeSpecName: "kube-api-access-25ssx") pod "de4e8079-9f44-44ce-937d-0364b3ff7a9e" (UID: "de4e8079-9f44-44ce-937d-0364b3ff7a9e"). InnerVolumeSpecName "kube-api-access-25ssx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.671237 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "de4e8079-9f44-44ce-937d-0364b3ff7a9e" (UID: "de4e8079-9f44-44ce-937d-0364b3ff7a9e"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.765238 4723 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.765273 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25ssx\" (UniqueName: \"kubernetes.io/projected/de4e8079-9f44-44ce-937d-0364b3ff7a9e-kube-api-access-25ssx\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.775644 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de4e8079-9f44-44ce-937d-0364b3ff7a9e" (UID: "de4e8079-9f44-44ce-937d-0364b3ff7a9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.775911 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-config" (OuterVolumeSpecName: "config") pod "de4e8079-9f44-44ce-937d-0364b3ff7a9e" (UID: "de4e8079-9f44-44ce-937d-0364b3ff7a9e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.794491 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "de4e8079-9f44-44ce-937d-0364b3ff7a9e" (UID: "de4e8079-9f44-44ce-937d-0364b3ff7a9e"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.867075 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.867108 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:16 crc kubenswrapper[4723]: I0309 13:22:16.867120 4723 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4e8079-9f44-44ce-937d-0364b3ff7a9e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:17 crc kubenswrapper[4723]: I0309 13:22:17.152887 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-597569c5dd-vxwdd" event={"ID":"de4e8079-9f44-44ce-937d-0364b3ff7a9e","Type":"ContainerDied","Data":"3e48af8a56fad11cc047b1d1d435d508320f0297008ac8e5d80628bf0ed3f546"} Mar 09 13:22:17 crc kubenswrapper[4723]: I0309 13:22:17.153222 4723 scope.go:117] "RemoveContainer" containerID="396ebe17fa795e7aaeb9ee1d3e1c6982be72aa9650342f05d98b29e567725f39" Mar 09 13:22:17 crc kubenswrapper[4723]: I0309 13:22:17.152908 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-597569c5dd-vxwdd" Mar 09 13:22:17 crc kubenswrapper[4723]: I0309 13:22:17.156239 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f180e14d-e014-49e0-8177-619c97476f71","Type":"ContainerStarted","Data":"7b23dca3800cbe823fc146c2ca4068e4badd3ebd69cc7d2a607209bd762fe9c3"} Mar 09 13:22:17 crc kubenswrapper[4723]: I0309 13:22:17.175612 4723 scope.go:117] "RemoveContainer" containerID="21d2fe4aec92c09368a83bf195eb1d1edb63abd53ee959d548c5ef4582b18375" Mar 09 13:22:17 crc kubenswrapper[4723]: I0309 13:22:17.190311 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-597569c5dd-vxwdd"] Mar 09 13:22:17 crc kubenswrapper[4723]: I0309 13:22:17.203720 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-597569c5dd-vxwdd"] Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.187605 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f180e14d-e014-49e0-8177-619c97476f71","Type":"ContainerStarted","Data":"673b7722468a4748c2c77474ebc301afdf3c4a61f1b63c8bdf3239852eb96420"} Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.448538 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-756d785958-qpphh"] Mar 09 13:22:18 crc kubenswrapper[4723]: E0309 13:22:18.449079 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f82b3a10-19c5-4071-9ab5-5356f38bf35e" containerName="mariadb-database-create" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.449094 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82b3a10-19c5-4071-9ab5-5356f38bf35e" containerName="mariadb-database-create" Mar 09 13:22:18 crc kubenswrapper[4723]: E0309 13:22:18.449120 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de4e8079-9f44-44ce-937d-0364b3ff7a9e" containerName="neutron-httpd" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.449128 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4e8079-9f44-44ce-937d-0364b3ff7a9e" containerName="neutron-httpd" Mar 09 13:22:18 crc kubenswrapper[4723]: E0309 13:22:18.449156 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de4e8079-9f44-44ce-937d-0364b3ff7a9e" containerName="neutron-api" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.449163 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4e8079-9f44-44ce-937d-0364b3ff7a9e" containerName="neutron-api" Mar 09 13:22:18 crc kubenswrapper[4723]: E0309 13:22:18.449179 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36f5570d-569f-4871-9be2-bc1650c32fb8" containerName="mariadb-account-create-update" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.449186 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="36f5570d-569f-4871-9be2-bc1650c32fb8" containerName="mariadb-account-create-update" Mar 09 13:22:18 crc kubenswrapper[4723]: E0309 13:22:18.449198 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08da534-32c5-4d52-ba3f-2bc7a8f491c4" containerName="mariadb-account-create-update" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.449204 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08da534-32c5-4d52-ba3f-2bc7a8f491c4" containerName="mariadb-account-create-update" Mar 09 13:22:18 crc kubenswrapper[4723]: E0309 13:22:18.449212 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f" 
containerName="mariadb-database-create" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.449218 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f" containerName="mariadb-database-create" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.449408 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="de4e8079-9f44-44ce-937d-0364b3ff7a9e" containerName="neutron-api" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.449426 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="de4e8079-9f44-44ce-937d-0364b3ff7a9e" containerName="neutron-httpd" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.449439 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08da534-32c5-4d52-ba3f-2bc7a8f491c4" containerName="mariadb-account-create-update" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.449455 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f" containerName="mariadb-database-create" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.449466 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f82b3a10-19c5-4071-9ab5-5356f38bf35e" containerName="mariadb-database-create" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.449477 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="36f5570d-569f-4871-9be2-bc1650c32fb8" containerName="mariadb-account-create-update" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.450245 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.453756 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-f69fq" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.454020 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.454229 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.477152 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-756d785958-qpphh"] Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.501848 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89zmp\" (UniqueName: \"kubernetes.io/projected/d365869f-e896-43d2-80ab-520b5e71beae-kube-api-access-89zmp\") pod \"heat-engine-756d785958-qpphh\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.501963 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-config-data\") pod \"heat-engine-756d785958-qpphh\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.502043 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-config-data-custom\") pod \"heat-engine-756d785958-qpphh\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " 
pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.502229 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-combined-ca-bundle\") pod \"heat-engine-756d785958-qpphh\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.549501 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-vrvk6"] Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.551253 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.606347 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.606442 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kk72\" (UniqueName: \"kubernetes.io/projected/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-kube-api-access-7kk72\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.606509 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-combined-ca-bundle\") pod \"heat-engine-756d785958-qpphh\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.606568 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.606619 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-dns-swift-storage-0\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.606710 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.606749 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89zmp\" (UniqueName: \"kubernetes.io/projected/d365869f-e896-43d2-80ab-520b5e71beae-kube-api-access-89zmp\") pod \"heat-engine-756d785958-qpphh\" (UID: 
\"d365869f-e896-43d2-80ab-520b5e71beae\") " pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.606802 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-config-data\") pod \"heat-engine-756d785958-qpphh\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.606893 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-config-data-custom\") pod \"heat-engine-756d785958-qpphh\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.607046 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-config\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.620501 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-config-data-custom\") pod \"heat-engine-756d785958-qpphh\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.620910 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-config-data\") pod \"heat-engine-756d785958-qpphh\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.631725 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-combined-ca-bundle\") pod \"heat-engine-756d785958-qpphh\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.631803 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-vrvk6"] Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.683024 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89zmp\" (UniqueName: \"kubernetes.io/projected/d365869f-e896-43d2-80ab-520b5e71beae-kube-api-access-89zmp\") pod \"heat-engine-756d785958-qpphh\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.688911 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-59fcdfbdd7-xjdk4"] Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.691150 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.704704 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.711043 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-config\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.711134 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.711203 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kk72\" (UniqueName: \"kubernetes.io/projected/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-kube-api-access-7kk72\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.711306 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.711380 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-dns-swift-storage-0\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.711481 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.712385 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-config\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.713249 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.714346 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-dns-swift-storage-0\") pod 
\"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.715504 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.716530 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.744032 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kk72\" (UniqueName: \"kubernetes.io/projected/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-kube-api-access-7kk72\") pod \"dnsmasq-dns-f6bc4c6c9-vrvk6\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.797532 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.817880 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2psw5\" (UniqueName: \"kubernetes.io/projected/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-kube-api-access-2psw5\") pod \"heat-api-59fcdfbdd7-xjdk4\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.818043 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-combined-ca-bundle\") pod \"heat-api-59fcdfbdd7-xjdk4\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.818145 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-config-data\") pod \"heat-api-59fcdfbdd7-xjdk4\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.818290 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-config-data-custom\") pod \"heat-api-59fcdfbdd7-xjdk4\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.892827 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-59fcdfbdd7-xjdk4"] Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.920376 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-config-data\") pod \"heat-api-59fcdfbdd7-xjdk4\" (UID: 
\"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.920544 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-config-data-custom\") pod \"heat-api-59fcdfbdd7-xjdk4\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.920648 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2psw5\" (UniqueName: \"kubernetes.io/projected/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-kube-api-access-2psw5\") pod \"heat-api-59fcdfbdd7-xjdk4\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.920760 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-combined-ca-bundle\") pod \"heat-api-59fcdfbdd7-xjdk4\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.924817 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de4e8079-9f44-44ce-937d-0364b3ff7a9e" path="/var/lib/kubelet/pods/de4e8079-9f44-44ce-937d-0364b3ff7a9e/volumes" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.927887 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-config-data\") pod \"heat-api-59fcdfbdd7-xjdk4\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.937129 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-config-data-custom\") pod \"heat-api-59fcdfbdd7-xjdk4\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.949925 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-858f65d478-kpb9g"] Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.951473 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.956783 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-combined-ca-bundle\") pod \"heat-api-59fcdfbdd7-xjdk4\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.957085 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.966601 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2psw5\" (UniqueName: \"kubernetes.io/projected/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-kube-api-access-2psw5\") pod \"heat-api-59fcdfbdd7-xjdk4\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.967584 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.968147 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:18 crc kubenswrapper[4723]: I0309 13:22:18.994678 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-858f65d478-kpb9g"] Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.125816 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd2px\" (UniqueName: \"kubernetes.io/projected/679e1a62-9b64-4f60-a7b5-b218eed30fd7-kube-api-access-bd2px\") pod \"heat-cfnapi-858f65d478-kpb9g\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.126174 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-config-data\") pod \"heat-cfnapi-858f65d478-kpb9g\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.126212 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-config-data-custom\") pod \"heat-cfnapi-858f65d478-kpb9g\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.126243 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-combined-ca-bundle\") pod \"heat-cfnapi-858f65d478-kpb9g\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.234394 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd2px\" (UniqueName: \"kubernetes.io/projected/679e1a62-9b64-4f60-a7b5-b218eed30fd7-kube-api-access-bd2px\") pod \"heat-cfnapi-858f65d478-kpb9g\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " 
pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.234479 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-config-data\") pod \"heat-cfnapi-858f65d478-kpb9g\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.234512 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-config-data-custom\") pod \"heat-cfnapi-858f65d478-kpb9g\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.234532 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-combined-ca-bundle\") pod \"heat-cfnapi-858f65d478-kpb9g\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.245401 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-combined-ca-bundle\") pod \"heat-cfnapi-858f65d478-kpb9g\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.245771 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-config-data\") pod \"heat-cfnapi-858f65d478-kpb9g\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.246155 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-config-data-custom\") pod \"heat-cfnapi-858f65d478-kpb9g\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.253515 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd2px\" (UniqueName: \"kubernetes.io/projected/679e1a62-9b64-4f60-a7b5-b218eed30fd7-kube-api-access-bd2px\") pod \"heat-cfnapi-858f65d478-kpb9g\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.341036 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.659038 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-59fcdfbdd7-xjdk4"] Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.667058 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-756d785958-qpphh"] Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.875836 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-vrvk6"] Mar 09 13:22:19 crc kubenswrapper[4723]: W0309 13:22:19.880626 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf49fba07_b6b5_4e69_9c8a_2c3e1b09182f.slice/crio-be3cd259005cb1ce5d04e521e59fbe62fd86cd56879a5a2a12c42e94ee1b09b3 WatchSource:0}: Error finding container be3cd259005cb1ce5d04e521e59fbe62fd86cd56879a5a2a12c42e94ee1b09b3: Status 404 returned error can't find the container with id be3cd259005cb1ce5d04e521e59fbe62fd86cd56879a5a2a12c42e94ee1b09b3 Mar 09 13:22:19 crc kubenswrapper[4723]: I0309 13:22:19.999313 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-858f65d478-kpb9g"] Mar 09 13:22:20 crc kubenswrapper[4723]: W0309 13:22:20.012151 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod679e1a62_9b64_4f60_a7b5_b218eed30fd7.slice/crio-cc71cd76c6b1574ceb478fb4da66268c523aa779e217be3aa239b656aa9efe86 WatchSource:0}: Error finding container cc71cd76c6b1574ceb478fb4da66268c523aa779e217be3aa239b656aa9efe86: Status 404 returned error can't find the container with id cc71cd76c6b1574ceb478fb4da66268c523aa779e217be3aa239b656aa9efe86 Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.234705 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f180e14d-e014-49e0-8177-619c97476f71","Type":"ContainerStarted","Data":"e6145fa800d76acbd5fdbb41cc97956a662917864102c6930bc905db1cca3f70"} Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.235298 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="ceilometer-central-agent" containerID="cri-o://0272fd11ac7fc956be7b2bcb860e5bf6408a5fd85e06c7da6adddcd36f4e70c4" gracePeriod=30 Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.235571 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.235939 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="proxy-httpd" containerID="cri-o://e6145fa800d76acbd5fdbb41cc97956a662917864102c6930bc905db1cca3f70" gracePeriod=30 Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.235961 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="sg-core" containerID="cri-o://673b7722468a4748c2c77474ebc301afdf3c4a61f1b63c8bdf3239852eb96420" gracePeriod=30 Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.236060 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f180e14d-e014-49e0-8177-619c97476f71" 
containerName="ceilometer-notification-agent" containerID="cri-o://7b23dca3800cbe823fc146c2ca4068e4badd3ebd69cc7d2a607209bd762fe9c3" gracePeriod=30 Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.254767 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" event={"ID":"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f","Type":"ContainerStarted","Data":"09e6092ee38ade1a7a0616246728a205c0da58281be3c8201d48b12596d7a897"} Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.254817 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" event={"ID":"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f","Type":"ContainerStarted","Data":"be3cd259005cb1ce5d04e521e59fbe62fd86cd56879a5a2a12c42e94ee1b09b3"} Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.258068 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59fcdfbdd7-xjdk4" event={"ID":"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff","Type":"ContainerStarted","Data":"69de9708249aa7f9a8ab14ef505ce762c78a9d4f8dc77326498c26302c868709"} Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.260041 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-858f65d478-kpb9g" event={"ID":"679e1a62-9b64-4f60-a7b5-b218eed30fd7","Type":"ContainerStarted","Data":"cc71cd76c6b1574ceb478fb4da66268c523aa779e217be3aa239b656aa9efe86"} Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.269920 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-756d785958-qpphh" event={"ID":"d365869f-e896-43d2-80ab-520b5e71beae","Type":"ContainerStarted","Data":"8e99ef46ce36916e98dcddaff58d8238fccadac0643c954b4d8d8ad6c6eefab6"} Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.270148 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-756d785958-qpphh" event={"ID":"d365869f-e896-43d2-80ab-520b5e71beae","Type":"ContainerStarted","Data":"6d7ae58fdb96b3af8f09513dbd473244095b6f1664631e890ca50783a2b90f10"} Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.271294 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.280200 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.947550385 podStartE2EDuration="6.280180011s" podCreationTimestamp="2026-03-09 13:22:14 +0000 UTC" firstStartedPulling="2026-03-09 13:22:15.241791606 +0000 UTC m=+1409.256259146" lastFinishedPulling="2026-03-09 13:22:19.574421232 +0000 UTC m=+1413.588888772" observedRunningTime="2026-03-09 13:22:20.271653145 +0000 UTC m=+1414.286120685" watchObservedRunningTime="2026-03-09 13:22:20.280180011 +0000 UTC m=+1414.294647551" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.388668 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-756d785958-qpphh" podStartSLOduration=2.3886468020000002 podStartE2EDuration="2.388646802s" podCreationTimestamp="2026-03-09 13:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:20.33151808 +0000 UTC m=+1414.345985620" watchObservedRunningTime="2026-03-09 13:22:20.388646802 +0000 UTC m=+1414.403114342" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.678395 4723 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-pppx5"] Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.680284 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.688798 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.688945 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.689119 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jwtm7" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.726936 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pppx5"] Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.821743 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pppx5\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.821914 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-scripts\") pod \"nova-cell0-conductor-db-sync-pppx5\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.822011 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfngs\" (UniqueName: \"kubernetes.io/projected/7e8ed559-11fc-4511-9258-1681da84b5cd-kube-api-access-jfngs\") pod \"nova-cell0-conductor-db-sync-pppx5\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.822067 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-config-data\") pod \"nova-cell0-conductor-db-sync-pppx5\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.924145 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-scripts\") pod \"nova-cell0-conductor-db-sync-pppx5\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.924269 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfngs\" (UniqueName: \"kubernetes.io/projected/7e8ed559-11fc-4511-9258-1681da84b5cd-kube-api-access-jfngs\") pod \"nova-cell0-conductor-db-sync-pppx5\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.924322 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-config-data\") pod \"nova-cell0-conductor-db-sync-pppx5\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.924357 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pppx5\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.930590 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pppx5\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.930960 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-scripts\") pod \"nova-cell0-conductor-db-sync-pppx5\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.936411 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-config-data\") pod \"nova-cell0-conductor-db-sync-pppx5\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:20 crc kubenswrapper[4723]: I0309 13:22:20.949488 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfngs\" (UniqueName: \"kubernetes.io/projected/7e8ed559-11fc-4511-9258-1681da84b5cd-kube-api-access-jfngs\") pod \"nova-cell0-conductor-db-sync-pppx5\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:21 crc kubenswrapper[4723]: I0309 13:22:21.062248 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:21 crc kubenswrapper[4723]: I0309 13:22:21.296423 4723 generic.go:334] "Generic (PLEG): container finished" podID="f180e14d-e014-49e0-8177-619c97476f71" containerID="e6145fa800d76acbd5fdbb41cc97956a662917864102c6930bc905db1cca3f70" exitCode=0 Mar 09 13:22:21 crc kubenswrapper[4723]: I0309 13:22:21.296459 4723 generic.go:334] "Generic (PLEG): container finished" podID="f180e14d-e014-49e0-8177-619c97476f71" containerID="673b7722468a4748c2c77474ebc301afdf3c4a61f1b63c8bdf3239852eb96420" exitCode=2 Mar 09 13:22:21 crc kubenswrapper[4723]: I0309 13:22:21.296467 4723 generic.go:334] "Generic (PLEG): container finished" podID="f180e14d-e014-49e0-8177-619c97476f71" containerID="7b23dca3800cbe823fc146c2ca4068e4badd3ebd69cc7d2a607209bd762fe9c3" exitCode=0 Mar 09 13:22:21 crc kubenswrapper[4723]: I0309 13:22:21.296505 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f180e14d-e014-49e0-8177-619c97476f71","Type":"ContainerDied","Data":"e6145fa800d76acbd5fdbb41cc97956a662917864102c6930bc905db1cca3f70"} Mar 09 13:22:21 crc kubenswrapper[4723]: I0309 13:22:21.296548 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f180e14d-e014-49e0-8177-619c97476f71","Type":"ContainerDied","Data":"673b7722468a4748c2c77474ebc301afdf3c4a61f1b63c8bdf3239852eb96420"} Mar 09 13:22:21 crc kubenswrapper[4723]: I0309 13:22:21.296558 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f180e14d-e014-49e0-8177-619c97476f71","Type":"ContainerDied","Data":"7b23dca3800cbe823fc146c2ca4068e4badd3ebd69cc7d2a607209bd762fe9c3"} Mar 09 13:22:21 crc kubenswrapper[4723]: I0309 13:22:21.298321 4723 generic.go:334] "Generic (PLEG): container finished" podID="f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" containerID="09e6092ee38ade1a7a0616246728a205c0da58281be3c8201d48b12596d7a897" exitCode=0 Mar 09 13:22:21 crc kubenswrapper[4723]: I0309 13:22:21.298379 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" event={"ID":"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f","Type":"ContainerDied","Data":"09e6092ee38ade1a7a0616246728a205c0da58281be3c8201d48b12596d7a897"} Mar 09 13:22:21 crc kubenswrapper[4723]: I0309 13:22:21.298423 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" event={"ID":"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f","Type":"ContainerStarted","Data":"708242fd59bd257212f9758436d825acf296795bda0ecb23d35b5f78ea6f1191"} Mar 09 13:22:21 crc kubenswrapper[4723]: I0309 13:22:21.323758 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" podStartSLOduration=3.32373659 podStartE2EDuration="3.32373659s" podCreationTimestamp="2026-03-09 13:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:21.316852018 +0000 UTC m=+1415.331319568" watchObservedRunningTime="2026-03-09 13:22:21.32373659 +0000 UTC m=+1415.338204130" Mar 09 13:22:22 crc kubenswrapper[4723]: I0309 13:22:22.312414 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:22 crc kubenswrapper[4723]: I0309 13:22:22.597175 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pppx5"] Mar 09 13:22:23 crc 
kubenswrapper[4723]: I0309 13:22:23.324625 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pppx5" event={"ID":"7e8ed559-11fc-4511-9258-1681da84b5cd","Type":"ContainerStarted","Data":"b7faf82aadc8e8983e17c07442f63ed164016bbbab33b67276e6987e42d185df"} Mar 09 13:22:23 crc kubenswrapper[4723]: I0309 13:22:23.326883 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59fcdfbdd7-xjdk4" event={"ID":"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff","Type":"ContainerStarted","Data":"4447d520e7430d1b1fa51a0b30066b24df42ad8a3421fec79d57677c43332d8e"} Mar 09 13:22:23 crc kubenswrapper[4723]: I0309 13:22:23.326965 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:23 crc kubenswrapper[4723]: I0309 13:22:23.329025 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-858f65d478-kpb9g" event={"ID":"679e1a62-9b64-4f60-a7b5-b218eed30fd7","Type":"ContainerStarted","Data":"9d3a0391a5f5505c131e4f1338948ab5d2b8697120ffd0291ccc063ee931352b"} Mar 09 13:22:23 crc kubenswrapper[4723]: I0309 13:22:23.356757 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-59fcdfbdd7-xjdk4" podStartSLOduration=2.972856542 podStartE2EDuration="5.356737434s" podCreationTimestamp="2026-03-09 13:22:18 +0000 UTC" firstStartedPulling="2026-03-09 13:22:19.686476988 +0000 UTC m=+1413.700944538" lastFinishedPulling="2026-03-09 13:22:22.07035788 +0000 UTC m=+1416.084825430" observedRunningTime="2026-03-09 13:22:23.348904167 +0000 UTC m=+1417.363371727" watchObservedRunningTime="2026-03-09 13:22:23.356737434 +0000 UTC m=+1417.371204974" Mar 09 13:22:23 crc kubenswrapper[4723]: I0309 13:22:23.375802 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-858f65d478-kpb9g" podStartSLOduration=3.323626666 podStartE2EDuration="5.375781898s" podCreationTimestamp="2026-03-09 13:22:18 +0000 UTC" firstStartedPulling="2026-03-09 13:22:20.018129996 +0000 UTC m=+1414.032597536" lastFinishedPulling="2026-03-09 13:22:22.070285228 +0000 UTC m=+1416.084752768" observedRunningTime="2026-03-09 13:22:23.367329674 +0000 UTC m=+1417.381797224" watchObservedRunningTime="2026-03-09 13:22:23.375781898 +0000 UTC m=+1417.390249428" Mar 09 13:22:24 crc kubenswrapper[4723]: I0309 13:22:24.342495 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:24 crc kubenswrapper[4723]: I0309 13:22:24.781742 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:22:24 crc kubenswrapper[4723]: I0309 13:22:24.798066 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7d4b578b96-kksd4" Mar 09 13:22:24 crc kubenswrapper[4723]: I0309 13:22:24.910198 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-677d745ffb-ng6tr"] Mar 09 13:22:24 crc kubenswrapper[4723]: I0309 13:22:24.910489 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-677d745ffb-ng6tr" podUID="ed96f382-04dd-41ec-b370-832266d07122" containerName="placement-log" containerID="cri-o://d04c43cd4468292b687fa6b635dd878c8d44b4defaf6b4c9adff867bb0b84ad2" gracePeriod=30 Mar 09 13:22:24 crc kubenswrapper[4723]: I0309 13:22:24.912118 4723 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/placement-677d745ffb-ng6tr" podUID="ed96f382-04dd-41ec-b370-832266d07122" containerName="placement-api" containerID="cri-o://8b43efdd93be446949623ace9492232e4959bc04ea236173db991fe5c509e83d" gracePeriod=30 Mar 09 13:22:25 crc kubenswrapper[4723]: I0309 13:22:25.368017 4723 generic.go:334] "Generic (PLEG): container finished" podID="ed96f382-04dd-41ec-b370-832266d07122" containerID="d04c43cd4468292b687fa6b635dd878c8d44b4defaf6b4c9adff867bb0b84ad2" exitCode=143 Mar 09 13:22:25 crc kubenswrapper[4723]: I0309 13:22:25.368116 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-677d745ffb-ng6tr" event={"ID":"ed96f382-04dd-41ec-b370-832266d07122","Type":"ContainerDied","Data":"d04c43cd4468292b687fa6b635dd878c8d44b4defaf6b4c9adff867bb0b84ad2"} Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.288130 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5d4f94b9d4-2l2jj"] Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.290369 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.303208 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5d4f94b9d4-2l2jj"] Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.342029 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-56845cc998-n2w92"] Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.345149 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.367806 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-57d4ddfcc7-js2kz"] Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.369675 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.402131 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-config-data-custom\") pod \"heat-cfnapi-57d4ddfcc7-js2kz\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.402236 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-config-data\") pod \"heat-api-56845cc998-n2w92\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.402322 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-config-data\") pod \"heat-engine-5d4f94b9d4-2l2jj\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.402390 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-combined-ca-bundle\") pod \"heat-engine-5d4f94b9d4-2l2jj\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.402413 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-combined-ca-bundle\") pod \"heat-api-56845cc998-n2w92\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.402606 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-config-data\") pod \"heat-cfnapi-57d4ddfcc7-js2kz\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.402647 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjkkd\" (UniqueName: \"kubernetes.io/projected/3a014777-41ba-4350-92ed-b6036f193d1c-kube-api-access-vjkkd\") pod \"heat-api-56845cc998-n2w92\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.402743 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-combined-ca-bundle\") pod \"heat-cfnapi-57d4ddfcc7-js2kz\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.402787 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkphm\" (UniqueName: 
\"kubernetes.io/projected/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-kube-api-access-rkphm\") pod \"heat-cfnapi-57d4ddfcc7-js2kz\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.402848 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-config-data-custom\") pod \"heat-api-56845cc998-n2w92\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.403009 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-config-data-custom\") pod \"heat-engine-5d4f94b9d4-2l2jj\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.403034 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nwqw\" (UniqueName: \"kubernetes.io/projected/227cded8-49e9-4484-94a3-5ffebb8e4e47-kube-api-access-4nwqw\") pod \"heat-engine-5d4f94b9d4-2l2jj\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.410069 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-56845cc998-n2w92"] Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.425918 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-57d4ddfcc7-js2kz"] Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.505624 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-config-data-custom\") pod \"heat-cfnapi-57d4ddfcc7-js2kz\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.506947 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-config-data\") pod \"heat-api-56845cc998-n2w92\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.507063 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-config-data\") pod \"heat-engine-5d4f94b9d4-2l2jj\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.507107 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-combined-ca-bundle\") pod \"heat-engine-5d4f94b9d4-2l2jj\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.507124 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-combined-ca-bundle\") pod \"heat-api-56845cc998-n2w92\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.507245 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-config-data\") pod \"heat-cfnapi-57d4ddfcc7-js2kz\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.507274 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjkkd\" (UniqueName: \"kubernetes.io/projected/3a014777-41ba-4350-92ed-b6036f193d1c-kube-api-access-vjkkd\") pod \"heat-api-56845cc998-n2w92\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.507336 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-combined-ca-bundle\") pod \"heat-cfnapi-57d4ddfcc7-js2kz\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.507405 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkphm\" (UniqueName: \"kubernetes.io/projected/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-kube-api-access-rkphm\") pod \"heat-cfnapi-57d4ddfcc7-js2kz\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.507457 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-config-data-custom\") pod \"heat-api-56845cc998-n2w92\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.507482 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-config-data-custom\") pod \"heat-engine-5d4f94b9d4-2l2jj\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.507496 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nwqw\" (UniqueName: \"kubernetes.io/projected/227cded8-49e9-4484-94a3-5ffebb8e4e47-kube-api-access-4nwqw\") pod \"heat-engine-5d4f94b9d4-2l2jj\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.515816 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-config-data-custom\") pod \"heat-api-56845cc998-n2w92\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.521455 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-combined-ca-bundle\") pod \"heat-cfnapi-57d4ddfcc7-js2kz\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.522700 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-combined-ca-bundle\") pod \"heat-api-56845cc998-n2w92\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.526193 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-config-data\") pod \"heat-engine-5d4f94b9d4-2l2jj\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.526974 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-config-data-custom\") pod \"heat-cfnapi-57d4ddfcc7-js2kz\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.528301 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-config-data\") pod \"heat-cfnapi-57d4ddfcc7-js2kz\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.528694 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-config-data-custom\") pod \"heat-engine-5d4f94b9d4-2l2jj\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.530471 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-config-data\") pod \"heat-api-56845cc998-n2w92\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.537244 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjkkd\" (UniqueName: \"kubernetes.io/projected/3a014777-41ba-4350-92ed-b6036f193d1c-kube-api-access-vjkkd\") pod \"heat-api-56845cc998-n2w92\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.542662 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkphm\" (UniqueName: \"kubernetes.io/projected/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-kube-api-access-rkphm\") pod \"heat-cfnapi-57d4ddfcc7-js2kz\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.543837 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nwqw\" (UniqueName: \"kubernetes.io/projected/227cded8-49e9-4484-94a3-5ffebb8e4e47-kube-api-access-4nwqw\") pod 
\"heat-engine-5d4f94b9d4-2l2jj\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.544955 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-combined-ca-bundle\") pod \"heat-engine-5d4f94b9d4-2l2jj\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.627599 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.670599 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:27 crc kubenswrapper[4723]: I0309 13:22:27.715497 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:28 crc kubenswrapper[4723]: I0309 13:22:28.420952 4723 generic.go:334] "Generic (PLEG): container finished" podID="ed96f382-04dd-41ec-b370-832266d07122" containerID="8b43efdd93be446949623ace9492232e4959bc04ea236173db991fe5c509e83d" exitCode=0 Mar 09 13:22:28 crc kubenswrapper[4723]: I0309 13:22:28.421035 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-677d745ffb-ng6tr" event={"ID":"ed96f382-04dd-41ec-b370-832266d07122","Type":"ContainerDied","Data":"8b43efdd93be446949623ace9492232e4959bc04ea236173db991fe5c509e83d"} Mar 09 13:22:28 crc kubenswrapper[4723]: I0309 13:22:28.577492 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-56845cc998-n2w92"] Mar 09 13:22:28 crc kubenswrapper[4723]: W0309 13:22:28.578489 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a014777_41ba_4350_92ed_b6036f193d1c.slice/crio-f0bff2f88e1ba6e6a79a102f91021a85545c7d11b3b7a78645cc007728ba6b86 WatchSource:0}: Error finding container f0bff2f88e1ba6e6a79a102f91021a85545c7d11b3b7a78645cc007728ba6b86: Status 404 returned error can't find the container with id f0bff2f88e1ba6e6a79a102f91021a85545c7d11b3b7a78645cc007728ba6b86 Mar 09 13:22:28 crc kubenswrapper[4723]: I0309 13:22:28.621836 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-57d4ddfcc7-js2kz"] Mar 09 13:22:28 crc kubenswrapper[4723]: I0309 13:22:28.684811 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5d4f94b9d4-2l2jj"] Mar 09 13:22:28 crc kubenswrapper[4723]: W0309 13:22:28.686934 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod227cded8_49e9_4484_94a3_5ffebb8e4e47.slice/crio-3867f6ec87f105547d9166c7c33cb5290d7ae9336cdb4843437eb5a35264341a WatchSource:0}: Error finding container 3867f6ec87f105547d9166c7c33cb5290d7ae9336cdb4843437eb5a35264341a: Status 404 returned error can't find the container with id 3867f6ec87f105547d9166c7c33cb5290d7ae9336cdb4843437eb5a35264341a Mar 09 13:22:28 crc kubenswrapper[4723]: I0309 13:22:28.970186 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:22:29 crc kubenswrapper[4723]: I0309 13:22:29.051374 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tfrlw"] Mar 09 
13:22:29 crc kubenswrapper[4723]: I0309 13:22:29.053291 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" podUID="4b35cba0-637b-481c-a44f-854ba4c3f86e" containerName="dnsmasq-dns" containerID="cri-o://9e3da25c95a32f77f29584cfcf43fa1688fa84629704fc08f032bef49a403eff" gracePeriod=10 Mar 09 13:22:29 crc kubenswrapper[4723]: I0309 13:22:29.458486 4723 generic.go:334] "Generic (PLEG): container finished" podID="4b35cba0-637b-481c-a44f-854ba4c3f86e" containerID="9e3da25c95a32f77f29584cfcf43fa1688fa84629704fc08f032bef49a403eff" exitCode=0 Mar 09 13:22:29 crc kubenswrapper[4723]: I0309 13:22:29.458564 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" event={"ID":"4b35cba0-637b-481c-a44f-854ba4c3f86e","Type":"ContainerDied","Data":"9e3da25c95a32f77f29584cfcf43fa1688fa84629704fc08f032bef49a403eff"} Mar 09 13:22:29 crc kubenswrapper[4723]: I0309 13:22:29.462639 4723 generic.go:334] "Generic (PLEG): container finished" podID="3a014777-41ba-4350-92ed-b6036f193d1c" containerID="99498a678481ffb7e1eeaed904c95b0323f70750e638d937b590852c554b6b8f" exitCode=1 Mar 09 13:22:29 crc kubenswrapper[4723]: I0309 13:22:29.463044 4723 scope.go:117] "RemoveContainer" containerID="99498a678481ffb7e1eeaed904c95b0323f70750e638d937b590852c554b6b8f" Mar 09 13:22:29 crc kubenswrapper[4723]: I0309 13:22:29.463541 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56845cc998-n2w92" event={"ID":"3a014777-41ba-4350-92ed-b6036f193d1c","Type":"ContainerDied","Data":"99498a678481ffb7e1eeaed904c95b0323f70750e638d937b590852c554b6b8f"} Mar 09 13:22:29 crc kubenswrapper[4723]: I0309 13:22:29.463572 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56845cc998-n2w92" event={"ID":"3a014777-41ba-4350-92ed-b6036f193d1c","Type":"ContainerStarted","Data":"f0bff2f88e1ba6e6a79a102f91021a85545c7d11b3b7a78645cc007728ba6b86"} Mar 09 13:22:29 crc kubenswrapper[4723]: I0309 13:22:29.475083 4723 generic.go:334] "Generic (PLEG): container finished" podID="88ef3786-4edc-4e35-b54c-ae1edbfb27ca" containerID="da4a0630f1ffaf294147e094d64c9e8ceced6a2b6526b66004cbf84bad23bb65" exitCode=1 Mar 09 13:22:29 crc kubenswrapper[4723]: I0309 13:22:29.475151 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" event={"ID":"88ef3786-4edc-4e35-b54c-ae1edbfb27ca","Type":"ContainerDied","Data":"da4a0630f1ffaf294147e094d64c9e8ceced6a2b6526b66004cbf84bad23bb65"} Mar 09 13:22:29 crc kubenswrapper[4723]: I0309 13:22:29.475178 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" event={"ID":"88ef3786-4edc-4e35-b54c-ae1edbfb27ca","Type":"ContainerStarted","Data":"1091e6cedffaa100a5d79cf651377901da0852d6aad342854d9a7bf07e6fa862"} Mar 09 13:22:29 crc kubenswrapper[4723]: I0309 13:22:29.475895 4723 scope.go:117] "RemoveContainer" containerID="da4a0630f1ffaf294147e094d64c9e8ceced6a2b6526b66004cbf84bad23bb65" Mar 09 13:22:29 crc kubenswrapper[4723]: I0309 13:22:29.494721 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5d4f94b9d4-2l2jj" event={"ID":"227cded8-49e9-4484-94a3-5ffebb8e4e47","Type":"ContainerStarted","Data":"3867f6ec87f105547d9166c7c33cb5290d7ae9336cdb4843437eb5a35264341a"} Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.201266 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 13:22:30 crc 
kubenswrapper[4723]: I0309 13:22:30.201891 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2e90760e-0ff0-4195-8bb9-d32fe674feb5" containerName="glance-log" containerID="cri-o://dbc687e3211b92eb828ad0207767237f7fe0ccc94f29cfde15316b72d8d5efbf" gracePeriod=30 Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.201964 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2e90760e-0ff0-4195-8bb9-d32fe674feb5" containerName="glance-httpd" containerID="cri-o://f76c6b69b519dcbc0aca84270d607b9fdb4f7fc3927531998d5b7deedd7f3f99" gracePeriod=30 Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.330321 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" podUID="4b35cba0-637b-481c-a44f-854ba4c3f86e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.212:5353: connect: connection refused" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.487614 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-59fcdfbdd7-xjdk4"] Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.487933 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-59fcdfbdd7-xjdk4" podUID="e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff" containerName="heat-api" containerID="cri-o://4447d520e7430d1b1fa51a0b30066b24df42ad8a3421fec79d57677c43332d8e" gracePeriod=60 Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.515961 4723 generic.go:334] "Generic (PLEG): container finished" podID="2e90760e-0ff0-4195-8bb9-d32fe674feb5" containerID="dbc687e3211b92eb828ad0207767237f7fe0ccc94f29cfde15316b72d8d5efbf" exitCode=143 Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.516019 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2e90760e-0ff0-4195-8bb9-d32fe674feb5","Type":"ContainerDied","Data":"dbc687e3211b92eb828ad0207767237f7fe0ccc94f29cfde15316b72d8d5efbf"} Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.572875 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-68d98b8999-qqz47"] Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.574925 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.582431 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.582609 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.600423 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-68d98b8999-qqz47"] Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.602299 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-public-tls-certs\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.602434 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-combined-ca-bundle\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.602472 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbs6l\" (UniqueName: \"kubernetes.io/projected/a4841a92-8277-45f9-b366-8913a20ec8ad-kube-api-access-vbs6l\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.602522 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-config-data-custom\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.602634 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-config-data\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.602703 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-internal-tls-certs\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.606813 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-858f65d478-kpb9g"] Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.607061 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-858f65d478-kpb9g" podUID="679e1a62-9b64-4f60-a7b5-b218eed30fd7" containerName="heat-cfnapi" containerID="cri-o://9d3a0391a5f5505c131e4f1338948ab5d2b8697120ffd0291ccc063ee931352b" gracePeriod=60 Mar 09 13:22:30 crc 
kubenswrapper[4723]: I0309 13:22:30.630002 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-api-59fcdfbdd7-xjdk4" podUID="e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.231:8004/healthcheck\": EOF" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.630392 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-59fcdfbdd7-xjdk4" podUID="e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.231:8004/healthcheck\": EOF" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.662694 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6578b64f7d-9cxnx"] Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.664183 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.680509 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.680509 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.704085 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-cfnapi-858f65d478-kpb9g" podUID="679e1a62-9b64-4f60-a7b5-b218eed30fd7" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.232:8000/healthcheck\": EOF" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.704085 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-858f65d478-kpb9g" podUID="679e1a62-9b64-4f60-a7b5-b218eed30fd7" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.232:8000/healthcheck\": EOF" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.705744 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqxmp\" (UniqueName: \"kubernetes.io/projected/9b314084-941d-4d00-bae6-6fdce2dc24db-kube-api-access-rqxmp\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.705799 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-internal-tls-certs\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.705840 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-combined-ca-bundle\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.705986 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-public-tls-certs\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " 
pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.706067 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-config-data-custom\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.706162 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-combined-ca-bundle\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.706199 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbs6l\" (UniqueName: \"kubernetes.io/projected/a4841a92-8277-45f9-b366-8913a20ec8ad-kube-api-access-vbs6l\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.706250 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-config-data-custom\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.706304 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-public-tls-certs\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.706425 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-config-data\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.706495 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-internal-tls-certs\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.706557 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-config-data\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.711441 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6578b64f7d-9cxnx"] Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.727836 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-internal-tls-certs\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.728056 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-combined-ca-bundle\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.728352 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-public-tls-certs\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.728981 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-config-data-custom\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.742091 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-config-data\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.755735 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbs6l\" (UniqueName: \"kubernetes.io/projected/a4841a92-8277-45f9-b366-8913a20ec8ad-kube-api-access-vbs6l\") pod \"heat-api-68d98b8999-qqz47\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.808553 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-config-data\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.808690 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqxmp\" (UniqueName: \"kubernetes.io/projected/9b314084-941d-4d00-bae6-6fdce2dc24db-kube-api-access-rqxmp\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.808721 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-internal-tls-certs\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.808759 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-combined-ca-bundle\") pod 
\"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.808850 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-config-data-custom\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.809120 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-public-tls-certs\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.825410 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-config-data\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.833113 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-public-tls-certs\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.834392 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-config-data-custom\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.837548 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-combined-ca-bundle\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.843674 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-internal-tls-certs\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.845351 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqxmp\" (UniqueName: \"kubernetes.io/projected/9b314084-941d-4d00-bae6-6fdce2dc24db-kube-api-access-rqxmp\") pod \"heat-cfnapi-6578b64f7d-9cxnx\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:30 crc kubenswrapper[4723]: I0309 13:22:30.857387 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-858f65d478-kpb9g" podUID="679e1a62-9b64-4f60-a7b5-b218eed30fd7" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.232:8000/healthcheck\": EOF" Mar 09 13:22:30 
crc kubenswrapper[4723]: I0309 13:22:30.942801 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:31 crc kubenswrapper[4723]: I0309 13:22:31.007577 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:31 crc kubenswrapper[4723]: I0309 13:22:31.551965 4723 generic.go:334] "Generic (PLEG): container finished" podID="f180e14d-e014-49e0-8177-619c97476f71" containerID="0272fd11ac7fc956be7b2bcb860e5bf6408a5fd85e06c7da6adddcd36f4e70c4" exitCode=0 Mar 09 13:22:31 crc kubenswrapper[4723]: I0309 13:22:31.552127 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f180e14d-e014-49e0-8177-619c97476f71","Type":"ContainerDied","Data":"0272fd11ac7fc956be7b2bcb860e5bf6408a5fd85e06c7da6adddcd36f4e70c4"} Mar 09 13:22:32 crc kubenswrapper[4723]: I0309 13:22:32.671062 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:32 crc kubenswrapper[4723]: I0309 13:22:32.671377 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:32 crc kubenswrapper[4723]: I0309 13:22:32.716704 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:32 crc kubenswrapper[4723]: I0309 13:22:32.716749 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:33 crc kubenswrapper[4723]: I0309 13:22:33.602033 4723 generic.go:334] "Generic (PLEG): container finished" podID="2e90760e-0ff0-4195-8bb9-d32fe674feb5" containerID="f76c6b69b519dcbc0aca84270d607b9fdb4f7fc3927531998d5b7deedd7f3f99" exitCode=0 Mar 09 13:22:33 crc kubenswrapper[4723]: I0309 13:22:33.602223 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2e90760e-0ff0-4195-8bb9-d32fe674feb5","Type":"ContainerDied","Data":"f76c6b69b519dcbc0aca84270d607b9fdb4f7fc3927531998d5b7deedd7f3f99"} Mar 09 13:22:35 crc kubenswrapper[4723]: I0309 13:22:35.190825 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-59fcdfbdd7-xjdk4" podUID="e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.231:8004/healthcheck\": read tcp 10.217.0.2:43386->10.217.0.231:8004: read: connection reset by peer" Mar 09 13:22:35 crc kubenswrapper[4723]: I0309 13:22:35.191784 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-59fcdfbdd7-xjdk4" podUID="e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.231:8004/healthcheck\": dial tcp 10.217.0.231:8004: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4723]: I0309 13:22:35.330553 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" podUID="4b35cba0-637b-481c-a44f-854ba4c3f86e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.212:5353: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4723]: I0309 13:22:35.629141 4723 generic.go:334] "Generic (PLEG): container finished" podID="e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff" containerID="4447d520e7430d1b1fa51a0b30066b24df42ad8a3421fec79d57677c43332d8e" exitCode=0 Mar 09 13:22:35 crc 
kubenswrapper[4723]: I0309 13:22:35.629175 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59fcdfbdd7-xjdk4" event={"ID":"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff","Type":"ContainerDied","Data":"4447d520e7430d1b1fa51a0b30066b24df42ad8a3421fec79d57677c43332d8e"} Mar 09 13:22:35 crc kubenswrapper[4723]: I0309 13:22:35.938598 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.080592 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-internal-tls-certs\") pod \"ed96f382-04dd-41ec-b370-832266d07122\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.080647 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-scripts\") pod \"ed96f382-04dd-41ec-b370-832266d07122\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.080721 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed96f382-04dd-41ec-b370-832266d07122-logs\") pod \"ed96f382-04dd-41ec-b370-832266d07122\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.080755 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-config-data\") pod \"ed96f382-04dd-41ec-b370-832266d07122\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.080788 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb5cd\" (UniqueName: \"kubernetes.io/projected/ed96f382-04dd-41ec-b370-832266d07122-kube-api-access-tb5cd\") pod \"ed96f382-04dd-41ec-b370-832266d07122\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.080828 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-public-tls-certs\") pod \"ed96f382-04dd-41ec-b370-832266d07122\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.081085 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-combined-ca-bundle\") pod \"ed96f382-04dd-41ec-b370-832266d07122\" (UID: \"ed96f382-04dd-41ec-b370-832266d07122\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.082705 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed96f382-04dd-41ec-b370-832266d07122-logs" (OuterVolumeSpecName: "logs") pod "ed96f382-04dd-41ec-b370-832266d07122" (UID: "ed96f382-04dd-41ec-b370-832266d07122"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.103508 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed96f382-04dd-41ec-b370-832266d07122-kube-api-access-tb5cd" (OuterVolumeSpecName: "kube-api-access-tb5cd") pod "ed96f382-04dd-41ec-b370-832266d07122" (UID: "ed96f382-04dd-41ec-b370-832266d07122"). InnerVolumeSpecName "kube-api-access-tb5cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.113062 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-scripts" (OuterVolumeSpecName: "scripts") pod "ed96f382-04dd-41ec-b370-832266d07122" (UID: "ed96f382-04dd-41ec-b370-832266d07122"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.122177 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-858f65d478-kpb9g" podUID="679e1a62-9b64-4f60-a7b5-b218eed30fd7" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.232:8000/healthcheck\": read tcp 10.217.0.2:57666->10.217.0.232:8000: read: connection reset by peer" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.184982 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb5cd\" (UniqueName: \"kubernetes.io/projected/ed96f382-04dd-41ec-b370-832266d07122-kube-api-access-tb5cd\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.185323 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.185336 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed96f382-04dd-41ec-b370-832266d07122-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.288131 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed96f382-04dd-41ec-b370-832266d07122" (UID: "ed96f382-04dd-41ec-b370-832266d07122"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.314746 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-config-data" (OuterVolumeSpecName: "config-data") pod "ed96f382-04dd-41ec-b370-832266d07122" (UID: "ed96f382-04dd-41ec-b370-832266d07122"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.323986 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ed96f382-04dd-41ec-b370-832266d07122" (UID: "ed96f382-04dd-41ec-b370-832266d07122"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.360018 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ed96f382-04dd-41ec-b370-832266d07122" (UID: "ed96f382-04dd-41ec-b370-832266d07122"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.390324 4723 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.390356 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.390366 4723 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.390375 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed96f382-04dd-41ec-b370-832266d07122-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.685409 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.688562 4723 generic.go:334] "Generic (PLEG): container finished" podID="679e1a62-9b64-4f60-a7b5-b218eed30fd7" containerID="9d3a0391a5f5505c131e4f1338948ab5d2b8697120ffd0291ccc063ee931352b" exitCode=0 Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.688630 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-858f65d478-kpb9g" event={"ID":"679e1a62-9b64-4f60-a7b5-b218eed30fd7","Type":"ContainerDied","Data":"9d3a0391a5f5505c131e4f1338948ab5d2b8697120ffd0291ccc063ee931352b"} Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.695247 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.717826 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-677d745ffb-ng6tr" event={"ID":"ed96f382-04dd-41ec-b370-832266d07122","Type":"ContainerDied","Data":"108797fee4bc3e599d56d59fe81b93399c1c794db55cfea0cf31591e9e784560"} Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.717906 4723 scope.go:117] "RemoveContainer" containerID="8b43efdd93be446949623ace9492232e4959bc04ea236173db991fe5c509e83d" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.717922 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-677d745ffb-ng6tr" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.773277 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-677d745ffb-ng6tr"] Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.789631 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-677d745ffb-ng6tr"] Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.804097 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-dns-svc\") pod \"4b35cba0-637b-481c-a44f-854ba4c3f86e\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.804180 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-ovsdbserver-sb\") pod \"4b35cba0-637b-481c-a44f-854ba4c3f86e\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.804285 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-combined-ca-bundle\") pod \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.804339 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2psw5\" (UniqueName: \"kubernetes.io/projected/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-kube-api-access-2psw5\") pod \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.804372 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-config\") pod \"4b35cba0-637b-481c-a44f-854ba4c3f86e\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.804389 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-config-data-custom\") pod \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.804460 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-config-data\") pod \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\" (UID: \"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.804503 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-ovsdbserver-nb\") pod \"4b35cba0-637b-481c-a44f-854ba4c3f86e\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.804561 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlpmw\" (UniqueName: \"kubernetes.io/projected/4b35cba0-637b-481c-a44f-854ba4c3f86e-kube-api-access-jlpmw\") pod \"4b35cba0-637b-481c-a44f-854ba4c3f86e\" (UID: 
\"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.804619 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-dns-swift-storage-0\") pod \"4b35cba0-637b-481c-a44f-854ba4c3f86e\" (UID: \"4b35cba0-637b-481c-a44f-854ba4c3f86e\") " Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.814142 4723 scope.go:117] "RemoveContainer" containerID="d04c43cd4468292b687fa6b635dd878c8d44b4defaf6b4c9adff867bb0b84ad2" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.835137 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b35cba0-637b-481c-a44f-854ba4c3f86e-kube-api-access-jlpmw" (OuterVolumeSpecName: "kube-api-access-jlpmw") pod "4b35cba0-637b-481c-a44f-854ba4c3f86e" (UID: "4b35cba0-637b-481c-a44f-854ba4c3f86e"). InnerVolumeSpecName "kube-api-access-jlpmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.836494 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-kube-api-access-2psw5" (OuterVolumeSpecName: "kube-api-access-2psw5") pod "e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff" (UID: "e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff"). InnerVolumeSpecName "kube-api-access-2psw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.846653 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff" (UID: "e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.912833 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlpmw\" (UniqueName: \"kubernetes.io/projected/4b35cba0-637b-481c-a44f-854ba4c3f86e-kube-api-access-jlpmw\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.916215 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2psw5\" (UniqueName: \"kubernetes.io/projected/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-kube-api-access-2psw5\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.916232 4723 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.935724 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4b35cba0-637b-481c-a44f-854ba4c3f86e" (UID: "4b35cba0-637b-481c-a44f-854ba4c3f86e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.966658 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b35cba0-637b-481c-a44f-854ba4c3f86e" (UID: "4b35cba0-637b-481c-a44f-854ba4c3f86e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:36 crc kubenswrapper[4723]: I0309 13:22:36.975051 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed96f382-04dd-41ec-b370-832266d07122" path="/var/lib/kubelet/pods/ed96f382-04dd-41ec-b370-832266d07122/volumes" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.012855 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4b35cba0-637b-481c-a44f-854ba4c3f86e" (UID: "4b35cba0-637b-481c-a44f-854ba4c3f86e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.018813 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.018847 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.018873 4723 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.020113 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff" (UID: "e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.035567 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-config-data" (OuterVolumeSpecName: "config-data") pod "e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff" (UID: "e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.110077 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-config" (OuterVolumeSpecName: "config") pod "4b35cba0-637b-481c-a44f-854ba4c3f86e" (UID: "4b35cba0-637b-481c-a44f-854ba4c3f86e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.121839 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.121900 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.121914 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.136005 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4b35cba0-637b-481c-a44f-854ba4c3f86e" (UID: "4b35cba0-637b-481c-a44f-854ba4c3f86e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.232719 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b35cba0-637b-481c-a44f-854ba4c3f86e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.359089 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.442329 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f180e14d-e014-49e0-8177-619c97476f71-run-httpd\") pod \"f180e14d-e014-49e0-8177-619c97476f71\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.442456 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-combined-ca-bundle\") pod \"f180e14d-e014-49e0-8177-619c97476f71\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.443794 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f180e14d-e014-49e0-8177-619c97476f71-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f180e14d-e014-49e0-8177-619c97476f71" (UID: "f180e14d-e014-49e0-8177-619c97476f71"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.444429 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9bn8\" (UniqueName: \"kubernetes.io/projected/f180e14d-e014-49e0-8177-619c97476f71-kube-api-access-b9bn8\") pod \"f180e14d-e014-49e0-8177-619c97476f71\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.445015 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-sg-core-conf-yaml\") pod \"f180e14d-e014-49e0-8177-619c97476f71\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.446583 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-config-data\") pod \"f180e14d-e014-49e0-8177-619c97476f71\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.446715 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f180e14d-e014-49e0-8177-619c97476f71-log-httpd\") pod \"f180e14d-e014-49e0-8177-619c97476f71\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.447585 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-scripts\") pod \"f180e14d-e014-49e0-8177-619c97476f71\" (UID: \"f180e14d-e014-49e0-8177-619c97476f71\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.449191 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f180e14d-e014-49e0-8177-619c97476f71-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f180e14d-e014-49e0-8177-619c97476f71" (UID: "f180e14d-e014-49e0-8177-619c97476f71"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.458606 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.460234 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f180e14d-e014-49e0-8177-619c97476f71-kube-api-access-b9bn8" (OuterVolumeSpecName: "kube-api-access-b9bn8") pod "f180e14d-e014-49e0-8177-619c97476f71" (UID: "f180e14d-e014-49e0-8177-619c97476f71"). InnerVolumeSpecName "kube-api-access-b9bn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.461167 4723 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f180e14d-e014-49e0-8177-619c97476f71-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.461194 4723 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f180e14d-e014-49e0-8177-619c97476f71-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.461229 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9bn8\" (UniqueName: \"kubernetes.io/projected/f180e14d-e014-49e0-8177-619c97476f71-kube-api-access-b9bn8\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.461167 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-scripts" (OuterVolumeSpecName: "scripts") pod "f180e14d-e014-49e0-8177-619c97476f71" (UID: "f180e14d-e014-49e0-8177-619c97476f71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.493495 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.564600 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-config-data\") pod \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.564951 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-combined-ca-bundle\") pod \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.565006 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e90760e-0ff0-4195-8bb9-d32fe674feb5-httpd-run\") pod \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.566075 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.566136 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-scripts\") pod \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.566229 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-internal-tls-certs\") pod \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\" (UID: 
\"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.566276 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-config-data\") pod \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.566378 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd2px\" (UniqueName: \"kubernetes.io/projected/679e1a62-9b64-4f60-a7b5-b218eed30fd7-kube-api-access-bd2px\") pod \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.566435 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e90760e-0ff0-4195-8bb9-d32fe674feb5-logs\") pod \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.566489 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-config-data-custom\") pod \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.566643 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf57h\" (UniqueName: \"kubernetes.io/projected/2e90760e-0ff0-4195-8bb9-d32fe674feb5-kube-api-access-jf57h\") pod \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.566718 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-combined-ca-bundle\") pod \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\" (UID: \"679e1a62-9b64-4f60-a7b5-b218eed30fd7\") " Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.567637 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.574032 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e90760e-0ff0-4195-8bb9-d32fe674feb5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2e90760e-0ff0-4195-8bb9-d32fe674feb5" (UID: "2e90760e-0ff0-4195-8bb9-d32fe674feb5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.581234 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e90760e-0ff0-4195-8bb9-d32fe674feb5-logs" (OuterVolumeSpecName: "logs") pod "2e90760e-0ff0-4195-8bb9-d32fe674feb5" (UID: "2e90760e-0ff0-4195-8bb9-d32fe674feb5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.612799 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "679e1a62-9b64-4f60-a7b5-b218eed30fd7" (UID: "679e1a62-9b64-4f60-a7b5-b218eed30fd7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.613326 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e90760e-0ff0-4195-8bb9-d32fe674feb5-kube-api-access-jf57h" (OuterVolumeSpecName: "kube-api-access-jf57h") pod "2e90760e-0ff0-4195-8bb9-d32fe674feb5" (UID: "2e90760e-0ff0-4195-8bb9-d32fe674feb5"). InnerVolumeSpecName "kube-api-access-jf57h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.619600 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/679e1a62-9b64-4f60-a7b5-b218eed30fd7-kube-api-access-bd2px" (OuterVolumeSpecName: "kube-api-access-bd2px") pod "679e1a62-9b64-4f60-a7b5-b218eed30fd7" (UID: "679e1a62-9b64-4f60-a7b5-b218eed30fd7"). InnerVolumeSpecName "kube-api-access-bd2px". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.624808 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-scripts" (OuterVolumeSpecName: "scripts") pod "2e90760e-0ff0-4195-8bb9-d32fe674feb5" (UID: "2e90760e-0ff0-4195-8bb9-d32fe674feb5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.653565 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6578b64f7d-9cxnx"] Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.672104 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-68d98b8999-qqz47"] Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.673123 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd2px\" (UniqueName: \"kubernetes.io/projected/679e1a62-9b64-4f60-a7b5-b218eed30fd7-kube-api-access-bd2px\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.673163 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e90760e-0ff0-4195-8bb9-d32fe674feb5-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.673177 4723 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.673187 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf57h\" (UniqueName: \"kubernetes.io/projected/2e90760e-0ff0-4195-8bb9-d32fe674feb5-kube-api-access-jf57h\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.673198 4723 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e90760e-0ff0-4195-8bb9-d32fe674feb5-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.673207 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.740679 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pppx5" event={"ID":"7e8ed559-11fc-4511-9258-1681da84b5cd","Type":"ContainerStarted","Data":"1da2ab881be2e547108284e873355eae3dc6a7acd1feaa32a1e59f89b44f94c5"} Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.762043 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59fcdfbdd7-xjdk4" event={"ID":"e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff","Type":"ContainerDied","Data":"69de9708249aa7f9a8ab14ef505ce762c78a9d4f8dc77326498c26302c868709"} Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.762107 4723 scope.go:117] "RemoveContainer" containerID="4447d520e7430d1b1fa51a0b30066b24df42ad8a3421fec79d57677c43332d8e" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.762254 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-59fcdfbdd7-xjdk4" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.814715 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-858f65d478-kpb9g" event={"ID":"679e1a62-9b64-4f60-a7b5-b218eed30fd7","Type":"ContainerDied","Data":"cc71cd76c6b1574ceb478fb4da66268c523aa779e217be3aa239b656aa9efe86"} Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.814928 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-858f65d478-kpb9g" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.823404 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-68d98b8999-qqz47" event={"ID":"a4841a92-8277-45f9-b366-8913a20ec8ad","Type":"ContainerStarted","Data":"397c9ac834a8653b4e8065ceeef8bd7bd7300bf37d394d257680654a0671cba1"} Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.834568 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" event={"ID":"9b314084-941d-4d00-bae6-6fdce2dc24db","Type":"ContainerStarted","Data":"4c88b4129ecc293ac1b2d7276595947520931a6f4a60127f8b69ca4b33cbd0eb"} Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.856733 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f180e14d-e014-49e0-8177-619c97476f71","Type":"ContainerDied","Data":"1b2ec9758db1896f3384b5dead357cce3c85b04be6a57559fe0b79bee18d74e1"} Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.856897 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.871310 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.874208 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2e90760e-0ff0-4195-8bb9-d32fe674feb5","Type":"ContainerDied","Data":"3d4fc48d171500cfc6491ae632987877a3921d836a072fead19695bd557ed8f3"} Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.874421 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.876307 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" event={"ID":"4b35cba0-637b-481c-a44f-854ba4c3f86e","Type":"ContainerDied","Data":"07f20a88afb2cc0d902f2d35535087a280d283443326b12f79be1f06cc1f5354"} Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.881917 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-tfrlw" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.890278 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" podStartSLOduration=10.890253721 podStartE2EDuration="10.890253721s" podCreationTimestamp="2026-03-09 13:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:37.889018838 +0000 UTC m=+1431.903486388" watchObservedRunningTime="2026-03-09 13:22:37.890253721 +0000 UTC m=+1431.904721271" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.903906 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-pppx5" podStartSLOduration=3.76160225 podStartE2EDuration="17.90388569s" podCreationTimestamp="2026-03-09 13:22:20 +0000 UTC" firstStartedPulling="2026-03-09 13:22:22.592311203 +0000 UTC m=+1416.606778743" lastFinishedPulling="2026-03-09 13:22:36.734594633 +0000 UTC m=+1430.749062183" observedRunningTime="2026-03-09 13:22:37.769703843 +0000 UTC m=+1431.784171373" watchObservedRunningTime="2026-03-09 13:22:37.90388569 +0000 UTC m=+1431.918353230" Mar 09 13:22:37 crc kubenswrapper[4723]: I0309 13:22:37.988426 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f180e14d-e014-49e0-8177-619c97476f71" (UID: "f180e14d-e014-49e0-8177-619c97476f71"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.045254 4723 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77 podName:2e90760e-0ff0-4195-8bb9-d32fe674feb5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:38.545224275 +0000 UTC m=+1432.559691825 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "glance" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77") pod "2e90760e-0ff0-4195-8bb9-d32fe674feb5" (UID: "2e90760e-0ff0-4195-8bb9-d32fe674feb5") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.072413 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e90760e-0ff0-4195-8bb9-d32fe674feb5" (UID: "2e90760e-0ff0-4195-8bb9-d32fe674feb5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.103637 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.103668 4723 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.112921 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-config-data" (OuterVolumeSpecName: "config-data") pod "2e90760e-0ff0-4195-8bb9-d32fe674feb5" (UID: "2e90760e-0ff0-4195-8bb9-d32fe674feb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.124223 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "679e1a62-9b64-4f60-a7b5-b218eed30fd7" (UID: "679e1a62-9b64-4f60-a7b5-b218eed30fd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.136215 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2e90760e-0ff0-4195-8bb9-d32fe674feb5" (UID: "2e90760e-0ff0-4195-8bb9-d32fe674feb5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.168442 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f180e14d-e014-49e0-8177-619c97476f71" (UID: "f180e14d-e014-49e0-8177-619c97476f71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.172339 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-config-data" (OuterVolumeSpecName: "config-data") pod "679e1a62-9b64-4f60-a7b5-b218eed30fd7" (UID: "679e1a62-9b64-4f60-a7b5-b218eed30fd7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.205648 4723 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.205674 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.205683 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679e1a62-9b64-4f60-a7b5-b218eed30fd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.205694 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.205702 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e90760e-0ff0-4195-8bb9-d32fe674feb5-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.216454 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-config-data" (OuterVolumeSpecName: "config-data") pod "f180e14d-e014-49e0-8177-619c97476f71" (UID: "f180e14d-e014-49e0-8177-619c97476f71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.308241 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f180e14d-e014-49e0-8177-619c97476f71-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.461936 4723 scope.go:117] "RemoveContainer" containerID="9d3a0391a5f5505c131e4f1338948ab5d2b8697120ffd0291ccc063ee931352b" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.487958 4723 scope.go:117] "RemoveContainer" containerID="e6145fa800d76acbd5fdbb41cc97956a662917864102c6930bc905db1cca3f70" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.516504 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tfrlw"] Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.524657 4723 scope.go:117] "RemoveContainer" containerID="673b7722468a4748c2c77474ebc301afdf3c4a61f1b63c8bdf3239852eb96420" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.597086 4723 scope.go:117] "RemoveContainer" containerID="7b23dca3800cbe823fc146c2ca4068e4badd3ebd69cc7d2a607209bd762fe9c3" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.617814 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\" (UID: \"2e90760e-0ff0-4195-8bb9-d32fe674feb5\") " Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.622981 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tfrlw"] Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.634032 4723 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-59fcdfbdd7-xjdk4"] Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.659367 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-59fcdfbdd7-xjdk4"] Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.660109 4723 scope.go:117] "RemoveContainer" containerID="0272fd11ac7fc956be7b2bcb860e5bf6408a5fd85e06c7da6adddcd36f4e70c4" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.674138 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.685594 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77" (OuterVolumeSpecName: "glance") pod "2e90760e-0ff0-4195-8bb9-d32fe674feb5" (UID: "2e90760e-0ff0-4195-8bb9-d32fe674feb5"). InnerVolumeSpecName "pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.691570 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.706459 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.706905 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b35cba0-637b-481c-a44f-854ba4c3f86e" containerName="init" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.706924 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b35cba0-637b-481c-a44f-854ba4c3f86e" containerName="init" Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.706938 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="sg-core" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.706945 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="sg-core" Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.706960 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b35cba0-637b-481c-a44f-854ba4c3f86e" containerName="dnsmasq-dns" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.706967 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b35cba0-637b-481c-a44f-854ba4c3f86e" containerName="dnsmasq-dns" Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.706976 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed96f382-04dd-41ec-b370-832266d07122" containerName="placement-log" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.706983 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed96f382-04dd-41ec-b370-832266d07122" containerName="placement-log" Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.706998 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e90760e-0ff0-4195-8bb9-d32fe674feb5" containerName="glance-log" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707005 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e90760e-0ff0-4195-8bb9-d32fe674feb5" containerName="glance-log" Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.707020 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed96f382-04dd-41ec-b370-832266d07122" containerName="placement-api" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707027 4723 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ed96f382-04dd-41ec-b370-832266d07122" containerName="placement-api" Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.707046 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="ceilometer-notification-agent" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707055 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="ceilometer-notification-agent" Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.707069 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e90760e-0ff0-4195-8bb9-d32fe674feb5" containerName="glance-httpd" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707075 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e90760e-0ff0-4195-8bb9-d32fe674feb5" containerName="glance-httpd" Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.707089 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="679e1a62-9b64-4f60-a7b5-b218eed30fd7" containerName="heat-cfnapi" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707095 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="679e1a62-9b64-4f60-a7b5-b218eed30fd7" containerName="heat-cfnapi" Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.707114 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="proxy-httpd" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707121 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="proxy-httpd" Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.707146 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="ceilometer-central-agent" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707153 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="ceilometer-central-agent" Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.707169 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff" containerName="heat-api" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707175 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff" containerName="heat-api" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707404 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff" containerName="heat-api" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707425 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="proxy-httpd" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707442 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e90760e-0ff0-4195-8bb9-d32fe674feb5" containerName="glance-httpd" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707458 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e90760e-0ff0-4195-8bb9-d32fe674feb5" containerName="glance-log" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707465 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b35cba0-637b-481c-a44f-854ba4c3f86e" containerName="dnsmasq-dns" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 
13:22:38.707473 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="ceilometer-central-agent" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707481 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="679e1a62-9b64-4f60-a7b5-b218eed30fd7" containerName="heat-cfnapi" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707490 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed96f382-04dd-41ec-b370-832266d07122" containerName="placement-log" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707499 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="sg-core" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707506 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed96f382-04dd-41ec-b370-832266d07122" containerName="placement-api" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.707517 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f180e14d-e014-49e0-8177-619c97476f71" containerName="ceilometer-notification-agent" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.709114 4723 scope.go:117] "RemoveContainer" containerID="f76c6b69b519dcbc0aca84270d607b9fdb4f7fc3927531998d5b7deedd7f3f99" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.710465 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.719734 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.719939 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.721918 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-858f65d478-kpb9g"] Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.730798 4723 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") on node \"crc\" " Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.735028 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-858f65d478-kpb9g"] Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.750919 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.767083 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.767319 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2f94cb12-90f7-4a5a-9da4-6520946b46be" containerName="glance-log" containerID="cri-o://0d93448417f193ea01c5258f51e9033e70438056797884f157722a8c20b6b8e2" gracePeriod=30 Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.767438 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2f94cb12-90f7-4a5a-9da4-6520946b46be" containerName="glance-httpd" containerID="cri-o://fadc14fdef084d5e31ef0114295b47042e605593134aed02c3c1b972b615ceee" gracePeriod=30 Mar 09 13:22:38 crc 
kubenswrapper[4723]: I0309 13:22:38.776895 4723 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.777073 4723 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77") on node "crc" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.790237 4723 scope.go:117] "RemoveContainer" containerID="dbc687e3211b92eb828ad0207767237f7fe0ccc94f29cfde15316b72d8d5efbf" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.834304 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.834400 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e185e279-02e9-4b2a-a0bc-edd302907c9c-log-httpd\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.834461 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.834492 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cgkh\" (UniqueName: \"kubernetes.io/projected/e185e279-02e9-4b2a-a0bc-edd302907c9c-kube-api-access-5cgkh\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.834522 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-scripts\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.834620 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e185e279-02e9-4b2a-a0bc-edd302907c9c-run-httpd\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.834636 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-config-data\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.834745 4723 reconciler_common.go:293] "Volume detached for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") on node \"crc\" 
DevicePath \"\"" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.841998 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.859915 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.874782 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.876654 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.881473 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.888122 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.909613 4723 scope.go:117] "RemoveContainer" containerID="9e3da25c95a32f77f29584cfcf43fa1688fa84629704fc08f032bef49a403eff" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.936272 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.936320 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e185e279-02e9-4b2a-a0bc-edd302907c9c-log-httpd\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.936393 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.936422 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cgkh\" (UniqueName: \"kubernetes.io/projected/e185e279-02e9-4b2a-a0bc-edd302907c9c-kube-api-access-5cgkh\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.936464 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-scripts\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.936542 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e185e279-02e9-4b2a-a0bc-edd302907c9c-run-httpd\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.936557 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-config-data\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.938726 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e90760e-0ff0-4195-8bb9-d32fe674feb5" path="/var/lib/kubelet/pods/2e90760e-0ff0-4195-8bb9-d32fe674feb5/volumes" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.939674 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b35cba0-637b-481c-a44f-854ba4c3f86e" path="/var/lib/kubelet/pods/4b35cba0-637b-481c-a44f-854ba4c3f86e/volumes" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.940472 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="679e1a62-9b64-4f60-a7b5-b218eed30fd7" path="/var/lib/kubelet/pods/679e1a62-9b64-4f60-a7b5-b218eed30fd7/volumes" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.941846 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff" path="/var/lib/kubelet/pods/e4a23d7d-3b71-4dd1-8d45-468c3a96b0ff/volumes" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.942945 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f180e14d-e014-49e0-8177-619c97476f71" path="/var/lib/kubelet/pods/f180e14d-e014-49e0-8177-619c97476f71/volumes" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.943985 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5d4f94b9d4-2l2jj" event={"ID":"227cded8-49e9-4484-94a3-5ffebb8e4e47","Type":"ContainerStarted","Data":"233f93a7e129cda557de9345227cb5c5f963726f0bf1ef8d1bfb537e3258e03d"} Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.944050 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.944068 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.944088 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.944764 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e185e279-02e9-4b2a-a0bc-edd302907c9c-log-httpd\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.950942 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.953993 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e185e279-02e9-4b2a-a0bc-edd302907c9c-run-httpd\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.954229 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cgkh\" (UniqueName: 
\"kubernetes.io/projected/e185e279-02e9-4b2a-a0bc-edd302907c9c-kube-api-access-5cgkh\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.958840 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" event={"ID":"9b314084-941d-4d00-bae6-6fdce2dc24db","Type":"ContainerStarted","Data":"1848579cb3ae52f7204c15281b7aa115c813e778cf2db883f38374cd706d1d90"} Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.960378 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.960766 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.961202 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-config-data\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.973041 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-scripts\") pod \"ceilometer-0\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " pod="openstack/ceilometer-0" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.976178 4723 generic.go:334] "Generic (PLEG): container finished" podID="3a014777-41ba-4350-92ed-b6036f193d1c" containerID="3e2aa2a75f6e5f55fa01a216c46d670ae6ea8fcd7821e7f7c8628131f41808d2" exitCode=1 Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.976399 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56845cc998-n2w92" event={"ID":"3a014777-41ba-4350-92ed-b6036f193d1c","Type":"ContainerDied","Data":"3e2aa2a75f6e5f55fa01a216c46d670ae6ea8fcd7821e7f7c8628131f41808d2"} Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.977577 4723 scope.go:117] "RemoveContainer" containerID="3e2aa2a75f6e5f55fa01a216c46d670ae6ea8fcd7821e7f7c8628131f41808d2" Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.978143 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-56845cc998-n2w92_openstack(3a014777-41ba-4350-92ed-b6036f193d1c)\"" pod="openstack/heat-api-56845cc998-n2w92" podUID="3a014777-41ba-4350-92ed-b6036f193d1c" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.978720 4723 generic.go:334] "Generic (PLEG): container finished" podID="88ef3786-4edc-4e35-b54c-ae1edbfb27ca" containerID="e5ccfaec3a6781c5a52929c2d577af0275cd7269876705d10f7847b6a74a961d" exitCode=1 Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.978790 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" event={"ID":"88ef3786-4edc-4e35-b54c-ae1edbfb27ca","Type":"ContainerDied","Data":"e5ccfaec3a6781c5a52929c2d577af0275cd7269876705d10f7847b6a74a961d"} Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.979270 4723 scope.go:117] "RemoveContainer" 
containerID="e5ccfaec3a6781c5a52929c2d577af0275cd7269876705d10f7847b6a74a961d" Mar 09 13:22:38 crc kubenswrapper[4723]: E0309 13:22:38.979487 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-57d4ddfcc7-js2kz_openstack(88ef3786-4edc-4e35-b54c-ae1edbfb27ca)\"" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" podUID="88ef3786-4edc-4e35-b54c-ae1edbfb27ca" Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.996963 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-68d98b8999-qqz47" event={"ID":"a4841a92-8277-45f9-b366-8913a20ec8ad","Type":"ContainerStarted","Data":"39f1a1293b39599f2405d7759e1a23a7dee3c5c94e095be6f36696d10beb6dee"} Mar 09 13:22:38 crc kubenswrapper[4723]: I0309 13:22:38.999535 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5d4f94b9d4-2l2jj" podStartSLOduration=11.999522056 podStartE2EDuration="11.999522056s" podCreationTimestamp="2026-03-09 13:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:38.971889947 +0000 UTC m=+1432.986357487" watchObservedRunningTime="2026-03-09 13:22:38.999522056 +0000 UTC m=+1433.013989596" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.035752 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" podStartSLOduration=9.03573591 podStartE2EDuration="9.03573591s" podCreationTimestamp="2026-03-09 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:39.012385305 +0000 UTC m=+1433.026852845" watchObservedRunningTime="2026-03-09 13:22:39.03573591 +0000 UTC m=+1433.050203450" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.053380 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c3ccb2-5302-4c07-98d7-2e4a9267f423-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.053673 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27c3ccb2-5302-4c07-98d7-2e4a9267f423-scripts\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.054011 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/27c3ccb2-5302-4c07-98d7-2e4a9267f423-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.055381 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27c3ccb2-5302-4c07-98d7-2e4a9267f423-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " 
pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.055444 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvnzk\" (UniqueName: \"kubernetes.io/projected/27c3ccb2-5302-4c07-98d7-2e4a9267f423-kube-api-access-bvnzk\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.055627 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.055842 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27c3ccb2-5302-4c07-98d7-2e4a9267f423-config-data\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.055970 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27c3ccb2-5302-4c07-98d7-2e4a9267f423-logs\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.094323 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.144677 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-68d98b8999-qqz47" podStartSLOduration=9.144656761 podStartE2EDuration="9.144656761s" podCreationTimestamp="2026-03-09 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:39.091247183 +0000 UTC m=+1433.105714723" watchObservedRunningTime="2026-03-09 13:22:39.144656761 +0000 UTC m=+1433.159124301" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.149448 4723 scope.go:117] "RemoveContainer" containerID="3becec47c87fe6f0880331e7522b6d5a33482764cfb17be205a1568f7e3eab21" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.158266 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.158518 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27c3ccb2-5302-4c07-98d7-2e4a9267f423-config-data\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.158571 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27c3ccb2-5302-4c07-98d7-2e4a9267f423-logs\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.158600 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c3ccb2-5302-4c07-98d7-2e4a9267f423-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.158776 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27c3ccb2-5302-4c07-98d7-2e4a9267f423-scripts\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.158811 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/27c3ccb2-5302-4c07-98d7-2e4a9267f423-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.158874 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27c3ccb2-5302-4c07-98d7-2e4a9267f423-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc 
kubenswrapper[4723]: I0309 13:22:39.159020 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvnzk\" (UniqueName: \"kubernetes.io/projected/27c3ccb2-5302-4c07-98d7-2e4a9267f423-kube-api-access-bvnzk\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.164838 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27c3ccb2-5302-4c07-98d7-2e4a9267f423-config-data\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.165138 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27c3ccb2-5302-4c07-98d7-2e4a9267f423-logs\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.167497 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c3ccb2-5302-4c07-98d7-2e4a9267f423-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.183177 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/27c3ccb2-5302-4c07-98d7-2e4a9267f423-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0" Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.183526 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.183562 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2988164fdbbdc0e6befebfe68058338f4fd9913a4b29356e34a132113ba27e6b/globalmount\"" pod="openstack/glance-default-internal-api-0"
Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.183882 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27c3ccb2-5302-4c07-98d7-2e4a9267f423-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0"
Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.188419 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27c3ccb2-5302-4c07-98d7-2e4a9267f423-scripts\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0"
Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.196465 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvnzk\" (UniqueName: \"kubernetes.io/projected/27c3ccb2-5302-4c07-98d7-2e4a9267f423-kube-api-access-bvnzk\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0"
Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.312852 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c93877a3-be1d-4f5b-9fa6-375a3db60f77\") pod \"glance-default-internal-api-0\" (UID: \"27c3ccb2-5302-4c07-98d7-2e4a9267f423\") " pod="openstack/glance-default-internal-api-0"
Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.329479 4723 scope.go:117] "RemoveContainer" containerID="99498a678481ffb7e1eeaed904c95b0323f70750e638d937b590852c554b6b8f"
Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.375516 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.434746 4723 scope.go:117] "RemoveContainer" containerID="da4a0630f1ffaf294147e094d64c9e8ceced6a2b6526b66004cbf84bad23bb65"
Mar 09 13:22:39 crc kubenswrapper[4723]: I0309 13:22:39.792061 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:22:39 crc kubenswrapper[4723]: W0309 13:22:39.797915 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode185e279_02e9_4b2a_a0bc_edd302907c9c.slice/crio-f7e991cabde4fe3b3ce342e722b7df9c5442844e4d4ffd0dd9c822c71888bb65 WatchSource:0}: Error finding container f7e991cabde4fe3b3ce342e722b7df9c5442844e4d4ffd0dd9c822c71888bb65: Status 404 returned error can't find the container with id f7e991cabde4fe3b3ce342e722b7df9c5442844e4d4ffd0dd9c822c71888bb65
Mar 09 13:22:40 crc kubenswrapper[4723]: I0309 13:22:40.010203 4723 scope.go:117] "RemoveContainer" containerID="e5ccfaec3a6781c5a52929c2d577af0275cd7269876705d10f7847b6a74a961d"
Mar 09 13:22:40 crc kubenswrapper[4723]: I0309 13:22:40.010604 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e185e279-02e9-4b2a-a0bc-edd302907c9c","Type":"ContainerStarted","Data":"f7e991cabde4fe3b3ce342e722b7df9c5442844e4d4ffd0dd9c822c71888bb65"}
Mar 09 13:22:40 crc kubenswrapper[4723]: E0309 13:22:40.010789 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-57d4ddfcc7-js2kz_openstack(88ef3786-4edc-4e35-b54c-ae1edbfb27ca)\"" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" podUID="88ef3786-4edc-4e35-b54c-ae1edbfb27ca"
Mar 09 13:22:40 crc kubenswrapper[4723]: I0309 13:22:40.017657 4723 generic.go:334] "Generic (PLEG): container finished" podID="2f94cb12-90f7-4a5a-9da4-6520946b46be" containerID="0d93448417f193ea01c5258f51e9033e70438056797884f157722a8c20b6b8e2" exitCode=143
Mar 09 13:22:40 crc kubenswrapper[4723]: I0309 13:22:40.017694 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f94cb12-90f7-4a5a-9da4-6520946b46be","Type":"ContainerDied","Data":"0d93448417f193ea01c5258f51e9033e70438056797884f157722a8c20b6b8e2"}
Mar 09 13:22:40 crc kubenswrapper[4723]: I0309 13:22:40.024076 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-68d98b8999-qqz47"
Mar 09 13:22:40 crc kubenswrapper[4723]: I0309 13:22:40.024346 4723 scope.go:117] "RemoveContainer" containerID="3e2aa2a75f6e5f55fa01a216c46d670ae6ea8fcd7821e7f7c8628131f41808d2"
Mar 09 13:22:40 crc kubenswrapper[4723]: E0309 13:22:40.024659 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-56845cc998-n2w92_openstack(3a014777-41ba-4350-92ed-b6036f193d1c)\"" pod="openstack/heat-api-56845cc998-n2w92" podUID="3a014777-41ba-4350-92ed-b6036f193d1c"
Mar 09 13:22:40 crc kubenswrapper[4723]: W0309 13:22:40.085355 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27c3ccb2_5302_4c07_98d7_2e4a9267f423.slice/crio-89e1488a1cf9d8a5bd0870a079dfcac2cff6a6da346d7498afe03ce896d02426 WatchSource:0}: Error finding container 89e1488a1cf9d8a5bd0870a079dfcac2cff6a6da346d7498afe03ce896d02426: Status 404 returned error can't find the container with id 89e1488a1cf9d8a5bd0870a079dfcac2cff6a6da346d7498afe03ce896d02426
Mar 09 13:22:40 crc kubenswrapper[4723]: I0309 13:22:40.089590 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 09 13:22:41 crc kubenswrapper[4723]: I0309 13:22:41.046449 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e185e279-02e9-4b2a-a0bc-edd302907c9c","Type":"ContainerStarted","Data":"3779de9321416ef1a7dcab60b032dad7dbd6d7e4e149e246a3872b84914c4237"}
Mar 09 13:22:41 crc kubenswrapper[4723]: I0309 13:22:41.053046 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"27c3ccb2-5302-4c07-98d7-2e4a9267f423","Type":"ContainerStarted","Data":"d1ccdf3b5ecdb54fe993cd9b4aefc6f633d0593b0bf10aa4b23dbdeeb8bbef2f"}
Mar 09 13:22:41 crc kubenswrapper[4723]: I0309 13:22:41.053143 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"27c3ccb2-5302-4c07-98d7-2e4a9267f423","Type":"ContainerStarted","Data":"89e1488a1cf9d8a5bd0870a079dfcac2cff6a6da346d7498afe03ce896d02426"}
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.070091 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"27c3ccb2-5302-4c07-98d7-2e4a9267f423","Type":"ContainerStarted","Data":"57c848bc3afd776818395eb844d95ba06c93c386e91ff74203be3b751d7f2e72"}
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.073052 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e185e279-02e9-4b2a-a0bc-edd302907c9c","Type":"ContainerStarted","Data":"eb3756fafc9219ca50e44aeddb90125beb80a27e146322ffb383179dc607ffc5"}
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.075367 4723 generic.go:334] "Generic (PLEG): container finished" podID="2f94cb12-90f7-4a5a-9da4-6520946b46be" containerID="fadc14fdef084d5e31ef0114295b47042e605593134aed02c3c1b972b615ceee" exitCode=0
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.075396 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f94cb12-90f7-4a5a-9da4-6520946b46be","Type":"ContainerDied","Data":"fadc14fdef084d5e31ef0114295b47042e605593134aed02c3c1b972b615ceee"}
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.671399 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-56845cc998-n2w92"
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.671719 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-56845cc998-n2w92"
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.672666 4723 scope.go:117] "RemoveContainer" containerID="3e2aa2a75f6e5f55fa01a216c46d670ae6ea8fcd7821e7f7c8628131f41808d2"
Mar 09 13:22:42 crc kubenswrapper[4723]: E0309 13:22:42.673103 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-56845cc998-n2w92_openstack(3a014777-41ba-4350-92ed-b6036f193d1c)\"" pod="openstack/heat-api-56845cc998-n2w92" podUID="3a014777-41ba-4350-92ed-b6036f193d1c"
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.716944 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz"
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.718265 4723 scope.go:117] "RemoveContainer" containerID="e5ccfaec3a6781c5a52929c2d577af0275cd7269876705d10f7847b6a74a961d"
Mar 09 13:22:42 crc kubenswrapper[4723]: E0309 13:22:42.718487 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-57d4ddfcc7-js2kz_openstack(88ef3786-4edc-4e35-b54c-ae1edbfb27ca)\"" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" podUID="88ef3786-4edc-4e35-b54c-ae1edbfb27ca"
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.720967 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.739894 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.739875502 podStartE2EDuration="4.739875502s" podCreationTimestamp="2026-03-09 13:22:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:42.096809244 +0000 UTC m=+1436.111276784" watchObservedRunningTime="2026-03-09 13:22:42.739875502 +0000 UTC m=+1436.754343042"
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.760588 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f94cb12-90f7-4a5a-9da4-6520946b46be-logs\") pod \"2f94cb12-90f7-4a5a-9da4-6520946b46be\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") "
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.760660 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-combined-ca-bundle\") pod \"2f94cb12-90f7-4a5a-9da4-6520946b46be\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") "
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.760899 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-public-tls-certs\") pod \"2f94cb12-90f7-4a5a-9da4-6520946b46be\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") "
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.761107 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f94cb12-90f7-4a5a-9da4-6520946b46be-logs" (OuterVolumeSpecName: "logs") pod "2f94cb12-90f7-4a5a-9da4-6520946b46be" (UID: "2f94cb12-90f7-4a5a-9da4-6520946b46be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.761458 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-scripts\") pod \"2f94cb12-90f7-4a5a-9da4-6520946b46be\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") "
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.761551 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f94cb12-90f7-4a5a-9da4-6520946b46be-httpd-run\") pod \"2f94cb12-90f7-4a5a-9da4-6520946b46be\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") "
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.761585 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4wkw\" (UniqueName: \"kubernetes.io/projected/2f94cb12-90f7-4a5a-9da4-6520946b46be-kube-api-access-m4wkw\") pod \"2f94cb12-90f7-4a5a-9da4-6520946b46be\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") "
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.761920 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f94cb12-90f7-4a5a-9da4-6520946b46be-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2f94cb12-90f7-4a5a-9da4-6520946b46be" (UID: "2f94cb12-90f7-4a5a-9da4-6520946b46be"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.762934 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") pod \"2f94cb12-90f7-4a5a-9da4-6520946b46be\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") "
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.763047 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-config-data\") pod \"2f94cb12-90f7-4a5a-9da4-6520946b46be\" (UID: \"2f94cb12-90f7-4a5a-9da4-6520946b46be\") "
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.764149 4723 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f94cb12-90f7-4a5a-9da4-6520946b46be-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.764183 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f94cb12-90f7-4a5a-9da4-6520946b46be-logs\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.791770 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-scripts" (OuterVolumeSpecName: "scripts") pod "2f94cb12-90f7-4a5a-9da4-6520946b46be" (UID: "2f94cb12-90f7-4a5a-9da4-6520946b46be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.831227 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f94cb12-90f7-4a5a-9da4-6520946b46be-kube-api-access-m4wkw" (OuterVolumeSpecName: "kube-api-access-m4wkw") pod "2f94cb12-90f7-4a5a-9da4-6520946b46be" (UID: "2f94cb12-90f7-4a5a-9da4-6520946b46be"). InnerVolumeSpecName "kube-api-access-m4wkw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.862001 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440" (OuterVolumeSpecName: "glance") pod "2f94cb12-90f7-4a5a-9da4-6520946b46be" (UID: "2f94cb12-90f7-4a5a-9da4-6520946b46be"). InnerVolumeSpecName "pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.868779 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.868811 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4wkw\" (UniqueName: \"kubernetes.io/projected/2f94cb12-90f7-4a5a-9da4-6520946b46be-kube-api-access-m4wkw\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.868843 4723 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") on node \"crc\" "
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.872152 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f94cb12-90f7-4a5a-9da4-6520946b46be" (UID: "2f94cb12-90f7-4a5a-9da4-6520946b46be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.913383 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-config-data" (OuterVolumeSpecName: "config-data") pod "2f94cb12-90f7-4a5a-9da4-6520946b46be" (UID: "2f94cb12-90f7-4a5a-9da4-6520946b46be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.914182 4723 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.914323 4723 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440") on node "crc"
Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.926081 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2f94cb12-90f7-4a5a-9da4-6520946b46be" (UID: "2f94cb12-90f7-4a5a-9da4-6520946b46be"). InnerVolumeSpecName "public-tls-certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.970656 4723 reconciler_common.go:293] "Volume detached for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.970690 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.970700 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:42 crc kubenswrapper[4723]: I0309 13:22:42.970709 4723 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f94cb12-90f7-4a5a-9da4-6520946b46be-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.087724 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f94cb12-90f7-4a5a-9da4-6520946b46be","Type":"ContainerDied","Data":"77b9a9f3676621c7a67c476085a7467e59a8e369c486a3c383f88a71bc9b33e2"} Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.087751 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.087787 4723 scope.go:117] "RemoveContainer" containerID="fadc14fdef084d5e31ef0114295b47042e605593134aed02c3c1b972b615ceee" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.090978 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e185e279-02e9-4b2a-a0bc-edd302907c9c","Type":"ContainerStarted","Data":"5068d2eebbeafa26cfbcdb01f31d16ca8bed49bf4d9e0a308b9d470b51d18486"} Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.131458 4723 scope.go:117] "RemoveContainer" containerID="0d93448417f193ea01c5258f51e9033e70438056797884f157722a8c20b6b8e2" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.167900 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.202704 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.222599 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 13:22:43 crc kubenswrapper[4723]: E0309 13:22:43.223287 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f94cb12-90f7-4a5a-9da4-6520946b46be" containerName="glance-httpd" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.223318 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f94cb12-90f7-4a5a-9da4-6520946b46be" containerName="glance-httpd" Mar 09 13:22:43 crc kubenswrapper[4723]: E0309 13:22:43.223363 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f94cb12-90f7-4a5a-9da4-6520946b46be" containerName="glance-log" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.223369 4723 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2f94cb12-90f7-4a5a-9da4-6520946b46be" containerName="glance-log" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.223569 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f94cb12-90f7-4a5a-9da4-6520946b46be" containerName="glance-httpd" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.223597 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f94cb12-90f7-4a5a-9da4-6520946b46be" containerName="glance-log" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.224816 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.227123 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.229643 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.236033 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.277042 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f2s5\" (UniqueName: \"kubernetes.io/projected/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-kube-api-access-7f2s5\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.277105 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.277129 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.277162 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-logs\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.277185 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-scripts\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.277206 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.277498 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.277606 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.378768 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.378828 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.378915 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f2s5\" (UniqueName: \"kubernetes.io/projected/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-kube-api-access-7f2s5\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.378954 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.378980 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.379014 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-logs\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.379039 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-scripts\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.379069 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-config-data\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.379999 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.380206 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-logs\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.386518 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.386536 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.387150 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.387161 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-config-data\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.387189 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c915ec56eda8abd87809fc9b9acb00675f31cc32ff5e20cf139093d267df65a0/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.393899 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-scripts\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.396221 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f2s5\" (UniqueName: \"kubernetes.io/projected/a40a3caa-3f07-4139-9ca7-0daa1fbc2806-kube-api-access-7f2s5\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.430577 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970052f0-8f5a-4d9c-9fec-85f44ded7440\") pod \"glance-default-external-api-0\" (UID: \"a40a3caa-3f07-4139-9ca7-0daa1fbc2806\") " pod="openstack/glance-default-external-api-0" Mar 09 13:22:43 crc kubenswrapper[4723]: I0309 13:22:43.548589 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 13:22:44 crc kubenswrapper[4723]: W0309 13:22:44.179185 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda40a3caa_3f07_4139_9ca7_0daa1fbc2806.slice/crio-7fe58ce1080fa6cbeca240859a26a1834add30c015d0de14893a640e6c8f5471 WatchSource:0}: Error finding container 7fe58ce1080fa6cbeca240859a26a1834add30c015d0de14893a640e6c8f5471: Status 404 returned error can't find the container with id 7fe58ce1080fa6cbeca240859a26a1834add30c015d0de14893a640e6c8f5471 Mar 09 13:22:44 crc kubenswrapper[4723]: I0309 13:22:44.179700 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 13:22:44 crc kubenswrapper[4723]: I0309 13:22:44.898379 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f94cb12-90f7-4a5a-9da4-6520946b46be" path="/var/lib/kubelet/pods/2f94cb12-90f7-4a5a-9da4-6520946b46be/volumes" Mar 09 13:22:45 crc kubenswrapper[4723]: I0309 13:22:45.135403 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e185e279-02e9-4b2a-a0bc-edd302907c9c","Type":"ContainerStarted","Data":"90ef697bc9033fdd9fe7af7cf982e4d905864fd9856482733e202f8e9ac2c2e5"} Mar 09 13:22:45 crc kubenswrapper[4723]: I0309 13:22:45.136107 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:22:45 crc kubenswrapper[4723]: I0309 13:22:45.147443 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a40a3caa-3f07-4139-9ca7-0daa1fbc2806","Type":"ContainerStarted","Data":"55da4537d0fbf96feb12c56e4c14609eeba4eacb3c8be46db0633198208a7231"} Mar 09 13:22:45 crc kubenswrapper[4723]: I0309 13:22:45.147501 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a40a3caa-3f07-4139-9ca7-0daa1fbc2806","Type":"ContainerStarted","Data":"7fe58ce1080fa6cbeca240859a26a1834add30c015d0de14893a640e6c8f5471"} Mar 09 13:22:45 crc kubenswrapper[4723]: I0309 13:22:45.178320 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.341904924 podStartE2EDuration="7.178301706s" podCreationTimestamp="2026-03-09 13:22:38 +0000 UTC" firstStartedPulling="2026-03-09 13:22:39.801484671 +0000 UTC m=+1433.815952211" lastFinishedPulling="2026-03-09 13:22:44.637881453 +0000 UTC m=+1438.652348993" observedRunningTime="2026-03-09 13:22:45.162224303 +0000 UTC m=+1439.176691853" watchObservedRunningTime="2026-03-09 13:22:45.178301706 +0000 UTC m=+1439.192769246" Mar 09 13:22:46 crc kubenswrapper[4723]: I0309 13:22:46.162962 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a40a3caa-3f07-4139-9ca7-0daa1fbc2806","Type":"ContainerStarted","Data":"411d3f2545c2299e312c823514379b1f509dda61780173730b576c4de6ba9b29"} Mar 09 13:22:46 crc kubenswrapper[4723]: I0309 13:22:46.187753 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.18773734 podStartE2EDuration="3.18773734s" podCreationTimestamp="2026-03-09 13:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:46.180666534 +0000 UTC m=+1440.195134064" 
watchObservedRunningTime="2026-03-09 13:22:46.18773734 +0000 UTC m=+1440.202204880" Mar 09 13:22:47 crc kubenswrapper[4723]: I0309 13:22:47.472930 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:22:47 crc kubenswrapper[4723]: I0309 13:22:47.548529 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-56845cc998-n2w92"] Mar 09 13:22:47 crc kubenswrapper[4723]: I0309 13:22:47.955289 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.034420 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-57d4ddfcc7-js2kz"] Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.163161 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.204477 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56845cc998-n2w92" event={"ID":"3a014777-41ba-4350-92ed-b6036f193d1c","Type":"ContainerDied","Data":"f0bff2f88e1ba6e6a79a102f91021a85545c7d11b3b7a78645cc007728ba6b86"} Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.204524 4723 scope.go:117] "RemoveContainer" containerID="3e2aa2a75f6e5f55fa01a216c46d670ae6ea8fcd7821e7f7c8628131f41808d2" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.204644 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-56845cc998-n2w92" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.209052 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-combined-ca-bundle\") pod \"3a014777-41ba-4350-92ed-b6036f193d1c\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.210486 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-config-data\") pod \"3a014777-41ba-4350-92ed-b6036f193d1c\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.267189 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a014777-41ba-4350-92ed-b6036f193d1c" (UID: "3a014777-41ba-4350-92ed-b6036f193d1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.313954 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-config-data-custom\") pod \"3a014777-41ba-4350-92ed-b6036f193d1c\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.314330 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjkkd\" (UniqueName: \"kubernetes.io/projected/3a014777-41ba-4350-92ed-b6036f193d1c-kube-api-access-vjkkd\") pod \"3a014777-41ba-4350-92ed-b6036f193d1c\" (UID: \"3a014777-41ba-4350-92ed-b6036f193d1c\") " Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.315568 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-config-data" (OuterVolumeSpecName: "config-data") pod "3a014777-41ba-4350-92ed-b6036f193d1c" (UID: "3a014777-41ba-4350-92ed-b6036f193d1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.317681 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.318077 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.320374 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3a014777-41ba-4350-92ed-b6036f193d1c" (UID: "3a014777-41ba-4350-92ed-b6036f193d1c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.321447 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a014777-41ba-4350-92ed-b6036f193d1c-kube-api-access-vjkkd" (OuterVolumeSpecName: "kube-api-access-vjkkd") pod "3a014777-41ba-4350-92ed-b6036f193d1c" (UID: "3a014777-41ba-4350-92ed-b6036f193d1c"). InnerVolumeSpecName "kube-api-access-vjkkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.420687 4723 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a014777-41ba-4350-92ed-b6036f193d1c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.420741 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjkkd\" (UniqueName: \"kubernetes.io/projected/3a014777-41ba-4350-92ed-b6036f193d1c-kube-api-access-vjkkd\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.450131 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.522742 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkphm\" (UniqueName: \"kubernetes.io/projected/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-kube-api-access-rkphm\") pod \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.524269 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-combined-ca-bundle\") pod \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.525638 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-config-data\") pod \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.525800 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-config-data-custom\") pod \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\" (UID: \"88ef3786-4edc-4e35-b54c-ae1edbfb27ca\") " Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.534194 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-kube-api-access-rkphm" (OuterVolumeSpecName: "kube-api-access-rkphm") pod "88ef3786-4edc-4e35-b54c-ae1edbfb27ca" (UID: "88ef3786-4edc-4e35-b54c-ae1edbfb27ca"). InnerVolumeSpecName "kube-api-access-rkphm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.541636 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "88ef3786-4edc-4e35-b54c-ae1edbfb27ca" (UID: "88ef3786-4edc-4e35-b54c-ae1edbfb27ca"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.555815 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-56845cc998-n2w92"] Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.576556 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-56845cc998-n2w92"] Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.579601 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88ef3786-4edc-4e35-b54c-ae1edbfb27ca" (UID: "88ef3786-4edc-4e35-b54c-ae1edbfb27ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.599128 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-config-data" (OuterVolumeSpecName: "config-data") pod "88ef3786-4edc-4e35-b54c-ae1edbfb27ca" (UID: "88ef3786-4edc-4e35-b54c-ae1edbfb27ca"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.628917 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.629107 4723 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.629166 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkphm\" (UniqueName: \"kubernetes.io/projected/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-kube-api-access-rkphm\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.629220 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ef3786-4edc-4e35-b54c-ae1edbfb27ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:48 crc kubenswrapper[4723]: I0309 13:22:48.899177 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a014777-41ba-4350-92ed-b6036f193d1c" path="/var/lib/kubelet/pods/3a014777-41ba-4350-92ed-b6036f193d1c/volumes" Mar 09 13:22:49 crc kubenswrapper[4723]: I0309 13:22:49.226348 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" Mar 09 13:22:49 crc kubenswrapper[4723]: I0309 13:22:49.226442 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-57d4ddfcc7-js2kz" event={"ID":"88ef3786-4edc-4e35-b54c-ae1edbfb27ca","Type":"ContainerDied","Data":"1091e6cedffaa100a5d79cf651377901da0852d6aad342854d9a7bf07e6fa862"} Mar 09 13:22:49 crc kubenswrapper[4723]: I0309 13:22:49.226563 4723 scope.go:117] "RemoveContainer" containerID="e5ccfaec3a6781c5a52929c2d577af0275cd7269876705d10f7847b6a74a961d" Mar 09 13:22:49 crc kubenswrapper[4723]: I0309 13:22:49.266967 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-57d4ddfcc7-js2kz"] Mar 09 13:22:49 crc kubenswrapper[4723]: I0309 13:22:49.281304 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-57d4ddfcc7-js2kz"] Mar 09 13:22:49 crc kubenswrapper[4723]: I0309 13:22:49.376495 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 13:22:49 crc kubenswrapper[4723]: I0309 13:22:49.376532 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 13:22:49 crc kubenswrapper[4723]: I0309 13:22:49.415285 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 13:22:49 crc kubenswrapper[4723]: I0309 13:22:49.456586 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 13:22:50 crc kubenswrapper[4723]: I0309 13:22:50.241920 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 13:22:50 crc kubenswrapper[4723]: I0309 13:22:50.242256 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 13:22:50 crc 
kubenswrapper[4723]: I0309 13:22:50.896240 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ef3786-4edc-4e35-b54c-ae1edbfb27ca" path="/var/lib/kubelet/pods/88ef3786-4edc-4e35-b54c-ae1edbfb27ca/volumes" Mar 09 13:22:53 crc kubenswrapper[4723]: I0309 13:22:53.549132 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 13:22:53 crc kubenswrapper[4723]: I0309 13:22:53.549738 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 13:22:53 crc kubenswrapper[4723]: I0309 13:22:53.586522 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 13:22:53 crc kubenswrapper[4723]: I0309 13:22:53.611645 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 13:22:54 crc kubenswrapper[4723]: I0309 13:22:54.259155 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:54 crc kubenswrapper[4723]: I0309 13:22:54.259700 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="ceilometer-central-agent" containerID="cri-o://3779de9321416ef1a7dcab60b032dad7dbd6d7e4e149e246a3872b84914c4237" gracePeriod=30 Mar 09 13:22:54 crc kubenswrapper[4723]: I0309 13:22:54.259854 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="proxy-httpd" containerID="cri-o://90ef697bc9033fdd9fe7af7cf982e4d905864fd9856482733e202f8e9ac2c2e5" gracePeriod=30 Mar 09 13:22:54 crc kubenswrapper[4723]: I0309 13:22:54.259951 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="sg-core" containerID="cri-o://5068d2eebbeafa26cfbcdb01f31d16ca8bed49bf4d9e0a308b9d470b51d18486" gracePeriod=30 Mar 09 13:22:54 crc kubenswrapper[4723]: I0309 13:22:54.259984 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="ceilometer-notification-agent" containerID="cri-o://eb3756fafc9219ca50e44aeddb90125beb80a27e146322ffb383179dc607ffc5" gracePeriod=30 Mar 09 13:22:54 crc kubenswrapper[4723]: I0309 13:22:54.279778 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 13:22:54 crc kubenswrapper[4723]: I0309 13:22:54.283175 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 13:22:54 crc kubenswrapper[4723]: I0309 13:22:54.283363 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 13:22:54 crc kubenswrapper[4723]: I0309 13:22:54.743098 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 13:22:54 crc kubenswrapper[4723]: I0309 13:22:54.743519 4723 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 13:22:54 crc kubenswrapper[4723]: I0309 13:22:54.745406 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 
13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.304027 4723 generic.go:334] "Generic (PLEG): container finished" podID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerID="90ef697bc9033fdd9fe7af7cf982e4d905864fd9856482733e202f8e9ac2c2e5" exitCode=0 Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.304457 4723 generic.go:334] "Generic (PLEG): container finished" podID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerID="5068d2eebbeafa26cfbcdb01f31d16ca8bed49bf4d9e0a308b9d470b51d18486" exitCode=2 Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.304469 4723 generic.go:334] "Generic (PLEG): container finished" podID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerID="eb3756fafc9219ca50e44aeddb90125beb80a27e146322ffb383179dc607ffc5" exitCode=0 Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.304477 4723 generic.go:334] "Generic (PLEG): container finished" podID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerID="3779de9321416ef1a7dcab60b032dad7dbd6d7e4e149e246a3872b84914c4237" exitCode=0 Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.304102 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e185e279-02e9-4b2a-a0bc-edd302907c9c","Type":"ContainerDied","Data":"90ef697bc9033fdd9fe7af7cf982e4d905864fd9856482733e202f8e9ac2c2e5"} Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.304600 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e185e279-02e9-4b2a-a0bc-edd302907c9c","Type":"ContainerDied","Data":"5068d2eebbeafa26cfbcdb01f31d16ca8bed49bf4d9e0a308b9d470b51d18486"} Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.304674 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e185e279-02e9-4b2a-a0bc-edd302907c9c","Type":"ContainerDied","Data":"eb3756fafc9219ca50e44aeddb90125beb80a27e146322ffb383179dc607ffc5"} Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.304685 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e185e279-02e9-4b2a-a0bc-edd302907c9c","Type":"ContainerDied","Data":"3779de9321416ef1a7dcab60b032dad7dbd6d7e4e149e246a3872b84914c4237"} Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.307699 4723 generic.go:334] "Generic (PLEG): container finished" podID="7e8ed559-11fc-4511-9258-1681da84b5cd" containerID="1da2ab881be2e547108284e873355eae3dc6a7acd1feaa32a1e59f89b44f94c5" exitCode=0 Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.307779 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pppx5" event={"ID":"7e8ed559-11fc-4511-9258-1681da84b5cd","Type":"ContainerDied","Data":"1da2ab881be2e547108284e873355eae3dc6a7acd1feaa32a1e59f89b44f94c5"} Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.582945 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.703041 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-config-data\") pod \"e185e279-02e9-4b2a-a0bc-edd302907c9c\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.703136 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-sg-core-conf-yaml\") pod \"e185e279-02e9-4b2a-a0bc-edd302907c9c\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.703162 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-combined-ca-bundle\") pod \"e185e279-02e9-4b2a-a0bc-edd302907c9c\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.703179 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cgkh\" (UniqueName: \"kubernetes.io/projected/e185e279-02e9-4b2a-a0bc-edd302907c9c-kube-api-access-5cgkh\") pod \"e185e279-02e9-4b2a-a0bc-edd302907c9c\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.703226 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e185e279-02e9-4b2a-a0bc-edd302907c9c-log-httpd\") pod \"e185e279-02e9-4b2a-a0bc-edd302907c9c\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.703247 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e185e279-02e9-4b2a-a0bc-edd302907c9c-run-httpd\") pod \"e185e279-02e9-4b2a-a0bc-edd302907c9c\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.703323 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-scripts\") pod \"e185e279-02e9-4b2a-a0bc-edd302907c9c\" (UID: \"e185e279-02e9-4b2a-a0bc-edd302907c9c\") " Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.703706 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e185e279-02e9-4b2a-a0bc-edd302907c9c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e185e279-02e9-4b2a-a0bc-edd302907c9c" (UID: "e185e279-02e9-4b2a-a0bc-edd302907c9c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.703800 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e185e279-02e9-4b2a-a0bc-edd302907c9c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e185e279-02e9-4b2a-a0bc-edd302907c9c" (UID: "e185e279-02e9-4b2a-a0bc-edd302907c9c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.704591 4723 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e185e279-02e9-4b2a-a0bc-edd302907c9c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.704612 4723 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e185e279-02e9-4b2a-a0bc-edd302907c9c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.713070 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-scripts" (OuterVolumeSpecName: "scripts") pod "e185e279-02e9-4b2a-a0bc-edd302907c9c" (UID: "e185e279-02e9-4b2a-a0bc-edd302907c9c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.715336 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e185e279-02e9-4b2a-a0bc-edd302907c9c-kube-api-access-5cgkh" (OuterVolumeSpecName: "kube-api-access-5cgkh") pod "e185e279-02e9-4b2a-a0bc-edd302907c9c" (UID: "e185e279-02e9-4b2a-a0bc-edd302907c9c"). InnerVolumeSpecName "kube-api-access-5cgkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.757730 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e185e279-02e9-4b2a-a0bc-edd302907c9c" (UID: "e185e279-02e9-4b2a-a0bc-edd302907c9c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.801774 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e185e279-02e9-4b2a-a0bc-edd302907c9c" (UID: "e185e279-02e9-4b2a-a0bc-edd302907c9c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.806557 4723 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.806590 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.806601 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cgkh\" (UniqueName: \"kubernetes.io/projected/e185e279-02e9-4b2a-a0bc-edd302907c9c-kube-api-access-5cgkh\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.806614 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.855631 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-config-data" (OuterVolumeSpecName: "config-data") pod "e185e279-02e9-4b2a-a0bc-edd302907c9c" (UID: "e185e279-02e9-4b2a-a0bc-edd302907c9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:55 crc kubenswrapper[4723]: I0309 13:22:55.908486 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e185e279-02e9-4b2a-a0bc-edd302907c9c-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.330277 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.334119 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e185e279-02e9-4b2a-a0bc-edd302907c9c","Type":"ContainerDied","Data":"f7e991cabde4fe3b3ce342e722b7df9c5442844e4d4ffd0dd9c822c71888bb65"} Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.334166 4723 scope.go:117] "RemoveContainer" containerID="90ef697bc9033fdd9fe7af7cf982e4d905864fd9856482733e202f8e9ac2c2e5" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.382856 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.386631 4723 scope.go:117] "RemoveContainer" containerID="5068d2eebbeafa26cfbcdb01f31d16ca8bed49bf4d9e0a308b9d470b51d18486" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.395591 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.483529 4723 scope.go:117] "RemoveContainer" containerID="eb3756fafc9219ca50e44aeddb90125beb80a27e146322ffb383179dc607ffc5" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.488801 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:56 crc kubenswrapper[4723]: E0309 13:22:56.491936 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="sg-core" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.491966 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="sg-core" Mar 09 13:22:56 crc kubenswrapper[4723]: E0309 13:22:56.491984 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ef3786-4edc-4e35-b54c-ae1edbfb27ca" containerName="heat-cfnapi" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.491990 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ef3786-4edc-4e35-b54c-ae1edbfb27ca" containerName="heat-cfnapi" Mar 09 13:22:56 crc kubenswrapper[4723]: E0309 13:22:56.492010 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a014777-41ba-4350-92ed-b6036f193d1c" containerName="heat-api" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.492016 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a014777-41ba-4350-92ed-b6036f193d1c" containerName="heat-api" Mar 09 13:22:56 crc kubenswrapper[4723]: E0309 13:22:56.492025 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="ceilometer-notification-agent" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.492049 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="ceilometer-notification-agent" Mar 09 13:22:56 crc kubenswrapper[4723]: E0309 13:22:56.492059 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="ceilometer-central-agent" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.492065 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="ceilometer-central-agent" Mar 09 13:22:56 crc kubenswrapper[4723]: E0309 13:22:56.492084 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a014777-41ba-4350-92ed-b6036f193d1c" containerName="heat-api" Mar 09 13:22:56 crc 
kubenswrapper[4723]: I0309 13:22:56.492090 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a014777-41ba-4350-92ed-b6036f193d1c" containerName="heat-api" Mar 09 13:22:56 crc kubenswrapper[4723]: E0309 13:22:56.492104 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="proxy-httpd" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.492110 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="proxy-httpd" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.499983 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="sg-core" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.500127 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a014777-41ba-4350-92ed-b6036f193d1c" containerName="heat-api" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.500143 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="ceilometer-notification-agent" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.500163 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ef3786-4edc-4e35-b54c-ae1edbfb27ca" containerName="heat-cfnapi" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.500188 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a014777-41ba-4350-92ed-b6036f193d1c" containerName="heat-api" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.500200 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="proxy-httpd" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.500210 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" containerName="ceilometer-central-agent" Mar 09 13:22:56 crc kubenswrapper[4723]: E0309 13:22:56.500967 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ef3786-4edc-4e35-b54c-ae1edbfb27ca" containerName="heat-cfnapi" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.500982 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ef3786-4edc-4e35-b54c-ae1edbfb27ca" containerName="heat-cfnapi" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.501219 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ef3786-4edc-4e35-b54c-ae1edbfb27ca" containerName="heat-cfnapi" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.503140 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.507263 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.507317 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.517126 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.568811 4723 scope.go:117] "RemoveContainer" containerID="3779de9321416ef1a7dcab60b032dad7dbd6d7e4e149e246a3872b84914c4237" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.627655 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-run-httpd\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.627991 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-scripts\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.628109 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.628224 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-config-data\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.628330 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-log-httpd\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.628413 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.628479 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfcll\" (UniqueName: \"kubernetes.io/projected/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-kube-api-access-tfcll\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.718337 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 
13:22:56.720224 4723 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.730142 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.730189 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-config-data\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.730243 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-log-httpd\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.730266 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.730284 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfcll\" (UniqueName: \"kubernetes.io/projected/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-kube-api-access-tfcll\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.730365 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-run-httpd\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.730410 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-scripts\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.736473 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.737054 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-log-httpd\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.737273 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-run-httpd\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.738235 4723 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.749326 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-config-data\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.761419 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfcll\" (UniqueName: \"kubernetes.io/projected/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-kube-api-access-tfcll\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.769029 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.776151 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-scripts\") pod \"ceilometer-0\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.861689 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:22:56 crc kubenswrapper[4723]: I0309 13:22:56.925690 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e185e279-02e9-4b2a-a0bc-edd302907c9c" path="/var/lib/kubelet/pods/e185e279-02e9-4b2a-a0bc-edd302907c9c/volumes" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.049783 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.143412 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-config-data\") pod \"7e8ed559-11fc-4511-9258-1681da84b5cd\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.143535 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-combined-ca-bundle\") pod \"7e8ed559-11fc-4511-9258-1681da84b5cd\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.144045 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-scripts\") pod \"7e8ed559-11fc-4511-9258-1681da84b5cd\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.144147 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfngs\" (UniqueName: \"kubernetes.io/projected/7e8ed559-11fc-4511-9258-1681da84b5cd-kube-api-access-jfngs\") pod \"7e8ed559-11fc-4511-9258-1681da84b5cd\" (UID: \"7e8ed559-11fc-4511-9258-1681da84b5cd\") " Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.157191 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e8ed559-11fc-4511-9258-1681da84b5cd-kube-api-access-jfngs" (OuterVolumeSpecName: "kube-api-access-jfngs") pod "7e8ed559-11fc-4511-9258-1681da84b5cd" (UID: "7e8ed559-11fc-4511-9258-1681da84b5cd"). InnerVolumeSpecName "kube-api-access-jfngs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.165593 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-scripts" (OuterVolumeSpecName: "scripts") pod "7e8ed559-11fc-4511-9258-1681da84b5cd" (UID: "7e8ed559-11fc-4511-9258-1681da84b5cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.195039 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e8ed559-11fc-4511-9258-1681da84b5cd" (UID: "7e8ed559-11fc-4511-9258-1681da84b5cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.250207 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.250532 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.250547 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfngs\" (UniqueName: \"kubernetes.io/projected/7e8ed559-11fc-4511-9258-1681da84b5cd-kube-api-access-jfngs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.271016 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-config-data" (OuterVolumeSpecName: "config-data") pod "7e8ed559-11fc-4511-9258-1681da84b5cd" (UID: "7e8ed559-11fc-4511-9258-1681da84b5cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.352911 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e8ed559-11fc-4511-9258-1681da84b5cd-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.368465 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pppx5" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.369406 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pppx5" event={"ID":"7e8ed559-11fc-4511-9258-1681da84b5cd","Type":"ContainerDied","Data":"b7faf82aadc8e8983e17c07442f63ed164016bbbab33b67276e6987e42d185df"} Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.369432 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7faf82aadc8e8983e17c07442f63ed164016bbbab33b67276e6987e42d185df" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.439938 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 13:22:57 crc kubenswrapper[4723]: E0309 13:22:57.440529 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8ed559-11fc-4511-9258-1681da84b5cd" containerName="nova-cell0-conductor-db-sync" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.440563 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8ed559-11fc-4511-9258-1681da84b5cd" containerName="nova-cell0-conductor-db-sync" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.440848 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e8ed559-11fc-4511-9258-1681da84b5cd" containerName="nova-cell0-conductor-db-sync" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.441849 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.446130 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.446534 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jwtm7" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.483929 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.556967 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d4aa80-3125-4750-83a9-af82898eaaf6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a4d4aa80-3125-4750-83a9-af82898eaaf6\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.557014 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d4aa80-3125-4750-83a9-af82898eaaf6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a4d4aa80-3125-4750-83a9-af82898eaaf6\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.557141 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krl92\" (UniqueName: \"kubernetes.io/projected/a4d4aa80-3125-4750-83a9-af82898eaaf6-kube-api-access-krl92\") pod \"nova-cell0-conductor-0\" (UID: \"a4d4aa80-3125-4750-83a9-af82898eaaf6\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.570916 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.659602 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d4aa80-3125-4750-83a9-af82898eaaf6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a4d4aa80-3125-4750-83a9-af82898eaaf6\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.659918 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d4aa80-3125-4750-83a9-af82898eaaf6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a4d4aa80-3125-4750-83a9-af82898eaaf6\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.659981 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krl92\" (UniqueName: \"kubernetes.io/projected/a4d4aa80-3125-4750-83a9-af82898eaaf6-kube-api-access-krl92\") pod \"nova-cell0-conductor-0\" (UID: \"a4d4aa80-3125-4750-83a9-af82898eaaf6\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.670620 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d4aa80-3125-4750-83a9-af82898eaaf6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a4d4aa80-3125-4750-83a9-af82898eaaf6\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.671535 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d4aa80-3125-4750-83a9-af82898eaaf6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a4d4aa80-3125-4750-83a9-af82898eaaf6\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.686629 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krl92\" (UniqueName: \"kubernetes.io/projected/a4d4aa80-3125-4750-83a9-af82898eaaf6-kube-api-access-krl92\") pod \"nova-cell0-conductor-0\" (UID: \"a4d4aa80-3125-4750-83a9-af82898eaaf6\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.704029 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.781621 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-756d785958-qpphh"] Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.781804 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-756d785958-qpphh" podUID="d365869f-e896-43d2-80ab-520b5e71beae" containerName="heat-engine" containerID="cri-o://8e99ef46ce36916e98dcddaff58d8238fccadac0643c954b4d8d8ad6c6eefab6" gracePeriod=60 Mar 09 13:22:57 crc kubenswrapper[4723]: I0309 13:22:57.814424 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 13:22:58 crc kubenswrapper[4723]: I0309 13:22:58.433383 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8","Type":"ContainerStarted","Data":"dceaaf84d71a67b7c09a61e856d34a0f4fed9916d235ea6cfdf7d1222ed3bc39"} Mar 09 13:22:58 crc kubenswrapper[4723]: I0309 13:22:58.433723 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8","Type":"ContainerStarted","Data":"06d8f92ac2d7cf3c85dcd1f111b13008adf09650b861e6362031a089f1cee78c"} Mar 09 13:22:58 crc kubenswrapper[4723]: I0309 13:22:58.433783 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 13:22:58 crc kubenswrapper[4723]: E0309 13:22:58.801269 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e99ef46ce36916e98dcddaff58d8238fccadac0643c954b4d8d8ad6c6eefab6" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 13:22:58 crc kubenswrapper[4723]: E0309 13:22:58.803417 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e99ef46ce36916e98dcddaff58d8238fccadac0643c954b4d8d8ad6c6eefab6" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 13:22:58 crc kubenswrapper[4723]: E0309 13:22:58.805706 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e99ef46ce36916e98dcddaff58d8238fccadac0643c954b4d8d8ad6c6eefab6" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 13:22:58 crc kubenswrapper[4723]: E0309 13:22:58.805774 4723 prober.go:104] "Probe errored" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-756d785958-qpphh" podUID="d365869f-e896-43d2-80ab-520b5e71beae" containerName="heat-engine" Mar 09 13:22:59 crc kubenswrapper[4723]: I0309 13:22:59.453299 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8","Type":"ContainerStarted","Data":"70350c7787d44e0ca20bc0fe6d5452f8301b494d82f9e235a1e0739913495601"} Mar 09 13:22:59 crc kubenswrapper[4723]: I0309 13:22:59.457483 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a4d4aa80-3125-4750-83a9-af82898eaaf6","Type":"ContainerStarted","Data":"ddf7255f8e0a77da4cfc8f37c1cb56df0784c289a8deb6e73189ad719a7cd4b7"} Mar 09 13:22:59 crc kubenswrapper[4723]: I0309 13:22:59.457525 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a4d4aa80-3125-4750-83a9-af82898eaaf6","Type":"ContainerStarted","Data":"d786a9ad421d9f83ef5521d7991abd73871a0d832543ac4ca76e2c2af8213091"} Mar 09 13:22:59 crc kubenswrapper[4723]: I0309 13:22:59.457790 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 09 13:22:59 crc kubenswrapper[4723]: I0309 13:22:59.494095 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.494076026 podStartE2EDuration="2.494076026s" podCreationTimestamp="2026-03-09 13:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:59.477020956 +0000 UTC m=+1453.491488496" watchObservedRunningTime="2026-03-09 13:22:59.494076026 +0000 UTC m=+1453.508543566" Mar 09 13:23:00 crc kubenswrapper[4723]: I0309 13:23:00.468933 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8","Type":"ContainerStarted","Data":"e780775b380391918c73cf9dc825a15d45262e2e101c41b4abfca4f47185a68a"} Mar 09 13:23:02 crc kubenswrapper[4723]: I0309 13:23:02.495953 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8","Type":"ContainerStarted","Data":"3eeab919fec047cae3361fb461edd6331ef9d102536b843dcdfc143e497df80e"} Mar 09 13:23:02 crc kubenswrapper[4723]: I0309 13:23:02.496523 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:23:02 crc kubenswrapper[4723]: I0309 13:23:02.529992 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.150528327 podStartE2EDuration="6.529975476s" podCreationTimestamp="2026-03-09 13:22:56 +0000 UTC" firstStartedPulling="2026-03-09 13:22:57.575006839 +0000 UTC m=+1451.589474379" lastFinishedPulling="2026-03-09 13:23:01.954453988 +0000 UTC m=+1455.968921528" observedRunningTime="2026-03-09 13:23:02.515286379 +0000 UTC m=+1456.529753919" watchObservedRunningTime="2026-03-09 13:23:02.529975476 +0000 UTC m=+1456.544443016" Mar 09 13:23:06 crc kubenswrapper[4723]: I0309 13:23:06.551906 4723 generic.go:334] "Generic (PLEG): container finished" podID="d365869f-e896-43d2-80ab-520b5e71beae" containerID="8e99ef46ce36916e98dcddaff58d8238fccadac0643c954b4d8d8ad6c6eefab6" exitCode=0 Mar 09 13:23:06 crc 
kubenswrapper[4723]: I0309 13:23:06.552524 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-756d785958-qpphh" event={"ID":"d365869f-e896-43d2-80ab-520b5e71beae","Type":"ContainerDied","Data":"8e99ef46ce36916e98dcddaff58d8238fccadac0643c954b4d8d8ad6c6eefab6"} Mar 09 13:23:06 crc kubenswrapper[4723]: I0309 13:23:06.760642 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:23:06 crc kubenswrapper[4723]: I0309 13:23:06.913129 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89zmp\" (UniqueName: \"kubernetes.io/projected/d365869f-e896-43d2-80ab-520b5e71beae-kube-api-access-89zmp\") pod \"d365869f-e896-43d2-80ab-520b5e71beae\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " Mar 09 13:23:06 crc kubenswrapper[4723]: I0309 13:23:06.913476 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-config-data-custom\") pod \"d365869f-e896-43d2-80ab-520b5e71beae\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " Mar 09 13:23:06 crc kubenswrapper[4723]: I0309 13:23:06.913580 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-config-data\") pod \"d365869f-e896-43d2-80ab-520b5e71beae\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " Mar 09 13:23:06 crc kubenswrapper[4723]: I0309 13:23:06.913620 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-combined-ca-bundle\") pod \"d365869f-e896-43d2-80ab-520b5e71beae\" (UID: \"d365869f-e896-43d2-80ab-520b5e71beae\") " Mar 09 13:23:06 crc kubenswrapper[4723]: I0309 13:23:06.920418 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d365869f-e896-43d2-80ab-520b5e71beae" (UID: "d365869f-e896-43d2-80ab-520b5e71beae"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:06 crc kubenswrapper[4723]: I0309 13:23:06.920503 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d365869f-e896-43d2-80ab-520b5e71beae-kube-api-access-89zmp" (OuterVolumeSpecName: "kube-api-access-89zmp") pod "d365869f-e896-43d2-80ab-520b5e71beae" (UID: "d365869f-e896-43d2-80ab-520b5e71beae"). InnerVolumeSpecName "kube-api-access-89zmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:23:06 crc kubenswrapper[4723]: I0309 13:23:06.957073 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d365869f-e896-43d2-80ab-520b5e71beae" (UID: "d365869f-e896-43d2-80ab-520b5e71beae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:06 crc kubenswrapper[4723]: I0309 13:23:06.986482 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-config-data" (OuterVolumeSpecName: "config-data") pod "d365869f-e896-43d2-80ab-520b5e71beae" (UID: "d365869f-e896-43d2-80ab-520b5e71beae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:07 crc kubenswrapper[4723]: I0309 13:23:07.017554 4723 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:07 crc kubenswrapper[4723]: I0309 13:23:07.017587 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:07 crc kubenswrapper[4723]: I0309 13:23:07.017597 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d365869f-e896-43d2-80ab-520b5e71beae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:07 crc kubenswrapper[4723]: I0309 13:23:07.017605 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89zmp\" (UniqueName: \"kubernetes.io/projected/d365869f-e896-43d2-80ab-520b5e71beae-kube-api-access-89zmp\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:07 crc kubenswrapper[4723]: I0309 13:23:07.565123 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-756d785958-qpphh" event={"ID":"d365869f-e896-43d2-80ab-520b5e71beae","Type":"ContainerDied","Data":"6d7ae58fdb96b3af8f09513dbd473244095b6f1664631e890ca50783a2b90f10"} Mar 09 13:23:07 crc kubenswrapper[4723]: I0309 13:23:07.565185 4723 scope.go:117] "RemoveContainer" containerID="8e99ef46ce36916e98dcddaff58d8238fccadac0643c954b4d8d8ad6c6eefab6" Mar 09 13:23:07 crc kubenswrapper[4723]: I0309 13:23:07.565205 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-756d785958-qpphh" Mar 09 13:23:07 crc kubenswrapper[4723]: I0309 13:23:07.601505 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-756d785958-qpphh"] Mar 09 13:23:07 crc kubenswrapper[4723]: I0309 13:23:07.613906 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-756d785958-qpphh"] Mar 09 13:23:07 crc kubenswrapper[4723]: I0309 13:23:07.848697 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 09 13:23:08 crc kubenswrapper[4723]: I0309 13:23:08.836538 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mfbfj"] Mar 09 13:23:08 crc kubenswrapper[4723]: E0309 13:23:08.837275 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d365869f-e896-43d2-80ab-520b5e71beae" containerName="heat-engine" Mar 09 13:23:08 crc kubenswrapper[4723]: I0309 13:23:08.837288 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="d365869f-e896-43d2-80ab-520b5e71beae" containerName="heat-engine" Mar 09 13:23:08 crc kubenswrapper[4723]: I0309 13:23:08.837488 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="d365869f-e896-43d2-80ab-520b5e71beae" containerName="heat-engine" Mar 09 13:23:08 crc kubenswrapper[4723]: I0309 13:23:08.838238 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:08 crc kubenswrapper[4723]: I0309 13:23:08.840246 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 09 13:23:08 crc kubenswrapper[4723]: I0309 13:23:08.840319 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 09 13:23:08 crc kubenswrapper[4723]: I0309 13:23:08.854534 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mfbfj"] Mar 09 13:23:08 crc kubenswrapper[4723]: I0309 13:23:08.898448 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d365869f-e896-43d2-80ab-520b5e71beae" path="/var/lib/kubelet/pods/d365869f-e896-43d2-80ab-520b5e71beae/volumes" Mar 09 13:23:08 crc kubenswrapper[4723]: I0309 13:23:08.957938 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh8gd\" (UniqueName: \"kubernetes.io/projected/c4e81d20-6788-48e4-a38c-2dda5e6cc206-kube-api-access-gh8gd\") pod \"nova-cell0-cell-mapping-mfbfj\" (UID: \"c4e81d20-6788-48e4-a38c-2dda5e6cc206\") " pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:08 crc kubenswrapper[4723]: I0309 13:23:08.958177 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e81d20-6788-48e4-a38c-2dda5e6cc206-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mfbfj\" (UID: \"c4e81d20-6788-48e4-a38c-2dda5e6cc206\") " pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:08 crc kubenswrapper[4723]: I0309 13:23:08.958270 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e81d20-6788-48e4-a38c-2dda5e6cc206-config-data\") pod \"nova-cell0-cell-mapping-mfbfj\" (UID: \"c4e81d20-6788-48e4-a38c-2dda5e6cc206\") " pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:08 crc kubenswrapper[4723]: I0309 
13:23:08.958535 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4e81d20-6788-48e4-a38c-2dda5e6cc206-scripts\") pod \"nova-cell0-cell-mapping-mfbfj\" (UID: \"c4e81d20-6788-48e4-a38c-2dda5e6cc206\") " pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.060698 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e81d20-6788-48e4-a38c-2dda5e6cc206-config-data\") pod \"nova-cell0-cell-mapping-mfbfj\" (UID: \"c4e81d20-6788-48e4-a38c-2dda5e6cc206\") " pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.060808 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4e81d20-6788-48e4-a38c-2dda5e6cc206-scripts\") pod \"nova-cell0-cell-mapping-mfbfj\" (UID: \"c4e81d20-6788-48e4-a38c-2dda5e6cc206\") " pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.062220 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh8gd\" (UniqueName: \"kubernetes.io/projected/c4e81d20-6788-48e4-a38c-2dda5e6cc206-kube-api-access-gh8gd\") pod \"nova-cell0-cell-mapping-mfbfj\" (UID: \"c4e81d20-6788-48e4-a38c-2dda5e6cc206\") " pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.062608 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e81d20-6788-48e4-a38c-2dda5e6cc206-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mfbfj\" (UID: \"c4e81d20-6788-48e4-a38c-2dda5e6cc206\") " pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.069382 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e81d20-6788-48e4-a38c-2dda5e6cc206-config-data\") pod \"nova-cell0-cell-mapping-mfbfj\" (UID: \"c4e81d20-6788-48e4-a38c-2dda5e6cc206\") " pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.073623 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e81d20-6788-48e4-a38c-2dda5e6cc206-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mfbfj\" (UID: \"c4e81d20-6788-48e4-a38c-2dda5e6cc206\") " pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.075305 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4e81d20-6788-48e4-a38c-2dda5e6cc206-scripts\") pod \"nova-cell0-cell-mapping-mfbfj\" (UID: \"c4e81d20-6788-48e4-a38c-2dda5e6cc206\") " pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.100242 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.102637 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.106684 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh8gd\" (UniqueName: \"kubernetes.io/projected/c4e81d20-6788-48e4-a38c-2dda5e6cc206-kube-api-access-gh8gd\") pod \"nova-cell0-cell-mapping-mfbfj\" (UID: \"c4e81d20-6788-48e4-a38c-2dda5e6cc206\") " pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.110592 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.148191 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.152790 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.157754 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.158746 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.206348 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.236621 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.267345 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/151bf67d-b202-4687-afcc-6247f5c670e9-logs\") pod \"nova-api-0\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " pod="openstack/nova-api-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.267425 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151bf67d-b202-4687-afcc-6247f5c670e9-config-data\") pod \"nova-api-0\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " pod="openstack/nova-api-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.267451 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8105baf-a986-4aba-a114-10e0b997f27c-logs\") pod \"nova-metadata-0\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " pod="openstack/nova-metadata-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.267476 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st24r\" (UniqueName: \"kubernetes.io/projected/f8105baf-a986-4aba-a114-10e0b997f27c-kube-api-access-st24r\") pod \"nova-metadata-0\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " pod="openstack/nova-metadata-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.267513 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8105baf-a986-4aba-a114-10e0b997f27c-config-data\") pod \"nova-metadata-0\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " pod="openstack/nova-metadata-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.267677 4723 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151bf67d-b202-4687-afcc-6247f5c670e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " pod="openstack/nova-api-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.267705 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wcbb\" (UniqueName: \"kubernetes.io/projected/151bf67d-b202-4687-afcc-6247f5c670e9-kube-api-access-7wcbb\") pod \"nova-api-0\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " pod="openstack/nova-api-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.267738 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8105baf-a986-4aba-a114-10e0b997f27c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " pod="openstack/nova-metadata-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.277981 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.279572 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.282235 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.311534 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.375059 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151bf67d-b202-4687-afcc-6247f5c670e9-config-data\") pod \"nova-api-0\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " pod="openstack/nova-api-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.376899 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8105baf-a986-4aba-a114-10e0b997f27c-logs\") pod \"nova-metadata-0\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " pod="openstack/nova-metadata-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.376937 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st24r\" (UniqueName: \"kubernetes.io/projected/f8105baf-a986-4aba-a114-10e0b997f27c-kube-api-access-st24r\") pod \"nova-metadata-0\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " pod="openstack/nova-metadata-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.376989 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8105baf-a986-4aba-a114-10e0b997f27c-config-data\") pod \"nova-metadata-0\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " pod="openstack/nova-metadata-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.377555 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 
13:23:09.377607 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151bf67d-b202-4687-afcc-6247f5c670e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " pod="openstack/nova-api-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.377635 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wcbb\" (UniqueName: \"kubernetes.io/projected/151bf67d-b202-4687-afcc-6247f5c670e9-kube-api-access-7wcbb\") pod \"nova-api-0\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " pod="openstack/nova-api-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.377662 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8105baf-a986-4aba-a114-10e0b997f27c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " pod="openstack/nova-metadata-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.377719 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l9gc\" (UniqueName: \"kubernetes.io/projected/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-kube-api-access-7l9gc\") pod \"nova-scheduler-0\" (UID: \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.377830 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/151bf67d-b202-4687-afcc-6247f5c670e9-logs\") pod \"nova-api-0\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " pod="openstack/nova-api-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.377893 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-config-data\") pod \"nova-scheduler-0\" (UID: \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.379380 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/151bf67d-b202-4687-afcc-6247f5c670e9-logs\") pod \"nova-api-0\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " pod="openstack/nova-api-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.391780 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8105baf-a986-4aba-a114-10e0b997f27c-logs\") pod \"nova-metadata-0\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " pod="openstack/nova-metadata-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.393671 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151bf67d-b202-4687-afcc-6247f5c670e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " pod="openstack/nova-api-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.395386 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8105baf-a986-4aba-a114-10e0b997f27c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " pod="openstack/nova-metadata-0" Mar 09 13:23:09 crc 
kubenswrapper[4723]: I0309 13:23:09.401566 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st24r\" (UniqueName: \"kubernetes.io/projected/f8105baf-a986-4aba-a114-10e0b997f27c-kube-api-access-st24r\") pod \"nova-metadata-0\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " pod="openstack/nova-metadata-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.402506 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151bf67d-b202-4687-afcc-6247f5c670e9-config-data\") pod \"nova-api-0\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " pod="openstack/nova-api-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.403484 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8105baf-a986-4aba-a114-10e0b997f27c-config-data\") pod \"nova-metadata-0\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " pod="openstack/nova-metadata-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.412446 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-mlq2z"] Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.414301 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.450023 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-mlq2z"] Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.457366 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wcbb\" (UniqueName: \"kubernetes.io/projected/151bf67d-b202-4687-afcc-6247f5c670e9-kube-api-access-7wcbb\") pod \"nova-api-0\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " pod="openstack/nova-api-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.478835 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.482039 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.484608 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.484881 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.485235 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l9gc\" (UniqueName: \"kubernetes.io/projected/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-kube-api-access-7l9gc\") pod \"nova-scheduler-0\" (UID: \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.485706 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-config-data\") pod \"nova-scheduler-0\" (UID: \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.491378 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.507049 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-config-data\") pod \"nova-scheduler-0\" (UID: \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.510538 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l9gc\" (UniqueName: \"kubernetes.io/projected/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-kube-api-access-7l9gc\") pod \"nova-scheduler-0\" (UID: \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.516233 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.517227 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.594932 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/435ec576-f731-4d62-9eeb-804d4ae4f52a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"435ec576-f731-4d62-9eeb-804d4ae4f52a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.596172 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.596350 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.596397 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n962m\" (UniqueName: \"kubernetes.io/projected/330db7f4-8928-4101-b28f-e4a129b90227-kube-api-access-n962m\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.596475 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.596622 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpj8x\" (UniqueName: \"kubernetes.io/projected/435ec576-f731-4d62-9eeb-804d4ae4f52a-kube-api-access-mpj8x\") pod \"nova-cell1-novncproxy-0\" (UID: \"435ec576-f731-4d62-9eeb-804d4ae4f52a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.596770 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-config\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.596793 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/435ec576-f731-4d62-9eeb-804d4ae4f52a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"435ec576-f731-4d62-9eeb-804d4ae4f52a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.596942 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.667678 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.685713 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.700112 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpj8x\" (UniqueName: \"kubernetes.io/projected/435ec576-f731-4d62-9eeb-804d4ae4f52a-kube-api-access-mpj8x\") pod \"nova-cell1-novncproxy-0\" (UID: \"435ec576-f731-4d62-9eeb-804d4ae4f52a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.700220 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-config\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.700248 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/435ec576-f731-4d62-9eeb-804d4ae4f52a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"435ec576-f731-4d62-9eeb-804d4ae4f52a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.700317 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.700391 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/435ec576-f731-4d62-9eeb-804d4ae4f52a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"435ec576-f731-4d62-9eeb-804d4ae4f52a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.700466 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.700552 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.700589 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n962m\" (UniqueName: \"kubernetes.io/projected/330db7f4-8928-4101-b28f-e4a129b90227-kube-api-access-n962m\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: 
\"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.700644 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.702794 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.703540 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-config\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.704092 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.710522 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/435ec576-f731-4d62-9eeb-804d4ae4f52a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"435ec576-f731-4d62-9eeb-804d4ae4f52a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.711592 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.711708 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.725428 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/435ec576-f731-4d62-9eeb-804d4ae4f52a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"435ec576-f731-4d62-9eeb-804d4ae4f52a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.732543 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n962m\" (UniqueName: \"kubernetes.io/projected/330db7f4-8928-4101-b28f-e4a129b90227-kube-api-access-n962m\") pod \"dnsmasq-dns-5fbc4d444f-mlq2z\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 
13:23:09.735659 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpj8x\" (UniqueName: \"kubernetes.io/projected/435ec576-f731-4d62-9eeb-804d4ae4f52a-kube-api-access-mpj8x\") pod \"nova-cell1-novncproxy-0\" (UID: \"435ec576-f731-4d62-9eeb-804d4ae4f52a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.859127 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.863144 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:09 crc kubenswrapper[4723]: I0309 13:23:09.876351 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mfbfj"] Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.191529 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.424568 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.425156 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="ceilometer-central-agent" containerID="cri-o://dceaaf84d71a67b7c09a61e856d34a0f4fed9916d235ea6cfdf7d1222ed3bc39" gracePeriod=30 Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.425585 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="proxy-httpd" containerID="cri-o://3eeab919fec047cae3361fb461edd6331ef9d102536b843dcdfc143e497df80e" gracePeriod=30 Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.425631 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="sg-core" containerID="cri-o://e780775b380391918c73cf9dc825a15d45262e2e101c41b4abfca4f47185a68a" gracePeriod=30 Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.425683 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="ceilometer-notification-agent" containerID="cri-o://70350c7787d44e0ca20bc0fe6d5452f8301b494d82f9e235a1e0739913495601" gracePeriod=30 Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.460927 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:23:10 crc kubenswrapper[4723]: W0309 13:23:10.493962 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod151bf67d_b202_4687_afcc_6247f5c670e9.slice/crio-33e9e650e2b7ce914c4eba0c6bef74ca69521386bca465463cedc83e8b69ff75 WatchSource:0}: Error finding container 33e9e650e2b7ce914c4eba0c6bef74ca69521386bca465463cedc83e8b69ff75: Status 404 returned error can't find the container with id 33e9e650e2b7ce914c4eba0c6bef74ca69521386bca465463cedc83e8b69ff75 Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.638061 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4","Type":"ContainerStarted","Data":"9eacfab77052b22c148588f266a26da7a0b98dac07572e911a0c5740f5fe6f56"} 
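Each volume above moves through the same three phases, in order: VerifyControllerAttachedVolume (reconciler_common.go:245), MountVolume started (reconciler_common.go:218), and MountVolume.SetUp succeeded (operation_generator.go:637). The klog timestamps let per-volume mount latency be read off directly; a small Go sketch using the combined-ca-bundle timestamps for nova-scheduler-0 copied verbatim from the entries above:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Timestamps taken from the klog headers above.
        const layout = "15:04:05.000000"
        started, _ := time.Parse(layout, "13:23:09.484881")   // "MountVolume started"
        succeeded, _ := time.Parse(layout, "13:23:09.491378") // "MountVolume.SetUp succeeded"
        fmt.Println("combined-ca-bundle mount latency:", succeeded.Sub(started)) // 6.497ms
    }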
Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.640067 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mfbfj" event={"ID":"c4e81d20-6788-48e4-a38c-2dda5e6cc206","Type":"ContainerStarted","Data":"56eda3a218baef03a2d03f600372743aaa19368443fb06818b6fa4bb26fd9645"}
Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.640098 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mfbfj" event={"ID":"c4e81d20-6788-48e4-a38c-2dda5e6cc206","Type":"ContainerStarted","Data":"2e49d0262e2d850a995f4d1a7bd5c7dc94a65a4000986986bd593d60571b4642"}
Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.689144 4723 generic.go:334] "Generic (PLEG): container finished" podID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerID="e780775b380391918c73cf9dc825a15d45262e2e101c41b4abfca4f47185a68a" exitCode=2
Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.689220 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8","Type":"ContainerDied","Data":"e780775b380391918c73cf9dc825a15d45262e2e101c41b4abfca4f47185a68a"}
Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.703923 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"151bf67d-b202-4687-afcc-6247f5c670e9","Type":"ContainerStarted","Data":"33e9e650e2b7ce914c4eba0c6bef74ca69521386bca465463cedc83e8b69ff75"}
Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.852239 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mfbfj" podStartSLOduration=2.852221507 podStartE2EDuration="2.852221507s" podCreationTimestamp="2026-03-09 13:23:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:10.674169765 +0000 UTC m=+1464.688637305" watchObservedRunningTime="2026-03-09 13:23:10.852221507 +0000 UTC m=+1464.866689047"
Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.909721 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:23:10 crc kubenswrapper[4723]: I0309 13:23:10.913680 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.159237 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-mlq2z"]
Mar 09 13:23:11 crc kubenswrapper[4723]: W0309 13:23:11.185649 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod330db7f4_8928_4101_b28f_e4a129b90227.slice/crio-79edb7c88617db0eeec9de18a881a1f497e595c08408f78fb6f14ad5c189471c WatchSource:0}: Error finding container 79edb7c88617db0eeec9de18a881a1f497e595c08408f78fb6f14ad5c189471c: Status 404 returned error can't find the container with id 79edb7c88617db0eeec9de18a881a1f497e595c08408f78fb6f14ad5c189471c
Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.397853 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hx7mw"]
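The pod_startup_latency_tracker entry above reports podStartE2EDuration="2.852221507s" for nova-cell0-cell-mapping-mfbfj; the numbers line up with watchObservedRunningTime minus podCreationTimestamp, and the zero-valued firstStartedPulling/lastFinishedPulling timestamps (0001-01-01) are consistent with no image pull having been needed. A Go sketch reproducing the figure from the values in the entry (the reading of the zero pull timestamps is an assumption):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, _ := time.Parse(layout, "2026-03-09 13:23:08 +0000 UTC")
        running, _ := time.Parse(layout, "2026-03-09 13:23:10.852221507 +0000 UTC")
        // With zero pull timestamps, the SLO duration equals the end-to-end duration.
        fmt.Println(running.Sub(created)) // 2.852221507s, matching podStartE2EDuration
    }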
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.411986 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.412265 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.413719 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hx7mw"] Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.536363 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-scripts\") pod \"nova-cell1-conductor-db-sync-hx7mw\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.536408 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxkvl\" (UniqueName: \"kubernetes.io/projected/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-kube-api-access-nxkvl\") pod \"nova-cell1-conductor-db-sync-hx7mw\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.536559 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hx7mw\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.536618 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-config-data\") pod \"nova-cell1-conductor-db-sync-hx7mw\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.642690 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hx7mw\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.642787 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-config-data\") pod \"nova-cell1-conductor-db-sync-hx7mw\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.643059 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-scripts\") pod \"nova-cell1-conductor-db-sync-hx7mw\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.643109 4723 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nxkvl\" (UniqueName: \"kubernetes.io/projected/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-kube-api-access-nxkvl\") pod \"nova-cell1-conductor-db-sync-hx7mw\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.653085 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hx7mw\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.664921 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-scripts\") pod \"nova-cell1-conductor-db-sync-hx7mw\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.667431 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-config-data\") pod \"nova-cell1-conductor-db-sync-hx7mw\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.669969 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxkvl\" (UniqueName: \"kubernetes.io/projected/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-kube-api-access-nxkvl\") pod \"nova-cell1-conductor-db-sync-hx7mw\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.737373 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.741739 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8105baf-a986-4aba-a114-10e0b997f27c","Type":"ContainerStarted","Data":"d7d544c231b40455d9ac318d6bd896404f076d2ad3c0dc74aae2d1075887e655"} Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.753322 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"435ec576-f731-4d62-9eeb-804d4ae4f52a","Type":"ContainerStarted","Data":"223b764c8cfaddd4cf5cc631cf01997c2aeeb8b295280359c9669a0b5f1a7ea2"} Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.759645 4723 generic.go:334] "Generic (PLEG): container finished" podID="330db7f4-8928-4101-b28f-e4a129b90227" containerID="3d76385ca8602290fdacc0d7567b698ad2e3dc17ea6a31eb0361bc3868c00851" exitCode=0 Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.760763 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" event={"ID":"330db7f4-8928-4101-b28f-e4a129b90227","Type":"ContainerDied","Data":"3d76385ca8602290fdacc0d7567b698ad2e3dc17ea6a31eb0361bc3868c00851"} Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.760795 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" event={"ID":"330db7f4-8928-4101-b28f-e4a129b90227","Type":"ContainerStarted","Data":"79edb7c88617db0eeec9de18a881a1f497e595c08408f78fb6f14ad5c189471c"} Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.790730 4723 generic.go:334] "Generic (PLEG): container finished" podID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerID="3eeab919fec047cae3361fb461edd6331ef9d102536b843dcdfc143e497df80e" exitCode=0 Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.790770 4723 generic.go:334] "Generic (PLEG): container finished" podID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerID="70350c7787d44e0ca20bc0fe6d5452f8301b494d82f9e235a1e0739913495601" exitCode=0 Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.791538 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8","Type":"ContainerDied","Data":"3eeab919fec047cae3361fb461edd6331ef9d102536b843dcdfc143e497df80e"} Mar 09 13:23:11 crc kubenswrapper[4723]: I0309 13:23:11.791588 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8","Type":"ContainerDied","Data":"70350c7787d44e0ca20bc0fe6d5452f8301b494d82f9e235a1e0739913495601"} Mar 09 13:23:12 crc kubenswrapper[4723]: I0309 13:23:12.457468 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hx7mw"] Mar 09 13:23:12 crc kubenswrapper[4723]: W0309 13:23:12.467763 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f23bcc5_001d_4f5e_a28f_ed00ab283c01.slice/crio-a75b429874e05c3595dbba592b12efff830d878e493f4ef2ac34179a10b11824 WatchSource:0}: Error finding container a75b429874e05c3595dbba592b12efff830d878e493f4ef2ac34179a10b11824: Status 404 returned error can't find the container with id a75b429874e05c3595dbba592b12efff830d878e493f4ef2ac34179a10b11824 Mar 09 13:23:12 crc kubenswrapper[4723]: I0309 13:23:12.820429 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hx7mw" 
event={"ID":"1f23bcc5-001d-4f5e-a28f-ed00ab283c01","Type":"ContainerStarted","Data":"a75b429874e05c3595dbba592b12efff830d878e493f4ef2ac34179a10b11824"} Mar 09 13:23:12 crc kubenswrapper[4723]: I0309 13:23:12.824953 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" event={"ID":"330db7f4-8928-4101-b28f-e4a129b90227","Type":"ContainerStarted","Data":"7942d56e2ba234419f069386784848fa3fa7a8bb97e4e04d5dbb94e9322d0277"} Mar 09 13:23:12 crc kubenswrapper[4723]: I0309 13:23:12.826479 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:23:12 crc kubenswrapper[4723]: I0309 13:23:12.861244 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" podStartSLOduration=3.861210544 podStartE2EDuration="3.861210544s" podCreationTimestamp="2026-03-09 13:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:12.852245037 +0000 UTC m=+1466.866712587" watchObservedRunningTime="2026-03-09 13:23:12.861210544 +0000 UTC m=+1466.875678084" Mar 09 13:23:12 crc kubenswrapper[4723]: I0309 13:23:12.959086 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:23:12 crc kubenswrapper[4723]: I0309 13:23:12.976158 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:23:13 crc kubenswrapper[4723]: I0309 13:23:13.837820 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hx7mw" event={"ID":"1f23bcc5-001d-4f5e-a28f-ed00ab283c01","Type":"ContainerStarted","Data":"7a117ee67ee75bb4a7bae9848fb8f37e812495e259379c7b120d3f64f1648ac0"} Mar 09 13:23:13 crc kubenswrapper[4723]: I0309 13:23:13.859749 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-hx7mw" podStartSLOduration=2.85973027 podStartE2EDuration="2.85973027s" podCreationTimestamp="2026-03-09 13:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:13.851077082 +0000 UTC m=+1467.865544622" watchObservedRunningTime="2026-03-09 13:23:13.85973027 +0000 UTC m=+1467.874197800" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.624382 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-5jgl8"] Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.632382 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-5jgl8" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.660734 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-5jgl8"] Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.718887 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf73308-8e25-4470-a1d4-a2f9d59b1cd6-operator-scripts\") pod \"aodh-db-create-5jgl8\" (UID: \"eaf73308-8e25-4470-a1d4-a2f9d59b1cd6\") " pod="openstack/aodh-db-create-5jgl8" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.718977 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8pgk\" (UniqueName: \"kubernetes.io/projected/eaf73308-8e25-4470-a1d4-a2f9d59b1cd6-kube-api-access-x8pgk\") pod \"aodh-db-create-5jgl8\" (UID: \"eaf73308-8e25-4470-a1d4-a2f9d59b1cd6\") " pod="openstack/aodh-db-create-5jgl8" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.721129 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-8808-account-create-update-jg69c"] Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.722652 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-8808-account-create-update-jg69c" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.727292 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.756340 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-8808-account-create-update-jg69c"] Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.821570 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8pgk\" (UniqueName: \"kubernetes.io/projected/eaf73308-8e25-4470-a1d4-a2f9d59b1cd6-kube-api-access-x8pgk\") pod \"aodh-db-create-5jgl8\" (UID: \"eaf73308-8e25-4470-a1d4-a2f9d59b1cd6\") " pod="openstack/aodh-db-create-5jgl8" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.821661 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49c81a50-9366-425a-b957-330022266b2d-operator-scripts\") pod \"aodh-8808-account-create-update-jg69c\" (UID: \"49c81a50-9366-425a-b957-330022266b2d\") " pod="openstack/aodh-8808-account-create-update-jg69c" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.821688 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mvlc\" (UniqueName: \"kubernetes.io/projected/49c81a50-9366-425a-b957-330022266b2d-kube-api-access-2mvlc\") pod \"aodh-8808-account-create-update-jg69c\" (UID: \"49c81a50-9366-425a-b957-330022266b2d\") " pod="openstack/aodh-8808-account-create-update-jg69c" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.822052 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf73308-8e25-4470-a1d4-a2f9d59b1cd6-operator-scripts\") pod \"aodh-db-create-5jgl8\" (UID: \"eaf73308-8e25-4470-a1d4-a2f9d59b1cd6\") " pod="openstack/aodh-db-create-5jgl8" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.822755 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/eaf73308-8e25-4470-a1d4-a2f9d59b1cd6-operator-scripts\") pod \"aodh-db-create-5jgl8\" (UID: \"eaf73308-8e25-4470-a1d4-a2f9d59b1cd6\") " pod="openstack/aodh-db-create-5jgl8" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.840314 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8pgk\" (UniqueName: \"kubernetes.io/projected/eaf73308-8e25-4470-a1d4-a2f9d59b1cd6-kube-api-access-x8pgk\") pod \"aodh-db-create-5jgl8\" (UID: \"eaf73308-8e25-4470-a1d4-a2f9d59b1cd6\") " pod="openstack/aodh-db-create-5jgl8" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.853073 4723 generic.go:334] "Generic (PLEG): container finished" podID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerID="dceaaf84d71a67b7c09a61e856d34a0f4fed9916d235ea6cfdf7d1222ed3bc39" exitCode=0 Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.854008 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8","Type":"ContainerDied","Data":"dceaaf84d71a67b7c09a61e856d34a0f4fed9916d235ea6cfdf7d1222ed3bc39"} Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.925912 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49c81a50-9366-425a-b957-330022266b2d-operator-scripts\") pod \"aodh-8808-account-create-update-jg69c\" (UID: \"49c81a50-9366-425a-b957-330022266b2d\") " pod="openstack/aodh-8808-account-create-update-jg69c" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.925962 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mvlc\" (UniqueName: \"kubernetes.io/projected/49c81a50-9366-425a-b957-330022266b2d-kube-api-access-2mvlc\") pod \"aodh-8808-account-create-update-jg69c\" (UID: \"49c81a50-9366-425a-b957-330022266b2d\") " pod="openstack/aodh-8808-account-create-update-jg69c" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.931902 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49c81a50-9366-425a-b957-330022266b2d-operator-scripts\") pod \"aodh-8808-account-create-update-jg69c\" (UID: \"49c81a50-9366-425a-b957-330022266b2d\") " pod="openstack/aodh-8808-account-create-update-jg69c" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.945634 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mvlc\" (UniqueName: \"kubernetes.io/projected/49c81a50-9366-425a-b957-330022266b2d-kube-api-access-2mvlc\") pod \"aodh-8808-account-create-update-jg69c\" (UID: \"49c81a50-9366-425a-b957-330022266b2d\") " pod="openstack/aodh-8808-account-create-update-jg69c" Mar 09 13:23:14 crc kubenswrapper[4723]: I0309 13:23:14.984800 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-5jgl8" Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.044594 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-8808-account-create-update-jg69c" Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.081949 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.168307 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.168584 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="a4d4aa80-3125-4750-83a9-af82898eaaf6" containerName="nova-cell0-conductor-conductor" containerID="cri-o://ddf7255f8e0a77da4cfc8f37c1cb56df0784c289a8deb6e73189ad719a7cd4b7" gracePeriod=30 Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.200174 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.604140 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.679185 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-run-httpd\") pod \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.679507 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-config-data\") pod \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.679564 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfcll\" (UniqueName: \"kubernetes.io/projected/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-kube-api-access-tfcll\") pod \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.679657 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-combined-ca-bundle\") pod \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.679843 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-sg-core-conf-yaml\") pod \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.679903 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-log-httpd\") pod \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.679956 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-scripts\") pod \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\" (UID: \"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8\") " Mar 09 13:23:15 crc 
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.684763 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" (UID: "1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.687160 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" (UID: "1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.728946 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-kube-api-access-tfcll" (OuterVolumeSpecName: "kube-api-access-tfcll") pod "1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" (UID: "1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8"). InnerVolumeSpecName "kube-api-access-tfcll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.751599 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-scripts" (OuterVolumeSpecName: "scripts") pod "1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" (UID: "1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.781539 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-5jgl8"]
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.784452 4723 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.784483 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.784493 4723 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.784502 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfcll\" (UniqueName: \"kubernetes.io/projected/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-kube-api-access-tfcll\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.873611 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-5jgl8" event={"ID":"eaf73308-8e25-4470-a1d4-a2f9d59b1cd6","Type":"ContainerStarted","Data":"2f935fff866c4b831ea89f4c75f6baa18df0593893c14e03d5685a12b6dba027"}
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.886193 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="435ec576-f731-4d62-9eeb-804d4ae4f52a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c3ce48aa40ab0a7d443453576bc9289d614a2e1a39312338de9fa7df423a9349" gracePeriod=30
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.886251 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"435ec576-f731-4d62-9eeb-804d4ae4f52a","Type":"ContainerStarted","Data":"c3ce48aa40ab0a7d443453576bc9289d614a2e1a39312338de9fa7df423a9349"}
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.894815 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8","Type":"ContainerDied","Data":"06d8f92ac2d7cf3c85dcd1f111b13008adf09650b861e6362031a089f1cee78c"}
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.894896 4723 scope.go:117] "RemoveContainer" containerID="3eeab919fec047cae3361fb461edd6331ef9d102536b843dcdfc143e497df80e"
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.894981 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:23:15 crc kubenswrapper[4723]: I0309 13:23:15.939702 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.746838306 podStartE2EDuration="6.939684448s" podCreationTimestamp="2026-03-09 13:23:09 +0000 UTC" firstStartedPulling="2026-03-09 13:23:10.881195201 +0000 UTC m=+1464.895662741" lastFinishedPulling="2026-03-09 13:23:15.074041343 +0000 UTC m=+1469.088508883" observedRunningTime="2026-03-09 13:23:15.905966319 +0000 UTC m=+1469.920433879" watchObservedRunningTime="2026-03-09 13:23:15.939684448 +0000 UTC m=+1469.954151988"
Mar 09 13:23:16 crc kubenswrapper[4723]: W0309 13:23:16.062419 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49c81a50_9366_425a_b957_330022266b2d.slice/crio-8431860bd55aa066753f2a4d5c2a869b692cf1cbdb65112fbfe98bbd1033b167 WatchSource:0}: Error finding container 8431860bd55aa066753f2a4d5c2a869b692cf1cbdb65112fbfe98bbd1033b167: Status 404 returned error can't find the container with id 8431860bd55aa066753f2a4d5c2a869b692cf1cbdb65112fbfe98bbd1033b167
Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.074065 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-8808-account-create-update-jg69c"]
Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.131691 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" (UID: "1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.203138 4723 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.276749 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" (UID: "1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.308474 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.308514 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.439618 4723 scope.go:117] "RemoveContainer" containerID="e780775b380391918c73cf9dc825a15d45262e2e101c41b4abfca4f47185a68a" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.525878 4723 scope.go:117] "RemoveContainer" containerID="70350c7787d44e0ca20bc0fe6d5452f8301b494d82f9e235a1e0739913495601" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.589557 4723 scope.go:117] "RemoveContainer" containerID="dceaaf84d71a67b7c09a61e856d34a0f4fed9916d235ea6cfdf7d1222ed3bc39" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.608454 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.652046 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.670464 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:16 crc kubenswrapper[4723]: E0309 13:23:16.671280 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="ceilometer-central-agent" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.671302 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="ceilometer-central-agent" Mar 09 13:23:16 crc kubenswrapper[4723]: E0309 13:23:16.671320 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="ceilometer-notification-agent" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.671326 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="ceilometer-notification-agent" Mar 09 13:23:16 crc kubenswrapper[4723]: E0309 13:23:16.671350 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="sg-core" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.671357 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="sg-core" Mar 09 13:23:16 crc kubenswrapper[4723]: E0309 13:23:16.671376 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="proxy-httpd" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.671383 4723 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="proxy-httpd" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.671646 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="sg-core" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.671663 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="ceilometer-notification-agent" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.671683 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="ceilometer-central-agent" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.671700 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" containerName="proxy-httpd" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.674237 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.677320 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.681048 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.690280 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.731412 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hwk4\" (UniqueName: \"kubernetes.io/projected/0ec76144-c358-4a57-bf5e-b3fd37531ae7-kube-api-access-7hwk4\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.731743 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec76144-c358-4a57-bf5e-b3fd37531ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.731780 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.731834 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-scripts\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.731939 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.731957 4723 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-config-data\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.731987 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec76144-c358-4a57-bf5e-b3fd37531ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.833845 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec76144-c358-4a57-bf5e-b3fd37531ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.833914 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.833983 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-scripts\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.834071 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.834097 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-config-data\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.834140 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec76144-c358-4a57-bf5e-b3fd37531ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.834177 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hwk4\" (UniqueName: \"kubernetes.io/projected/0ec76144-c358-4a57-bf5e-b3fd37531ae7-kube-api-access-7hwk4\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.835681 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec76144-c358-4a57-bf5e-b3fd37531ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.836106 4723 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec76144-c358-4a57-bf5e-b3fd37531ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.838433 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.840669 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-config-data\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.844552 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-scripts\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.861314 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.861717 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hwk4\" (UniqueName: \"kubernetes.io/projected/0ec76144-c358-4a57-bf5e-b3fd37531ae7-kube-api-access-7hwk4\") pod \"ceilometer-0\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " pod="openstack/ceilometer-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.912793 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8" path="/var/lib/kubelet/pods/1ae3c0a4-1680-4d2b-883c-64a4d4ab21e8/volumes" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.924197 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"151bf67d-b202-4687-afcc-6247f5c670e9","Type":"ContainerStarted","Data":"9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180"} Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.940144 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8105baf-a986-4aba-a114-10e0b997f27c","Type":"ContainerStarted","Data":"4e920ad0222517ce6f2b8194793fb7169ed22f3ef4557f83f51a2df34eb61182"} Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.940186 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8105baf-a986-4aba-a114-10e0b997f27c","Type":"ContainerStarted","Data":"5cc5b95b52319cf3fad4e4d63d8270110843673ba6b1cc1c712af8d9c5935160"} Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.940313 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f8105baf-a986-4aba-a114-10e0b997f27c" containerName="nova-metadata-log" containerID="cri-o://5cc5b95b52319cf3fad4e4d63d8270110843673ba6b1cc1c712af8d9c5935160" gracePeriod=30 Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.940393 4723 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f8105baf-a986-4aba-a114-10e0b997f27c" containerName="nova-metadata-metadata" containerID="cri-o://4e920ad0222517ce6f2b8194793fb7169ed22f3ef4557f83f51a2df34eb61182" gracePeriod=30 Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.952020 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-5jgl8" event={"ID":"eaf73308-8e25-4470-a1d4-a2f9d59b1cd6","Type":"ContainerStarted","Data":"b684cb0dc2c174e2348d5059f387bd692515f3164a959818841846b2a4b8a46e"} Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.958131 4723 generic.go:334] "Generic (PLEG): container finished" podID="a4d4aa80-3125-4750-83a9-af82898eaaf6" containerID="ddf7255f8e0a77da4cfc8f37c1cb56df0784c289a8deb6e73189ad719a7cd4b7" exitCode=0 Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.958450 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a4d4aa80-3125-4750-83a9-af82898eaaf6","Type":"ContainerDied","Data":"ddf7255f8e0a77da4cfc8f37c1cb56df0784c289a8deb6e73189ad719a7cd4b7"} Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.958487 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a4d4aa80-3125-4750-83a9-af82898eaaf6","Type":"ContainerDied","Data":"d786a9ad421d9f83ef5521d7991abd73871a0d832543ac4ca76e2c2af8213091"} Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.958502 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d786a9ad421d9f83ef5521d7991abd73871a0d832543ac4ca76e2c2af8213091" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.964883 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-8808-account-create-update-jg69c" event={"ID":"49c81a50-9366-425a-b957-330022266b2d","Type":"ContainerStarted","Data":"364ddbc084e24139ed96474b977fea3b6f71ac47d2a5348d7ea9b387eda7ee73"} Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.964944 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-8808-account-create-update-jg69c" event={"ID":"49c81a50-9366-425a-b957-330022266b2d","Type":"ContainerStarted","Data":"8431860bd55aa066753f2a4d5c2a869b692cf1cbdb65112fbfe98bbd1033b167"} Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.973195 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.973937 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4","Type":"ContainerStarted","Data":"d76142d3b9987ec272ef4164909218ec50cdf18d8bd433d5347585e908c93c1b"} Mar 09 13:23:16 crc kubenswrapper[4723]: I0309 13:23:16.974104 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5014a76c-6fd4-44e5-9151-f07dcfb5f1d4" containerName="nova-scheduler-scheduler" containerID="cri-o://d76142d3b9987ec272ef4164909218ec50cdf18d8bd433d5347585e908c93c1b" gracePeriod=30 Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.033311 4723 util.go:30] "No sandbox for pod can be found. 
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.042646 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d4aa80-3125-4750-83a9-af82898eaaf6-config-data\") pod \"a4d4aa80-3125-4750-83a9-af82898eaaf6\" (UID: \"a4d4aa80-3125-4750-83a9-af82898eaaf6\") "
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.042696 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krl92\" (UniqueName: \"kubernetes.io/projected/a4d4aa80-3125-4750-83a9-af82898eaaf6-kube-api-access-krl92\") pod \"a4d4aa80-3125-4750-83a9-af82898eaaf6\" (UID: \"a4d4aa80-3125-4750-83a9-af82898eaaf6\") "
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.042827 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d4aa80-3125-4750-83a9-af82898eaaf6-combined-ca-bundle\") pod \"a4d4aa80-3125-4750-83a9-af82898eaaf6\" (UID: \"a4d4aa80-3125-4750-83a9-af82898eaaf6\") "
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.050968 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d4aa80-3125-4750-83a9-af82898eaaf6-kube-api-access-krl92" (OuterVolumeSpecName: "kube-api-access-krl92") pod "a4d4aa80-3125-4750-83a9-af82898eaaf6" (UID: "a4d4aa80-3125-4750-83a9-af82898eaaf6"). InnerVolumeSpecName "kube-api-access-krl92". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.084333 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.822903034 podStartE2EDuration="8.084308693s" podCreationTimestamp="2026-03-09 13:23:09 +0000 UTC" firstStartedPulling="2026-03-09 13:23:10.861488292 +0000 UTC m=+1464.875955832" lastFinishedPulling="2026-03-09 13:23:15.122893951 +0000 UTC m=+1469.137361491" observedRunningTime="2026-03-09 13:23:17.013272771 +0000 UTC m=+1471.027740321" watchObservedRunningTime="2026-03-09 13:23:17.084308693 +0000 UTC m=+1471.098776233"
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.146660 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krl92\" (UniqueName: \"kubernetes.io/projected/a4d4aa80-3125-4750-83a9-af82898eaaf6-kube-api-access-krl92\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.159535 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d4aa80-3125-4750-83a9-af82898eaaf6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4d4aa80-3125-4750-83a9-af82898eaaf6" (UID: "a4d4aa80-3125-4750-83a9-af82898eaaf6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.185991 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d4aa80-3125-4750-83a9-af82898eaaf6-config-data" (OuterVolumeSpecName: "config-data") pod "a4d4aa80-3125-4750-83a9-af82898eaaf6" (UID: "a4d4aa80-3125-4750-83a9-af82898eaaf6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.234971 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-8808-account-create-update-jg69c" podStartSLOduration=3.234951313 podStartE2EDuration="3.234951313s" podCreationTimestamp="2026-03-09 13:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:17.034006737 +0000 UTC m=+1471.048474277" watchObservedRunningTime="2026-03-09 13:23:17.234951313 +0000 UTC m=+1471.249418853"
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.249519 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d4aa80-3125-4750-83a9-af82898eaaf6-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.249541 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d4aa80-3125-4750-83a9-af82898eaaf6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.256459 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.465006922 podStartE2EDuration="8.25644021s" podCreationTimestamp="2026-03-09 13:23:09 +0000 UTC" firstStartedPulling="2026-03-09 13:23:10.240714551 +0000 UTC m=+1464.255182091" lastFinishedPulling="2026-03-09 13:23:15.032147839 +0000 UTC m=+1469.046615379" observedRunningTime="2026-03-09 13:23:17.064929122 +0000 UTC m=+1471.079396682" watchObservedRunningTime="2026-03-09 13:23:17.25644021 +0000 UTC m=+1471.270907750"
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.267307 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-5jgl8" podStartSLOduration=3.267284496 podStartE2EDuration="3.267284496s" podCreationTimestamp="2026-03-09 13:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:17.08493606 +0000 UTC m=+1471.099403620" watchObservedRunningTime="2026-03-09 13:23:17.267284496 +0000 UTC m=+1471.281752036"
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.671412 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:23:17 crc kubenswrapper[4723]: I0309 13:23:17.996382 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec76144-c358-4a57-bf5e-b3fd37531ae7","Type":"ContainerStarted","Data":"e63ddc8565dfd2cec19d6aac02bbdefd0a04dad15c43bdc7d17003322a628264"}
Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.006126 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"151bf67d-b202-4687-afcc-6247f5c670e9","Type":"ContainerStarted","Data":"e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83"}
Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.006317 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="151bf67d-b202-4687-afcc-6247f5c670e9" containerName="nova-api-log" containerID="cri-o://9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180" gracePeriod=30
Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.006965 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="151bf67d-b202-4687-afcc-6247f5c670e9" containerName="nova-api-api" containerID="cri-o://e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83" gracePeriod=30
period" pod="openstack/nova-api-0" podUID="151bf67d-b202-4687-afcc-6247f5c670e9" containerName="nova-api-api" containerID="cri-o://e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83" gracePeriod=30 Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.016917 4723 generic.go:334] "Generic (PLEG): container finished" podID="f8105baf-a986-4aba-a114-10e0b997f27c" containerID="5cc5b95b52319cf3fad4e4d63d8270110843673ba6b1cc1c712af8d9c5935160" exitCode=143 Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.017017 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8105baf-a986-4aba-a114-10e0b997f27c","Type":"ContainerDied","Data":"5cc5b95b52319cf3fad4e4d63d8270110843673ba6b1cc1c712af8d9c5935160"} Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.019275 4723 generic.go:334] "Generic (PLEG): container finished" podID="eaf73308-8e25-4470-a1d4-a2f9d59b1cd6" containerID="b684cb0dc2c174e2348d5059f387bd692515f3164a959818841846b2a4b8a46e" exitCode=0 Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.019351 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-5jgl8" event={"ID":"eaf73308-8e25-4470-a1d4-a2f9d59b1cd6","Type":"ContainerDied","Data":"b684cb0dc2c174e2348d5059f387bd692515f3164a959818841846b2a4b8a46e"} Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.028314 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.332605707 podStartE2EDuration="9.028293412s" podCreationTimestamp="2026-03-09 13:23:09 +0000 UTC" firstStartedPulling="2026-03-09 13:23:10.513931852 +0000 UTC m=+1464.528399392" lastFinishedPulling="2026-03-09 13:23:15.209619557 +0000 UTC m=+1469.224087097" observedRunningTime="2026-03-09 13:23:18.021086992 +0000 UTC m=+1472.035554542" watchObservedRunningTime="2026-03-09 13:23:18.028293412 +0000 UTC m=+1472.042760952" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.029246 4723 generic.go:334] "Generic (PLEG): container finished" podID="49c81a50-9366-425a-b957-330022266b2d" containerID="364ddbc084e24139ed96474b977fea3b6f71ac47d2a5348d7ea9b387eda7ee73" exitCode=0 Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.029326 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.029566 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-8808-account-create-update-jg69c" event={"ID":"49c81a50-9366-425a-b957-330022266b2d","Type":"ContainerDied","Data":"364ddbc084e24139ed96474b977fea3b6f71ac47d2a5348d7ea9b387eda7ee73"} Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.162997 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.179250 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.199814 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 13:23:18 crc kubenswrapper[4723]: E0309 13:23:18.200333 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d4aa80-3125-4750-83a9-af82898eaaf6" containerName="nova-cell0-conductor-conductor" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.200351 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d4aa80-3125-4750-83a9-af82898eaaf6" containerName="nova-cell0-conductor-conductor" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.200627 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d4aa80-3125-4750-83a9-af82898eaaf6" containerName="nova-cell0-conductor-conductor" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.204236 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.209397 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.229931 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.293762 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d33e56a-fbb0-4ac1-a383-9ad42906d556-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2d33e56a-fbb0-4ac1-a383-9ad42906d556\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.293902 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcbk7\" (UniqueName: \"kubernetes.io/projected/2d33e56a-fbb0-4ac1-a383-9ad42906d556-kube-api-access-xcbk7\") pod \"nova-cell0-conductor-0\" (UID: \"2d33e56a-fbb0-4ac1-a383-9ad42906d556\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.294003 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d33e56a-fbb0-4ac1-a383-9ad42906d556-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2d33e56a-fbb0-4ac1-a383-9ad42906d556\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.396170 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcbk7\" (UniqueName: \"kubernetes.io/projected/2d33e56a-fbb0-4ac1-a383-9ad42906d556-kube-api-access-xcbk7\") pod \"nova-cell0-conductor-0\" (UID: 
\"2d33e56a-fbb0-4ac1-a383-9ad42906d556\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.396313 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d33e56a-fbb0-4ac1-a383-9ad42906d556-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2d33e56a-fbb0-4ac1-a383-9ad42906d556\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.396514 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d33e56a-fbb0-4ac1-a383-9ad42906d556-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2d33e56a-fbb0-4ac1-a383-9ad42906d556\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.401127 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d33e56a-fbb0-4ac1-a383-9ad42906d556-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2d33e56a-fbb0-4ac1-a383-9ad42906d556\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.401248 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d33e56a-fbb0-4ac1-a383-9ad42906d556-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2d33e56a-fbb0-4ac1-a383-9ad42906d556\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.412778 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcbk7\" (UniqueName: \"kubernetes.io/projected/2d33e56a-fbb0-4ac1-a383-9ad42906d556-kube-api-access-xcbk7\") pod \"nova-cell0-conductor-0\" (UID: \"2d33e56a-fbb0-4ac1-a383-9ad42906d556\") " pod="openstack/nova-cell0-conductor-0" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.526458 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.768800 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.839850 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/151bf67d-b202-4687-afcc-6247f5c670e9-logs\") pod \"151bf67d-b202-4687-afcc-6247f5c670e9\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.840236 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151bf67d-b202-4687-afcc-6247f5c670e9-config-data\") pod \"151bf67d-b202-4687-afcc-6247f5c670e9\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.840295 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wcbb\" (UniqueName: \"kubernetes.io/projected/151bf67d-b202-4687-afcc-6247f5c670e9-kube-api-access-7wcbb\") pod \"151bf67d-b202-4687-afcc-6247f5c670e9\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.840347 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151bf67d-b202-4687-afcc-6247f5c670e9-combined-ca-bundle\") pod \"151bf67d-b202-4687-afcc-6247f5c670e9\" (UID: \"151bf67d-b202-4687-afcc-6247f5c670e9\") " Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.840974 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/151bf67d-b202-4687-afcc-6247f5c670e9-logs" (OuterVolumeSpecName: "logs") pod "151bf67d-b202-4687-afcc-6247f5c670e9" (UID: "151bf67d-b202-4687-afcc-6247f5c670e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.847099 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/151bf67d-b202-4687-afcc-6247f5c670e9-kube-api-access-7wcbb" (OuterVolumeSpecName: "kube-api-access-7wcbb") pod "151bf67d-b202-4687-afcc-6247f5c670e9" (UID: "151bf67d-b202-4687-afcc-6247f5c670e9"). InnerVolumeSpecName "kube-api-access-7wcbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.909245 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4d4aa80-3125-4750-83a9-af82898eaaf6" path="/var/lib/kubelet/pods/a4d4aa80-3125-4750-83a9-af82898eaaf6/volumes" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.920395 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151bf67d-b202-4687-afcc-6247f5c670e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "151bf67d-b202-4687-afcc-6247f5c670e9" (UID: "151bf67d-b202-4687-afcc-6247f5c670e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.924673 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151bf67d-b202-4687-afcc-6247f5c670e9-config-data" (OuterVolumeSpecName: "config-data") pod "151bf67d-b202-4687-afcc-6247f5c670e9" (UID: "151bf67d-b202-4687-afcc-6247f5c670e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.944193 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wcbb\" (UniqueName: \"kubernetes.io/projected/151bf67d-b202-4687-afcc-6247f5c670e9-kube-api-access-7wcbb\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.944268 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151bf67d-b202-4687-afcc-6247f5c670e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.944280 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/151bf67d-b202-4687-afcc-6247f5c670e9-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:18 crc kubenswrapper[4723]: I0309 13:23:18.944289 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151bf67d-b202-4687-afcc-6247f5c670e9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.048279 4723 generic.go:334] "Generic (PLEG): container finished" podID="151bf67d-b202-4687-afcc-6247f5c670e9" containerID="e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83" exitCode=0 Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.048317 4723 generic.go:334] "Generic (PLEG): container finished" podID="151bf67d-b202-4687-afcc-6247f5c670e9" containerID="9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180" exitCode=143 Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.048367 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"151bf67d-b202-4687-afcc-6247f5c670e9","Type":"ContainerDied","Data":"e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83"} Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.048396 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"151bf67d-b202-4687-afcc-6247f5c670e9","Type":"ContainerDied","Data":"9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180"} Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.048410 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"151bf67d-b202-4687-afcc-6247f5c670e9","Type":"ContainerDied","Data":"33e9e650e2b7ce914c4eba0c6bef74ca69521386bca465463cedc83e8b69ff75"} Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.048428 4723 scope.go:117] "RemoveContainer" containerID="e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.048578 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.052485 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec76144-c358-4a57-bf5e-b3fd37531ae7","Type":"ContainerStarted","Data":"85df1e22230c029105d30e0b99b6d8fd55c16180865e5e0cbec9110f3f0625a9"} Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.095850 4723 scope.go:117] "RemoveContainer" containerID="9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.118010 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.138917 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.151833 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 13:23:19 crc kubenswrapper[4723]: E0309 13:23:19.152444 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151bf67d-b202-4687-afcc-6247f5c670e9" containerName="nova-api-api" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.152457 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="151bf67d-b202-4687-afcc-6247f5c670e9" containerName="nova-api-api" Mar 09 13:23:19 crc kubenswrapper[4723]: E0309 13:23:19.152466 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151bf67d-b202-4687-afcc-6247f5c670e9" containerName="nova-api-log" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.152471 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="151bf67d-b202-4687-afcc-6247f5c670e9" containerName="nova-api-log" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.152685 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="151bf67d-b202-4687-afcc-6247f5c670e9" containerName="nova-api-api" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.152702 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="151bf67d-b202-4687-afcc-6247f5c670e9" containerName="nova-api-log" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.155603 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.162185 4723 scope.go:117] "RemoveContainer" containerID="e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.162566 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.164380 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:23:19 crc kubenswrapper[4723]: E0309 13:23:19.164666 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83\": container with ID starting with e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83 not found: ID does not exist" containerID="e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.164693 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83"} err="failed to get container status \"e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83\": rpc error: code = NotFound desc = could not find container \"e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83\": container with ID starting with e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83 not found: ID does not exist" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.164715 4723 scope.go:117] "RemoveContainer" containerID="9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180" Mar 09 13:23:19 crc kubenswrapper[4723]: E0309 13:23:19.166761 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180\": container with ID starting with 9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180 not found: ID does not exist" containerID="9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.166779 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180"} err="failed to get container status \"9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180\": rpc error: code = NotFound desc = could not find container \"9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180\": container with ID starting with 9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180 not found: ID does not exist" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.166797 4723 scope.go:117] "RemoveContainer" containerID="e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.169072 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83"} err="failed to get container status \"e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83\": rpc error: code = NotFound desc = could not find container \"e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83\": container with ID starting with 
e50d47e723c7c8e96c8ee1d411941bb2f712a42d3ccfe09e2871752ceffdaf83 not found: ID does not exist" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.169112 4723 scope.go:117] "RemoveContainer" containerID="9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.171661 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180"} err="failed to get container status \"9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180\": rpc error: code = NotFound desc = could not find container \"9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180\": container with ID starting with 9efbaaf202156c3d59bb73f09cef3c0e066469fbc2e2ca2c7c82b309876a9180 not found: ID does not exist" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.259503 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdee676-b46b-477b-ba90-669494e8a6b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") " pod="openstack/nova-api-0" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.259572 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97lns\" (UniqueName: \"kubernetes.io/projected/1bdee676-b46b-477b-ba90-669494e8a6b0-kube-api-access-97lns\") pod \"nova-api-0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") " pod="openstack/nova-api-0" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.259614 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdee676-b46b-477b-ba90-669494e8a6b0-config-data\") pod \"nova-api-0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") " pod="openstack/nova-api-0" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.259685 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bdee676-b46b-477b-ba90-669494e8a6b0-logs\") pod \"nova-api-0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") " pod="openstack/nova-api-0" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.321982 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.362295 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdee676-b46b-477b-ba90-669494e8a6b0-config-data\") pod \"nova-api-0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") " pod="openstack/nova-api-0" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.363145 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bdee676-b46b-477b-ba90-669494e8a6b0-logs\") pod \"nova-api-0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") " pod="openstack/nova-api-0" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.363925 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bdee676-b46b-477b-ba90-669494e8a6b0-logs\") pod \"nova-api-0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") " pod="openstack/nova-api-0" Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 
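The E-level ContainerStatus/DeleteContainer NotFound pairs above are a benign race, not a failure: the containers were already removed by the earlier RemoveContainer pass, so the retries find nothing to act on, and the kubelet proceeds to rebuild the pod. A sketch of the idempotent-delete pattern that tolerates this; the helper name is hypothetical, not kubelet's actual code:

    package main

    import (
        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeIfPresent treats a gRPC NotFound from the runtime as success:
    // the container being gone is exactly the desired end state.
    func removeIfPresent(remove func(id string) error, id string) error {
        if err := remove(id); err != nil && status.Code(err) != codes.NotFound {
            return err
        }
        return nil
    }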
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.365262 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdee676-b46b-477b-ba90-669494e8a6b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") " pod="openstack/nova-api-0"
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.365342 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97lns\" (UniqueName: \"kubernetes.io/projected/1bdee676-b46b-477b-ba90-669494e8a6b0-kube-api-access-97lns\") pod \"nova-api-0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") " pod="openstack/nova-api-0"
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.365679 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdee676-b46b-477b-ba90-669494e8a6b0-config-data\") pod \"nova-api-0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") " pod="openstack/nova-api-0"
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.372333 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdee676-b46b-477b-ba90-669494e8a6b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") " pod="openstack/nova-api-0"
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.387046 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97lns\" (UniqueName: \"kubernetes.io/projected/1bdee676-b46b-477b-ba90-669494e8a6b0-kube-api-access-97lns\") pod \"nova-api-0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") " pod="openstack/nova-api-0"
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.486423 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.518582 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.670102 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-5jgl8"
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.689284 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.689458 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.776577 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8pgk\" (UniqueName: \"kubernetes.io/projected/eaf73308-8e25-4470-a1d4-a2f9d59b1cd6-kube-api-access-x8pgk\") pod \"eaf73308-8e25-4470-a1d4-a2f9d59b1cd6\" (UID: \"eaf73308-8e25-4470-a1d4-a2f9d59b1cd6\") "
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.776623 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-8808-account-create-update-jg69c"
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.776664 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf73308-8e25-4470-a1d4-a2f9d59b1cd6-operator-scripts\") pod \"eaf73308-8e25-4470-a1d4-a2f9d59b1cd6\" (UID: \"eaf73308-8e25-4470-a1d4-a2f9d59b1cd6\") "
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.779126 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaf73308-8e25-4470-a1d4-a2f9d59b1cd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eaf73308-8e25-4470-a1d4-a2f9d59b1cd6" (UID: "eaf73308-8e25-4470-a1d4-a2f9d59b1cd6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.785637 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaf73308-8e25-4470-a1d4-a2f9d59b1cd6-kube-api-access-x8pgk" (OuterVolumeSpecName: "kube-api-access-x8pgk") pod "eaf73308-8e25-4470-a1d4-a2f9d59b1cd6" (UID: "eaf73308-8e25-4470-a1d4-a2f9d59b1cd6"). InnerVolumeSpecName "kube-api-access-x8pgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.864136 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.866716 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z"
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.879685 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mvlc\" (UniqueName: \"kubernetes.io/projected/49c81a50-9366-425a-b957-330022266b2d-kube-api-access-2mvlc\") pod \"49c81a50-9366-425a-b957-330022266b2d\" (UID: \"49c81a50-9366-425a-b957-330022266b2d\") "
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.880161 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49c81a50-9366-425a-b957-330022266b2d-operator-scripts\") pod \"49c81a50-9366-425a-b957-330022266b2d\" (UID: \"49c81a50-9366-425a-b957-330022266b2d\") "
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.880759 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c81a50-9366-425a-b957-330022266b2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49c81a50-9366-425a-b957-330022266b2d" (UID: "49c81a50-9366-425a-b957-330022266b2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.883686 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c81a50-9366-425a-b957-330022266b2d-kube-api-access-2mvlc" (OuterVolumeSpecName: "kube-api-access-2mvlc") pod "49c81a50-9366-425a-b957-330022266b2d" (UID: "49c81a50-9366-425a-b957-330022266b2d"). InnerVolumeSpecName "kube-api-access-2mvlc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.883697 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8pgk\" (UniqueName: \"kubernetes.io/projected/eaf73308-8e25-4470-a1d4-a2f9d59b1cd6-kube-api-access-x8pgk\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.907780 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf73308-8e25-4470-a1d4-a2f9d59b1cd6-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.907967 4723 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49c81a50-9366-425a-b957-330022266b2d-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.971085 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-vrvk6"]
Mar 09 13:23:19 crc kubenswrapper[4723]: I0309 13:23:19.971365 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" podUID="f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" containerName="dnsmasq-dns" containerID="cri-o://708242fd59bd257212f9758436d825acf296795bda0ecb23d35b5f78ea6f1191" gracePeriod=10
Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.020114 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mvlc\" (UniqueName: \"kubernetes.io/projected/49c81a50-9366-425a-b957-330022266b2d-kube-api-access-2mvlc\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.071134 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec76144-c358-4a57-bf5e-b3fd37531ae7","Type":"ContainerStarted","Data":"e2e269e516560e901d6fa603389152c0f546bda7295e7d75ae3e244622069e61"}
Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.094366 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2d33e56a-fbb0-4ac1-a383-9ad42906d556","Type":"ContainerStarted","Data":"e739700d66d7557481295992402b43da5bda9e2ff73bd908ccb796eedc71c9ea"}
Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.094412 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2d33e56a-fbb0-4ac1-a383-9ad42906d556","Type":"ContainerStarted","Data":"06ac23d4cc356e906743c6d218cdf95a8ea794366269d9c0939252490e92f028"}
Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.095529 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.106113 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-5jgl8" event={"ID":"eaf73308-8e25-4470-a1d4-a2f9d59b1cd6","Type":"ContainerDied","Data":"2f935fff866c4b831ea89f4c75f6baa18df0593893c14e03d5685a12b6dba027"}
Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.106158 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f935fff866c4b831ea89f4c75f6baa18df0593893c14e03d5685a12b6dba027"
Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.106229 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-5jgl8"
Need to start a new one" pod="openstack/aodh-db-create-5jgl8" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.120367 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-8808-account-create-update-jg69c" event={"ID":"49c81a50-9366-425a-b957-330022266b2d","Type":"ContainerDied","Data":"8431860bd55aa066753f2a4d5c2a869b692cf1cbdb65112fbfe98bbd1033b167"} Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.120464 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8431860bd55aa066753f2a4d5c2a869b692cf1cbdb65112fbfe98bbd1033b167" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.120539 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-8808-account-create-update-jg69c" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.257188 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.277589 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.277565451 podStartE2EDuration="2.277565451s" podCreationTimestamp="2026-03-09 13:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:20.115251063 +0000 UTC m=+1474.129718613" watchObservedRunningTime="2026-03-09 13:23:20.277565451 +0000 UTC m=+1474.292032991" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.663213 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.748984 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-dns-svc\") pod \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.749259 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-dns-swift-storage-0\") pod \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.749446 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-ovsdbserver-sb\") pod \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.749501 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-ovsdbserver-nb\") pod \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.749577 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kk72\" (UniqueName: \"kubernetes.io/projected/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-kube-api-access-7kk72\") pod \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.749611 4723 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-config\") pod \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\" (UID: \"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f\") " Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.790962 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-kube-api-access-7kk72" (OuterVolumeSpecName: "kube-api-access-7kk72") pod "f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" (UID: "f49fba07-b6b5-4e69-9c8a-2c3e1b09182f"). InnerVolumeSpecName "kube-api-access-7kk72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.838519 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" (UID: "f49fba07-b6b5-4e69-9c8a-2c3e1b09182f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.855213 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kk72\" (UniqueName: \"kubernetes.io/projected/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-kube-api-access-7kk72\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.855247 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.889277 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" (UID: "f49fba07-b6b5-4e69-9c8a-2c3e1b09182f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.900410 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" (UID: "f49fba07-b6b5-4e69-9c8a-2c3e1b09182f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.906948 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="151bf67d-b202-4687-afcc-6247f5c670e9" path="/var/lib/kubelet/pods/151bf67d-b202-4687-afcc-6247f5c670e9/volumes" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.911072 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-config" (OuterVolumeSpecName: "config") pod "f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" (UID: "f49fba07-b6b5-4e69-9c8a-2c3e1b09182f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.940968 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" (UID: "f49fba07-b6b5-4e69-9c8a-2c3e1b09182f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.957029 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.957057 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.957066 4723 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:20 crc kubenswrapper[4723]: I0309 13:23:20.957077 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.133780 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1bdee676-b46b-477b-ba90-669494e8a6b0","Type":"ContainerStarted","Data":"2c13c083cdef2d26ea58c637737f65c467103f2e07facbc6f7a945a865d7fe77"} Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.133838 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1bdee676-b46b-477b-ba90-669494e8a6b0","Type":"ContainerStarted","Data":"28b113e59c97cba2bc909ba80e8cf2d104fcb2ee897edb9dd1309acecdc972a7"} Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.133853 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1bdee676-b46b-477b-ba90-669494e8a6b0","Type":"ContainerStarted","Data":"92b136921af210347dc86422ba008b341842fa1d1eb09d8eb98c6ae6148ec92e"} Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.136253 4723 generic.go:334] "Generic (PLEG): container finished" podID="f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" containerID="708242fd59bd257212f9758436d825acf296795bda0ecb23d35b5f78ea6f1191" exitCode=0 Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.136296 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.136321 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" event={"ID":"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f","Type":"ContainerDied","Data":"708242fd59bd257212f9758436d825acf296795bda0ecb23d35b5f78ea6f1191"} Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.136349 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-vrvk6" event={"ID":"f49fba07-b6b5-4e69-9c8a-2c3e1b09182f","Type":"ContainerDied","Data":"be3cd259005cb1ce5d04e521e59fbe62fd86cd56879a5a2a12c42e94ee1b09b3"} Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.136368 4723 scope.go:117] "RemoveContainer" containerID="708242fd59bd257212f9758436d825acf296795bda0ecb23d35b5f78ea6f1191" Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.144238 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec76144-c358-4a57-bf5e-b3fd37531ae7","Type":"ContainerStarted","Data":"fa779e47056a0ebaf99c16d74c22c99b6b8add0cec4df04e7ed1cf0c8b8ce442"} Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.160180 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.160156422 podStartE2EDuration="2.160156422s" podCreationTimestamp="2026-03-09 13:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:21.153947838 +0000 UTC m=+1475.168415378" watchObservedRunningTime="2026-03-09 13:23:21.160156422 +0000 UTC m=+1475.174623962" Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.170190 4723 scope.go:117] "RemoveContainer" containerID="09e6092ee38ade1a7a0616246728a205c0da58281be3c8201d48b12596d7a897" Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.194836 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-vrvk6"] Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.205402 4723 scope.go:117] "RemoveContainer" containerID="708242fd59bd257212f9758436d825acf296795bda0ecb23d35b5f78ea6f1191" Mar 09 13:23:21 crc kubenswrapper[4723]: E0309 13:23:21.205774 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708242fd59bd257212f9758436d825acf296795bda0ecb23d35b5f78ea6f1191\": container with ID starting with 708242fd59bd257212f9758436d825acf296795bda0ecb23d35b5f78ea6f1191 not found: ID does not exist" containerID="708242fd59bd257212f9758436d825acf296795bda0ecb23d35b5f78ea6f1191" Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.205814 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708242fd59bd257212f9758436d825acf296795bda0ecb23d35b5f78ea6f1191"} err="failed to get container status \"708242fd59bd257212f9758436d825acf296795bda0ecb23d35b5f78ea6f1191\": rpc error: code = NotFound desc = could not find container \"708242fd59bd257212f9758436d825acf296795bda0ecb23d35b5f78ea6f1191\": container with ID starting with 708242fd59bd257212f9758436d825acf296795bda0ecb23d35b5f78ea6f1191 not found: ID does not exist" Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.205840 4723 scope.go:117] "RemoveContainer" containerID="09e6092ee38ade1a7a0616246728a205c0da58281be3c8201d48b12596d7a897" Mar 09 13:23:21 crc kubenswrapper[4723]: E0309 13:23:21.206150 4723 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e6092ee38ade1a7a0616246728a205c0da58281be3c8201d48b12596d7a897\": container with ID starting with 09e6092ee38ade1a7a0616246728a205c0da58281be3c8201d48b12596d7a897 not found: ID does not exist" containerID="09e6092ee38ade1a7a0616246728a205c0da58281be3c8201d48b12596d7a897" Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.206172 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e6092ee38ade1a7a0616246728a205c0da58281be3c8201d48b12596d7a897"} err="failed to get container status \"09e6092ee38ade1a7a0616246728a205c0da58281be3c8201d48b12596d7a897\": rpc error: code = NotFound desc = could not find container \"09e6092ee38ade1a7a0616246728a205c0da58281be3c8201d48b12596d7a897\": container with ID starting with 09e6092ee38ade1a7a0616246728a205c0da58281be3c8201d48b12596d7a897 not found: ID does not exist" Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.217402 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-vrvk6"] Mar 09 13:23:22 crc kubenswrapper[4723]: I0309 13:23:22.176428 4723 generic.go:334] "Generic (PLEG): container finished" podID="c4e81d20-6788-48e4-a38c-2dda5e6cc206" containerID="56eda3a218baef03a2d03f600372743aaa19368443fb06818b6fa4bb26fd9645" exitCode=0 Mar 09 13:23:22 crc kubenswrapper[4723]: I0309 13:23:22.176530 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mfbfj" event={"ID":"c4e81d20-6788-48e4-a38c-2dda5e6cc206","Type":"ContainerDied","Data":"56eda3a218baef03a2d03f600372743aaa19368443fb06818b6fa4bb26fd9645"} Mar 09 13:23:22 crc kubenswrapper[4723]: I0309 13:23:22.897017 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" path="/var/lib/kubelet/pods/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f/volumes" Mar 09 13:23:23 crc kubenswrapper[4723]: I0309 13:23:23.189740 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec76144-c358-4a57-bf5e-b3fd37531ae7","Type":"ContainerStarted","Data":"4ba4539b9017d6bfe47e6cf804915e83bfa20355355863196282e7c2b529f246"} Mar 09 13:23:23 crc kubenswrapper[4723]: I0309 13:23:23.190260 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:23:23 crc kubenswrapper[4723]: I0309 13:23:23.192803 4723 generic.go:334] "Generic (PLEG): container finished" podID="1f23bcc5-001d-4f5e-a28f-ed00ab283c01" containerID="7a117ee67ee75bb4a7bae9848fb8f37e812495e259379c7b120d3f64f1648ac0" exitCode=0 Mar 09 13:23:23 crc kubenswrapper[4723]: I0309 13:23:23.192928 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hx7mw" event={"ID":"1f23bcc5-001d-4f5e-a28f-ed00ab283c01","Type":"ContainerDied","Data":"7a117ee67ee75bb4a7bae9848fb8f37e812495e259379c7b120d3f64f1648ac0"} Mar 09 13:23:23 crc kubenswrapper[4723]: I0309 13:23:23.222988 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.68920286 podStartE2EDuration="7.222970887s" podCreationTimestamp="2026-03-09 13:23:16 +0000 UTC" firstStartedPulling="2026-03-09 13:23:17.678907204 +0000 UTC m=+1471.693374744" lastFinishedPulling="2026-03-09 13:23:22.212675231 +0000 UTC m=+1476.227142771" observedRunningTime="2026-03-09 13:23:23.222283618 +0000 UTC m=+1477.236751168" 
Mar 09 13:23:21 crc kubenswrapper[4723]: I0309 13:23:21.217402 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-vrvk6"]
Mar 09 13:23:22 crc kubenswrapper[4723]: I0309 13:23:22.176428 4723 generic.go:334] "Generic (PLEG): container finished" podID="c4e81d20-6788-48e4-a38c-2dda5e6cc206" containerID="56eda3a218baef03a2d03f600372743aaa19368443fb06818b6fa4bb26fd9645" exitCode=0
Mar 09 13:23:22 crc kubenswrapper[4723]: I0309 13:23:22.176530 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mfbfj" event={"ID":"c4e81d20-6788-48e4-a38c-2dda5e6cc206","Type":"ContainerDied","Data":"56eda3a218baef03a2d03f600372743aaa19368443fb06818b6fa4bb26fd9645"}
Mar 09 13:23:22 crc kubenswrapper[4723]: I0309 13:23:22.897017 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" path="/var/lib/kubelet/pods/f49fba07-b6b5-4e69-9c8a-2c3e1b09182f/volumes"
Mar 09 13:23:23 crc kubenswrapper[4723]: I0309 13:23:23.189740 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec76144-c358-4a57-bf5e-b3fd37531ae7","Type":"ContainerStarted","Data":"4ba4539b9017d6bfe47e6cf804915e83bfa20355355863196282e7c2b529f246"}
Mar 09 13:23:23 crc kubenswrapper[4723]: I0309 13:23:23.190260 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 09 13:23:23 crc kubenswrapper[4723]: I0309 13:23:23.192803 4723 generic.go:334] "Generic (PLEG): container finished" podID="1f23bcc5-001d-4f5e-a28f-ed00ab283c01" containerID="7a117ee67ee75bb4a7bae9848fb8f37e812495e259379c7b120d3f64f1648ac0" exitCode=0
Mar 09 13:23:23 crc kubenswrapper[4723]: I0309 13:23:23.192928 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hx7mw" event={"ID":"1f23bcc5-001d-4f5e-a28f-ed00ab283c01","Type":"ContainerDied","Data":"7a117ee67ee75bb4a7bae9848fb8f37e812495e259379c7b120d3f64f1648ac0"}
Mar 09 13:23:23 crc kubenswrapper[4723]: I0309 13:23:23.222988 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.68920286 podStartE2EDuration="7.222970887s" podCreationTimestamp="2026-03-09 13:23:16 +0000 UTC" firstStartedPulling="2026-03-09 13:23:17.678907204 +0000 UTC m=+1471.693374744" lastFinishedPulling="2026-03-09 13:23:22.212675231 +0000 UTC m=+1476.227142771" observedRunningTime="2026-03-09 13:23:23.222283618 +0000 UTC m=+1477.236751168" watchObservedRunningTime="2026-03-09 13:23:23.222970887 +0000 UTC m=+1477.237438447"
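The pod_startup_latency_tracker entry above for openstack/ceilometer-0 reports podStartE2EDuration="7.222970887s" but a smaller podStartSLOduration=2.68920286. Consistent with the logged timestamps, the SLO figure is the end-to-end startup time minus the image-pull window (firstStartedPulling → lastFinishedPulling). A short Go check of the arithmetic; illustrative only, not kubelet code, and the monotonic "m=+…" suffixes are dropped before parsing:

```go
package main

import (
	"fmt"
	"time"
)

// Go's default time.Time print format, as the timestamps appear in the log.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2026-03-09 13:23:16 +0000 UTC")              // podCreationTimestamp
	firstPull := mustParse("2026-03-09 13:23:17.678907204 +0000 UTC") // firstStartedPulling
	lastPull := mustParse("2026-03-09 13:23:22.212675231 +0000 UTC")  // lastFinishedPulling
	running := mustParse("2026-03-09 13:23:23.222970887 +0000 UTC")   // watchObservedRunningTime

	e2e := running.Sub(created)     // 7.222970887s == podStartE2EDuration
	pull := lastPull.Sub(firstPull) // 4.533768027s spent pulling images
	fmt.Println(e2e, e2e-pull)      // 7.222970887s 2.68920286s == podStartSLOduration
}
```

The nova-api-0 entry earlier shows the degenerate case: its pull timestamps are the zero value 0001-01-01, so no pull window is subtracted and the SLO and E2E durations coincide.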
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:23 crc kubenswrapper[4723]: I0309 13:23:23.877606 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4e81d20-6788-48e4-a38c-2dda5e6cc206-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:23 crc kubenswrapper[4723]: I0309 13:23:23.877662 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e81d20-6788-48e4-a38c-2dda5e6cc206-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:23 crc kubenswrapper[4723]: I0309 13:23:23.877673 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh8gd\" (UniqueName: \"kubernetes.io/projected/c4e81d20-6788-48e4-a38c-2dda5e6cc206-kube-api-access-gh8gd\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:23 crc kubenswrapper[4723]: I0309 13:23:23.877684 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e81d20-6788-48e4-a38c-2dda5e6cc206-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.210504 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mfbfj" event={"ID":"c4e81d20-6788-48e4-a38c-2dda5e6cc206","Type":"ContainerDied","Data":"2e49d0262e2d850a995f4d1a7bd5c7dc94a65a4000986986bd593d60571b4642"} Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.210572 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e49d0262e2d850a995f4d1a7bd5c7dc94a65a4000986986bd593d60571b4642" Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.210639 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mfbfj" Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.623671 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.702852 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxkvl\" (UniqueName: \"kubernetes.io/projected/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-kube-api-access-nxkvl\") pod \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.703016 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-config-data\") pod \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.704634 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-scripts\") pod \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.705074 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-combined-ca-bundle\") pod \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\" (UID: \"1f23bcc5-001d-4f5e-a28f-ed00ab283c01\") " Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.709413 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-kube-api-access-nxkvl" (OuterVolumeSpecName: "kube-api-access-nxkvl") pod "1f23bcc5-001d-4f5e-a28f-ed00ab283c01" (UID: "1f23bcc5-001d-4f5e-a28f-ed00ab283c01"). InnerVolumeSpecName "kube-api-access-nxkvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.713974 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-scripts" (OuterVolumeSpecName: "scripts") pod "1f23bcc5-001d-4f5e-a28f-ed00ab283c01" (UID: "1f23bcc5-001d-4f5e-a28f-ed00ab283c01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.736110 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f23bcc5-001d-4f5e-a28f-ed00ab283c01" (UID: "1f23bcc5-001d-4f5e-a28f-ed00ab283c01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.763085 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-config-data" (OuterVolumeSpecName: "config-data") pod "1f23bcc5-001d-4f5e-a28f-ed00ab283c01" (UID: "1f23bcc5-001d-4f5e-a28f-ed00ab283c01"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.809670 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.809713 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.809734 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxkvl\" (UniqueName: \"kubernetes.io/projected/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-kube-api-access-nxkvl\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:24 crc kubenswrapper[4723]: I0309 13:23:24.809755 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f23bcc5-001d-4f5e-a28f-ed00ab283c01-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.101511 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-5td4v"] Mar 09 13:23:25 crc kubenswrapper[4723]: E0309 13:23:25.102092 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c81a50-9366-425a-b957-330022266b2d" containerName="mariadb-account-create-update" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.102114 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c81a50-9366-425a-b957-330022266b2d" containerName="mariadb-account-create-update" Mar 09 13:23:25 crc kubenswrapper[4723]: E0309 13:23:25.102135 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" containerName="dnsmasq-dns" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.102142 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" containerName="dnsmasq-dns" Mar 09 13:23:25 crc kubenswrapper[4723]: E0309 13:23:25.102166 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f23bcc5-001d-4f5e-a28f-ed00ab283c01" containerName="nova-cell1-conductor-db-sync" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.102175 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f23bcc5-001d-4f5e-a28f-ed00ab283c01" containerName="nova-cell1-conductor-db-sync" Mar 09 13:23:25 crc kubenswrapper[4723]: E0309 13:23:25.102190 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" containerName="init" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.102197 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" containerName="init" Mar 09 13:23:25 crc kubenswrapper[4723]: E0309 13:23:25.102214 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e81d20-6788-48e4-a38c-2dda5e6cc206" containerName="nova-manage" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.102223 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e81d20-6788-48e4-a38c-2dda5e6cc206" containerName="nova-manage" Mar 09 13:23:25 crc kubenswrapper[4723]: E0309 13:23:25.102249 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf73308-8e25-4470-a1d4-a2f9d59b1cd6" containerName="mariadb-database-create" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.102257 4723 
state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf73308-8e25-4470-a1d4-a2f9d59b1cd6" containerName="mariadb-database-create" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.102509 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf73308-8e25-4470-a1d4-a2f9d59b1cd6" containerName="mariadb-database-create" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.102527 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e81d20-6788-48e4-a38c-2dda5e6cc206" containerName="nova-manage" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.102539 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="49c81a50-9366-425a-b957-330022266b2d" containerName="mariadb-account-create-update" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.102553 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49fba07-b6b5-4e69-9c8a-2c3e1b09182f" containerName="dnsmasq-dns" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.102563 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f23bcc5-001d-4f5e-a28f-ed00ab283c01" containerName="nova-cell1-conductor-db-sync" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.103469 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-5td4v" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.106131 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.106402 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.106436 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-97c45" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.106820 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.116719 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-5td4v"] Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.219594 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-combined-ca-bundle\") pod \"aodh-db-sync-5td4v\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") " pod="openstack/aodh-db-sync-5td4v" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.219691 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szvr6\" (UniqueName: \"kubernetes.io/projected/f6f0b3e5-b717-4ea6-ae5b-400876201699-kube-api-access-szvr6\") pod \"aodh-db-sync-5td4v\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") " pod="openstack/aodh-db-sync-5td4v" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.219887 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-config-data\") pod \"aodh-db-sync-5td4v\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") " pod="openstack/aodh-db-sync-5td4v" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.220047 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-scripts\") pod \"aodh-db-sync-5td4v\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") " pod="openstack/aodh-db-sync-5td4v" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.222616 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hx7mw" event={"ID":"1f23bcc5-001d-4f5e-a28f-ed00ab283c01","Type":"ContainerDied","Data":"a75b429874e05c3595dbba592b12efff830d878e493f4ef2ac34179a10b11824"} Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.222654 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a75b429874e05c3595dbba592b12efff830d878e493f4ef2ac34179a10b11824" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.222682 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hx7mw" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.322442 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-combined-ca-bundle\") pod \"aodh-db-sync-5td4v\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") " pod="openstack/aodh-db-sync-5td4v" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.322510 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szvr6\" (UniqueName: \"kubernetes.io/projected/f6f0b3e5-b717-4ea6-ae5b-400876201699-kube-api-access-szvr6\") pod \"aodh-db-sync-5td4v\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") " pod="openstack/aodh-db-sync-5td4v" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.322558 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-config-data\") pod \"aodh-db-sync-5td4v\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") " pod="openstack/aodh-db-sync-5td4v" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.322600 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-scripts\") pod \"aodh-db-sync-5td4v\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") " pod="openstack/aodh-db-sync-5td4v" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.327121 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-scripts\") pod \"aodh-db-sync-5td4v\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") " pod="openstack/aodh-db-sync-5td4v" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.327179 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.327268 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-combined-ca-bundle\") pod \"aodh-db-sync-5td4v\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") " pod="openstack/aodh-db-sync-5td4v" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.328969 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.329317 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-config-data\") pod \"aodh-db-sync-5td4v\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") " pod="openstack/aodh-db-sync-5td4v" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.330912 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.338363 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.393746 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szvr6\" (UniqueName: \"kubernetes.io/projected/f6f0b3e5-b717-4ea6-ae5b-400876201699-kube-api-access-szvr6\") pod \"aodh-db-sync-5td4v\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") " pod="openstack/aodh-db-sync-5td4v" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.424755 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3884df2d-0e53-4bdd-b661-b809a2791240-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3884df2d-0e53-4bdd-b661-b809a2791240\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.424854 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3884df2d-0e53-4bdd-b661-b809a2791240-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3884df2d-0e53-4bdd-b661-b809a2791240\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.425100 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5wtd\" (UniqueName: \"kubernetes.io/projected/3884df2d-0e53-4bdd-b661-b809a2791240-kube-api-access-t5wtd\") pod \"nova-cell1-conductor-0\" (UID: \"3884df2d-0e53-4bdd-b661-b809a2791240\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.428144 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-5td4v" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.529967 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5wtd\" (UniqueName: \"kubernetes.io/projected/3884df2d-0e53-4bdd-b661-b809a2791240-kube-api-access-t5wtd\") pod \"nova-cell1-conductor-0\" (UID: \"3884df2d-0e53-4bdd-b661-b809a2791240\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.530098 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3884df2d-0e53-4bdd-b661-b809a2791240-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3884df2d-0e53-4bdd-b661-b809a2791240\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.530166 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3884df2d-0e53-4bdd-b661-b809a2791240-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3884df2d-0e53-4bdd-b661-b809a2791240\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.539017 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3884df2d-0e53-4bdd-b661-b809a2791240-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3884df2d-0e53-4bdd-b661-b809a2791240\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.539069 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3884df2d-0e53-4bdd-b661-b809a2791240-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3884df2d-0e53-4bdd-b661-b809a2791240\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.550646 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5wtd\" (UniqueName: \"kubernetes.io/projected/3884df2d-0e53-4bdd-b661-b809a2791240-kube-api-access-t5wtd\") pod \"nova-cell1-conductor-0\" (UID: \"3884df2d-0e53-4bdd-b661-b809a2791240\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.758705 4723 util.go:30] "No sandbox for pod can be found. 
Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.758705 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 09 13:23:25 crc kubenswrapper[4723]: I0309 13:23:25.929614 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-5td4v"]
Mar 09 13:23:26 crc kubenswrapper[4723]: I0309 13:23:26.239537 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5td4v" event={"ID":"f6f0b3e5-b717-4ea6-ae5b-400876201699","Type":"ContainerStarted","Data":"48046610ee9e4d023b5f81d2d1877198ee4b567dde525d5f4421da012fdf4821"}
Mar 09 13:23:26 crc kubenswrapper[4723]: I0309 13:23:26.303528 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 09 13:23:26 crc kubenswrapper[4723]: W0309 13:23:26.305711 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3884df2d_0e53_4bdd_b661_b809a2791240.slice/crio-21e9c7af48e70e63792fecc7a7d4be1ce8f3438424fe9e8d8d38f2a48fd48f15 WatchSource:0}: Error finding container 21e9c7af48e70e63792fecc7a7d4be1ce8f3438424fe9e8d8d38f2a48fd48f15: Status 404 returned error can't find the container with id 21e9c7af48e70e63792fecc7a7d4be1ce8f3438424fe9e8d8d38f2a48fd48f15
Mar 09 13:23:27 crc kubenswrapper[4723]: I0309 13:23:27.255200 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3884df2d-0e53-4bdd-b661-b809a2791240","Type":"ContainerStarted","Data":"9f3e096694c149c92a0589a60a43e8f03808d65c828bb8082f0461ff6c86424f"}
Mar 09 13:23:27 crc kubenswrapper[4723]: I0309 13:23:27.255545 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3884df2d-0e53-4bdd-b661-b809a2791240","Type":"ContainerStarted","Data":"21e9c7af48e70e63792fecc7a7d4be1ce8f3438424fe9e8d8d38f2a48fd48f15"}
Mar 09 13:23:27 crc kubenswrapper[4723]: I0309 13:23:27.255564 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 09 13:23:27 crc kubenswrapper[4723]: I0309 13:23:27.275583 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.275561682 podStartE2EDuration="2.275561682s" podCreationTimestamp="2026-03-09 13:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:27.27208122 +0000 UTC m=+1481.286548780" watchObservedRunningTime="2026-03-09 13:23:27.275561682 +0000 UTC m=+1481.290029222"
Mar 09 13:23:28 crc kubenswrapper[4723]: I0309 13:23:28.567950 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 09 13:23:29 crc kubenswrapper[4723]: I0309 13:23:29.066187 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 09 13:23:29 crc kubenswrapper[4723]: I0309 13:23:29.066579 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1bdee676-b46b-477b-ba90-669494e8a6b0" containerName="nova-api-log" containerID="cri-o://28b113e59c97cba2bc909ba80e8cf2d104fcb2ee897edb9dd1309acecdc972a7" gracePeriod=30
Mar 09 13:23:29 crc kubenswrapper[4723]: I0309 13:23:29.066837 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1bdee676-b46b-477b-ba90-669494e8a6b0" containerName="nova-api-api" containerID="cri-o://2c13c083cdef2d26ea58c637737f65c467103f2e07facbc6f7a945a865d7fe77" gracePeriod=30
Mar 09 13:23:29 crc kubenswrapper[4723]: I0309 13:23:29.279243 4723 generic.go:334] "Generic (PLEG): container finished" podID="1bdee676-b46b-477b-ba90-669494e8a6b0" containerID="28b113e59c97cba2bc909ba80e8cf2d104fcb2ee897edb9dd1309acecdc972a7" exitCode=143
Mar 09 13:23:29 crc kubenswrapper[4723]: I0309 13:23:29.279282 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1bdee676-b46b-477b-ba90-669494e8a6b0","Type":"ContainerDied","Data":"28b113e59c97cba2bc909ba80e8cf2d104fcb2ee897edb9dd1309acecdc972a7"}
Mar 09 13:23:30 crc kubenswrapper[4723]: I0309 13:23:30.306501 4723 generic.go:334] "Generic (PLEG): container finished" podID="1bdee676-b46b-477b-ba90-669494e8a6b0" containerID="2c13c083cdef2d26ea58c637737f65c467103f2e07facbc6f7a945a865d7fe77" exitCode=0
Mar 09 13:23:30 crc kubenswrapper[4723]: I0309 13:23:30.306577 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1bdee676-b46b-477b-ba90-669494e8a6b0","Type":"ContainerDied","Data":"2c13c083cdef2d26ea58c637737f65c467103f2e07facbc6f7a945a865d7fe77"}
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.189725 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.287255 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bdee676-b46b-477b-ba90-669494e8a6b0-logs\") pod \"1bdee676-b46b-477b-ba90-669494e8a6b0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") "
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.287330 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdee676-b46b-477b-ba90-669494e8a6b0-config-data\") pod \"1bdee676-b46b-477b-ba90-669494e8a6b0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") "
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.287642 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdee676-b46b-477b-ba90-669494e8a6b0-combined-ca-bundle\") pod \"1bdee676-b46b-477b-ba90-669494e8a6b0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") "
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.287980 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97lns\" (UniqueName: \"kubernetes.io/projected/1bdee676-b46b-477b-ba90-669494e8a6b0-kube-api-access-97lns\") pod \"1bdee676-b46b-477b-ba90-669494e8a6b0\" (UID: \"1bdee676-b46b-477b-ba90-669494e8a6b0\") "
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.292323 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bdee676-b46b-477b-ba90-669494e8a6b0-logs" (OuterVolumeSpecName: "logs") pod "1bdee676-b46b-477b-ba90-669494e8a6b0" (UID: "1bdee676-b46b-477b-ba90-669494e8a6b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.294776 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bdee676-b46b-477b-ba90-669494e8a6b0-kube-api-access-97lns" (OuterVolumeSpecName: "kube-api-access-97lns") pod "1bdee676-b46b-477b-ba90-669494e8a6b0" (UID: "1bdee676-b46b-477b-ba90-669494e8a6b0"). InnerVolumeSpecName "kube-api-access-97lns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.322000 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.322075 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1bdee676-b46b-477b-ba90-669494e8a6b0","Type":"ContainerDied","Data":"92b136921af210347dc86422ba008b341842fa1d1eb09d8eb98c6ae6148ec92e"}
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.322130 4723 scope.go:117] "RemoveContainer" containerID="2c13c083cdef2d26ea58c637737f65c467103f2e07facbc6f7a945a865d7fe77"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.325996 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdee676-b46b-477b-ba90-669494e8a6b0-config-data" (OuterVolumeSpecName: "config-data") pod "1bdee676-b46b-477b-ba90-669494e8a6b0" (UID: "1bdee676-b46b-477b-ba90-669494e8a6b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.326439 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5td4v" event={"ID":"f6f0b3e5-b717-4ea6-ae5b-400876201699","Type":"ContainerStarted","Data":"7a5d6970703d77b0e5e36ab6fd8890a569908407f0a4b312edfdca0e38b0ad32"}
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.328825 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdee676-b46b-477b-ba90-669494e8a6b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bdee676-b46b-477b-ba90-669494e8a6b0" (UID: "1bdee676-b46b-477b-ba90-669494e8a6b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.363612 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-5td4v" podStartSLOduration=1.451400032 podStartE2EDuration="6.363589581s" podCreationTimestamp="2026-03-09 13:23:25 +0000 UTC" firstStartedPulling="2026-03-09 13:23:25.952388211 +0000 UTC m=+1479.966855751" lastFinishedPulling="2026-03-09 13:23:30.86457776 +0000 UTC m=+1484.879045300" observedRunningTime="2026-03-09 13:23:31.338255924 +0000 UTC m=+1485.352723464" watchObservedRunningTime="2026-03-09 13:23:31.363589581 +0000 UTC m=+1485.378057131"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.390568 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdee676-b46b-477b-ba90-669494e8a6b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.390937 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97lns\" (UniqueName: \"kubernetes.io/projected/1bdee676-b46b-477b-ba90-669494e8a6b0-kube-api-access-97lns\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.390960 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bdee676-b46b-477b-ba90-669494e8a6b0-logs\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.390987 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdee676-b46b-477b-ba90-669494e8a6b0-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.428468 4723 scope.go:117] "RemoveContainer" containerID="28b113e59c97cba2bc909ba80e8cf2d104fcb2ee897edb9dd1309acecdc972a7"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.659885 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.678577 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.693653 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 09 13:23:31 crc kubenswrapper[4723]: E0309 13:23:31.694336 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bdee676-b46b-477b-ba90-669494e8a6b0" containerName="nova-api-api"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.694356 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bdee676-b46b-477b-ba90-669494e8a6b0" containerName="nova-api-api"
Mar 09 13:23:31 crc kubenswrapper[4723]: E0309 13:23:31.694388 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bdee676-b46b-477b-ba90-669494e8a6b0" containerName="nova-api-log"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.694396 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bdee676-b46b-477b-ba90-669494e8a6b0" containerName="nova-api-log"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.694606 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bdee676-b46b-477b-ba90-669494e8a6b0" containerName="nova-api-log"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.694625 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bdee676-b46b-477b-ba90-669494e8a6b0" containerName="nova-api-api"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.696027 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.699044 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.710405 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.799145 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2938063f-bc95-4b78-8636-eded365a5f2c-logs\") pod \"nova-api-0\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") " pod="openstack/nova-api-0"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.799269 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2938063f-bc95-4b78-8636-eded365a5f2c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") " pod="openstack/nova-api-0"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.799354 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mwcd\" (UniqueName: \"kubernetes.io/projected/2938063f-bc95-4b78-8636-eded365a5f2c-kube-api-access-9mwcd\") pod \"nova-api-0\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") " pod="openstack/nova-api-0"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.799409 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2938063f-bc95-4b78-8636-eded365a5f2c-config-data\") pod \"nova-api-0\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") " pod="openstack/nova-api-0"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.901123 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2938063f-bc95-4b78-8636-eded365a5f2c-logs\") pod \"nova-api-0\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") " pod="openstack/nova-api-0"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.901229 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2938063f-bc95-4b78-8636-eded365a5f2c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") " pod="openstack/nova-api-0"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.901309 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mwcd\" (UniqueName: \"kubernetes.io/projected/2938063f-bc95-4b78-8636-eded365a5f2c-kube-api-access-9mwcd\") pod \"nova-api-0\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") " pod="openstack/nova-api-0"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.901352 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2938063f-bc95-4b78-8636-eded365a5f2c-config-data\") pod \"nova-api-0\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") " pod="openstack/nova-api-0"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.901617 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2938063f-bc95-4b78-8636-eded365a5f2c-logs\") pod \"nova-api-0\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") " pod="openstack/nova-api-0"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.905414 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2938063f-bc95-4b78-8636-eded365a5f2c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") " pod="openstack/nova-api-0"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.906960 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2938063f-bc95-4b78-8636-eded365a5f2c-config-data\") pod \"nova-api-0\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") " pod="openstack/nova-api-0"
Mar 09 13:23:31 crc kubenswrapper[4723]: I0309 13:23:31.923403 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mwcd\" (UniqueName: \"kubernetes.io/projected/2938063f-bc95-4b78-8636-eded365a5f2c-kube-api-access-9mwcd\") pod \"nova-api-0\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") " pod="openstack/nova-api-0"
Mar 09 13:23:32 crc kubenswrapper[4723]: I0309 13:23:32.024159 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 13:23:32 crc kubenswrapper[4723]: I0309 13:23:32.661785 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 09 13:23:32 crc kubenswrapper[4723]: I0309 13:23:32.910242 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bdee676-b46b-477b-ba90-669494e8a6b0" path="/var/lib/kubelet/pods/1bdee676-b46b-477b-ba90-669494e8a6b0/volumes"
Mar 09 13:23:33 crc kubenswrapper[4723]: I0309 13:23:33.359678 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2938063f-bc95-4b78-8636-eded365a5f2c","Type":"ContainerStarted","Data":"18d1cd29d9a9d38d973f5f28934391a8eb60e638e99dc741611feb5303a88dda"}
Mar 09 13:23:33 crc kubenswrapper[4723]: I0309 13:23:33.360052 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2938063f-bc95-4b78-8636-eded365a5f2c","Type":"ContainerStarted","Data":"3f3816cac9b4c3afaf8027d64d1b71242c59fe7fca4012aee44c7598ade3d4cb"}
Mar 09 13:23:34 crc kubenswrapper[4723]: I0309 13:23:34.374200 4723 generic.go:334] "Generic (PLEG): container finished" podID="f6f0b3e5-b717-4ea6-ae5b-400876201699" containerID="7a5d6970703d77b0e5e36ab6fd8890a569908407f0a4b312edfdca0e38b0ad32" exitCode=0
Mar 09 13:23:34 crc kubenswrapper[4723]: I0309 13:23:34.374300 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5td4v" event={"ID":"f6f0b3e5-b717-4ea6-ae5b-400876201699","Type":"ContainerDied","Data":"7a5d6970703d77b0e5e36ab6fd8890a569908407f0a4b312edfdca0e38b0ad32"}
Mar 09 13:23:34 crc kubenswrapper[4723]: I0309 13:23:34.377440 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2938063f-bc95-4b78-8636-eded365a5f2c","Type":"ContainerStarted","Data":"2b604c7c44840f5d4b27142f76394a30a5bcc51b7c6263488e159a907d976abd"}
Mar 09 13:23:34 crc kubenswrapper[4723]: I0309 13:23:34.429641 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.429621176 podStartE2EDuration="3.429621176s" podCreationTimestamp="2026-03-09 13:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:34.41613993 +0000 UTC m=+1488.430607570" watchObservedRunningTime="2026-03-09 13:23:34.429621176 +0000 UTC m=+1488.444088726"
Mar 09 13:23:35 crc kubenswrapper[4723]: I0309 13:23:35.797554 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 09 13:23:35 crc kubenswrapper[4723]: I0309 13:23:35.940761 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-5td4v"
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.006161 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szvr6\" (UniqueName: \"kubernetes.io/projected/f6f0b3e5-b717-4ea6-ae5b-400876201699-kube-api-access-szvr6\") pod \"f6f0b3e5-b717-4ea6-ae5b-400876201699\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") "
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.006413 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-combined-ca-bundle\") pod \"f6f0b3e5-b717-4ea6-ae5b-400876201699\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") "
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.006567 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-scripts\") pod \"f6f0b3e5-b717-4ea6-ae5b-400876201699\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") "
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.006697 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-config-data\") pod \"f6f0b3e5-b717-4ea6-ae5b-400876201699\" (UID: \"f6f0b3e5-b717-4ea6-ae5b-400876201699\") "
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.011581 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f0b3e5-b717-4ea6-ae5b-400876201699-kube-api-access-szvr6" (OuterVolumeSpecName: "kube-api-access-szvr6") pod "f6f0b3e5-b717-4ea6-ae5b-400876201699" (UID: "f6f0b3e5-b717-4ea6-ae5b-400876201699"). InnerVolumeSpecName "kube-api-access-szvr6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.012998 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-scripts" (OuterVolumeSpecName: "scripts") pod "f6f0b3e5-b717-4ea6-ae5b-400876201699" (UID: "f6f0b3e5-b717-4ea6-ae5b-400876201699"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.043620 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6f0b3e5-b717-4ea6-ae5b-400876201699" (UID: "f6f0b3e5-b717-4ea6-ae5b-400876201699"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.049919 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-config-data" (OuterVolumeSpecName: "config-data") pod "f6f0b3e5-b717-4ea6-ae5b-400876201699" (UID: "f6f0b3e5-b717-4ea6-ae5b-400876201699"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.110792 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.110843 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.110878 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f0b3e5-b717-4ea6-ae5b-400876201699-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.110896 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szvr6\" (UniqueName: \"kubernetes.io/projected/f6f0b3e5-b717-4ea6-ae5b-400876201699-kube-api-access-szvr6\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.401782 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5td4v" event={"ID":"f6f0b3e5-b717-4ea6-ae5b-400876201699","Type":"ContainerDied","Data":"48046610ee9e4d023b5f81d2d1877198ee4b567dde525d5f4421da012fdf4821"}
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.401825 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48046610ee9e4d023b5f81d2d1877198ee4b567dde525d5f4421da012fdf4821"
Mar 09 13:23:36 crc kubenswrapper[4723]: I0309 13:23:36.401907 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-5td4v"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.675975 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Mar 09 13:23:39 crc kubenswrapper[4723]: E0309 13:23:39.676629 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f0b3e5-b717-4ea6-ae5b-400876201699" containerName="aodh-db-sync"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.676641 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f0b3e5-b717-4ea6-ae5b-400876201699" containerName="aodh-db-sync"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.676870 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f0b3e5-b717-4ea6-ae5b-400876201699" containerName="aodh-db-sync"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.679708 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.682779 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-97c45"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.682963 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.683094 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.745141 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.828742 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-scripts\") pod \"aodh-0\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " pod="openstack/aodh-0"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.828823 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-config-data\") pod \"aodh-0\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " pod="openstack/aodh-0"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.829005 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzlfs\" (UniqueName: \"kubernetes.io/projected/0637c672-5bcf-44ec-add1-638ce6065b6e-kube-api-access-jzlfs\") pod \"aodh-0\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " pod="openstack/aodh-0"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.829046 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " pod="openstack/aodh-0"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.930629 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-scripts\") pod \"aodh-0\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " pod="openstack/aodh-0"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.930713 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-config-data\") pod \"aodh-0\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " pod="openstack/aodh-0"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.930844 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzlfs\" (UniqueName: \"kubernetes.io/projected/0637c672-5bcf-44ec-add1-638ce6065b6e-kube-api-access-jzlfs\") pod \"aodh-0\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " pod="openstack/aodh-0"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.930872 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " pod="openstack/aodh-0"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.954219 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-scripts\") pod \"aodh-0\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " pod="openstack/aodh-0"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.958167 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-config-data\") pod \"aodh-0\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " pod="openstack/aodh-0"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.962662 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " pod="openstack/aodh-0"
Mar 09 13:23:39 crc kubenswrapper[4723]: I0309 13:23:39.974524 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzlfs\" (UniqueName: \"kubernetes.io/projected/0637c672-5bcf-44ec-add1-638ce6065b6e-kube-api-access-jzlfs\") pod \"aodh-0\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " pod="openstack/aodh-0"
Mar 09 13:23:40 crc kubenswrapper[4723]: I0309 13:23:40.022534 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 09 13:23:40 crc kubenswrapper[4723]: W0309 13:23:40.604652 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0637c672_5bcf_44ec_add1_638ce6065b6e.slice/crio-510b9a27bab141e2aa97c63779d76a99eec4168d719163a076701378a4c88253 WatchSource:0}: Error finding container 510b9a27bab141e2aa97c63779d76a99eec4168d719163a076701378a4c88253: Status 404 returned error can't find the container with id 510b9a27bab141e2aa97c63779d76a99eec4168d719163a076701378a4c88253
Mar 09 13:23:40 crc kubenswrapper[4723]: I0309 13:23:40.610244 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 09 13:23:41 crc kubenswrapper[4723]: I0309 13:23:41.462538 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0637c672-5bcf-44ec-add1-638ce6065b6e","Type":"ContainerStarted","Data":"98fd34ceecb8a13f7287803e9ecac6b659250dad38768f0624b41f08d6d156c9"}
Mar 09 13:23:41 crc kubenswrapper[4723]: I0309 13:23:41.462830 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0637c672-5bcf-44ec-add1-638ce6065b6e","Type":"ContainerStarted","Data":"510b9a27bab141e2aa97c63779d76a99eec4168d719163a076701378a4c88253"}
Mar 09 13:23:42 crc kubenswrapper[4723]: I0309 13:23:42.025020 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 09 13:23:42 crc kubenswrapper[4723]: I0309 13:23:42.025068 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 09 13:23:42 crc kubenswrapper[4723]: I0309 13:23:42.679253 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:23:42 crc kubenswrapper[4723]: I0309 13:23:42.679904 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="ceilometer-central-agent" containerID="cri-o://85df1e22230c029105d30e0b99b6d8fd55c16180865e5e0cbec9110f3f0625a9" gracePeriod=30
Mar 09 13:23:42 crc kubenswrapper[4723]: I0309 13:23:42.680047 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="proxy-httpd" containerID="cri-o://4ba4539b9017d6bfe47e6cf804915e83bfa20355355863196282e7c2b529f246" gracePeriod=30
Mar 09 13:23:42 crc kubenswrapper[4723]: I0309 13:23:42.680096 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="sg-core" containerID="cri-o://fa779e47056a0ebaf99c16d74c22c99b6b8add0cec4df04e7ed1cf0c8b8ce442" gracePeriod=30
Mar 09 13:23:42 crc kubenswrapper[4723]: I0309 13:23:42.680139 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="ceilometer-notification-agent" containerID="cri-o://e2e269e516560e901d6fa603389152c0f546bda7295e7d75ae3e244622069e61" gracePeriod=30
Mar 09 13:23:42 crc kubenswrapper[4723]: I0309 13:23:42.700639 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Mar 09 13:23:43 crc kubenswrapper[4723]: I0309 13:23:43.060294 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Mar 09 13:23:43 crc kubenswrapper[4723]: I0309 13:23:43.110048 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2938063f-bc95-4b78-8636-eded365a5f2c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.2:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 13:23:43 crc kubenswrapper[4723]: I0309 13:23:43.110289 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2938063f-bc95-4b78-8636-eded365a5f2c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.2:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
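The two Startup probe failures above quote Go's standard net/http client-timeout error ("context deadline exceeded (Client.Timeout exceeded while awaiting headers)"): the probe's HTTP client gave up before the nova-api endpoint returned response headers. A sketch reproducing the same failure shape; illustrative only, the one-second timeout is an assumption rather than the prober's actual setting, and the address is taken from the log:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	// A client whose Timeout elapses before response headers arrive fails
	// with the same error text quoted in the prober output above.
	client := &http.Client{Timeout: 1 * time.Second}
	_, err := client.Get("http://10.217.1.2:8774/") // endpoint from the log; unreachable here
	fmt.Println(err)
	// e.g. Get "http://10.217.1.2:8774/": context deadline exceeded
	//      (Client.Timeout exceeded while awaiting headers)
}
```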
event={"ID":"0ec76144-c358-4a57-bf5e-b3fd37531ae7","Type":"ContainerDied","Data":"fa779e47056a0ebaf99c16d74c22c99b6b8add0cec4df04e7ed1cf0c8b8ce442"} Mar 09 13:23:43 crc kubenswrapper[4723]: I0309 13:23:43.499459 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec76144-c358-4a57-bf5e-b3fd37531ae7","Type":"ContainerDied","Data":"85df1e22230c029105d30e0b99b6d8fd55c16180865e5e0cbec9110f3f0625a9"} Mar 09 13:23:43 crc kubenswrapper[4723]: I0309 13:23:43.502321 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0637c672-5bcf-44ec-add1-638ce6065b6e","Type":"ContainerStarted","Data":"e7b95cc57d3bb1e0f5b53a049f881d06fc9bb4e3f45b3f635cbaf04da358c06c"} Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.426154 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.518089 4723 generic.go:334] "Generic (PLEG): container finished" podID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerID="e2e269e516560e901d6fa603389152c0f546bda7295e7d75ae3e244622069e61" exitCode=0 Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.518127 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec76144-c358-4a57-bf5e-b3fd37531ae7","Type":"ContainerDied","Data":"e2e269e516560e901d6fa603389152c0f546bda7295e7d75ae3e244622069e61"} Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.518151 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec76144-c358-4a57-bf5e-b3fd37531ae7","Type":"ContainerDied","Data":"e63ddc8565dfd2cec19d6aac02bbdefd0a04dad15c43bdc7d17003322a628264"} Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.518169 4723 scope.go:117] "RemoveContainer" containerID="4ba4539b9017d6bfe47e6cf804915e83bfa20355355863196282e7c2b529f246" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.518299 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.592491 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-combined-ca-bundle\") pod \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.592548 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-config-data\") pod \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.592651 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hwk4\" (UniqueName: \"kubernetes.io/projected/0ec76144-c358-4a57-bf5e-b3fd37531ae7-kube-api-access-7hwk4\") pod \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.592711 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec76144-c358-4a57-bf5e-b3fd37531ae7-run-httpd\") pod \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.592828 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-scripts\") pod \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.592861 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-sg-core-conf-yaml\") pod \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.592940 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec76144-c358-4a57-bf5e-b3fd37531ae7-log-httpd\") pod \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\" (UID: \"0ec76144-c358-4a57-bf5e-b3fd37531ae7\") " Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.593502 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ec76144-c358-4a57-bf5e-b3fd37531ae7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0ec76144-c358-4a57-bf5e-b3fd37531ae7" (UID: "0ec76144-c358-4a57-bf5e-b3fd37531ae7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.593585 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ec76144-c358-4a57-bf5e-b3fd37531ae7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0ec76144-c358-4a57-bf5e-b3fd37531ae7" (UID: "0ec76144-c358-4a57-bf5e-b3fd37531ae7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.593740 4723 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec76144-c358-4a57-bf5e-b3fd37531ae7-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.598314 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ec76144-c358-4a57-bf5e-b3fd37531ae7-kube-api-access-7hwk4" (OuterVolumeSpecName: "kube-api-access-7hwk4") pod "0ec76144-c358-4a57-bf5e-b3fd37531ae7" (UID: "0ec76144-c358-4a57-bf5e-b3fd37531ae7"). InnerVolumeSpecName "kube-api-access-7hwk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.608210 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-scripts" (OuterVolumeSpecName: "scripts") pod "0ec76144-c358-4a57-bf5e-b3fd37531ae7" (UID: "0ec76144-c358-4a57-bf5e-b3fd37531ae7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.638954 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0ec76144-c358-4a57-bf5e-b3fd37531ae7" (UID: "0ec76144-c358-4a57-bf5e-b3fd37531ae7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.696447 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hwk4\" (UniqueName: \"kubernetes.io/projected/0ec76144-c358-4a57-bf5e-b3fd37531ae7-kube-api-access-7hwk4\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.696479 4723 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec76144-c358-4a57-bf5e-b3fd37531ae7-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.696488 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.696497 4723 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.733823 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-config-data" (OuterVolumeSpecName: "config-data") pod "0ec76144-c358-4a57-bf5e-b3fd37531ae7" (UID: "0ec76144-c358-4a57-bf5e-b3fd37531ae7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.746282 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ec76144-c358-4a57-bf5e-b3fd37531ae7" (UID: "0ec76144-c358-4a57-bf5e-b3fd37531ae7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.798742 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.798774 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec76144-c358-4a57-bf5e-b3fd37531ae7-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.840823 4723 scope.go:117] "RemoveContainer" containerID="fa779e47056a0ebaf99c16d74c22c99b6b8add0cec4df04e7ed1cf0c8b8ce442" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.909053 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.921555 4723 scope.go:117] "RemoveContainer" containerID="e2e269e516560e901d6fa603389152c0f546bda7295e7d75ae3e244622069e61" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.944071 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.962108 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:44 crc kubenswrapper[4723]: E0309 13:23:44.962778 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="ceilometer-notification-agent" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.962889 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="ceilometer-notification-agent" Mar 09 13:23:44 crc kubenswrapper[4723]: E0309 13:23:44.962967 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="sg-core" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.963138 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="sg-core" Mar 09 13:23:44 crc kubenswrapper[4723]: E0309 13:23:44.963246 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="ceilometer-central-agent" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.963318 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="ceilometer-central-agent" Mar 09 13:23:44 crc kubenswrapper[4723]: E0309 13:23:44.963397 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="proxy-httpd" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.963471 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="proxy-httpd" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.963838 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="proxy-httpd" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.968540 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="ceilometer-central-agent" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.968691 4723 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="ceilometer-notification-agent" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.968827 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" containerName="sg-core" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.973267 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.981196 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.981570 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:23:44 crc kubenswrapper[4723]: I0309 13:23:44.983744 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.092841 4723 scope.go:117] "RemoveContainer" containerID="85df1e22230c029105d30e0b99b6d8fd55c16180865e5e0cbec9110f3f0625a9" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.109801 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.109935 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-scripts\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.109964 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-config-data\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.110060 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-log-httpd\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.110127 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmf5k\" (UniqueName: \"kubernetes.io/projected/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-kube-api-access-gmf5k\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.110188 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-run-httpd\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.110487 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.191104 4723 scope.go:117] "RemoveContainer" containerID="4ba4539b9017d6bfe47e6cf804915e83bfa20355355863196282e7c2b529f246" Mar 09 13:23:45 crc kubenswrapper[4723]: E0309 13:23:45.191583 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ba4539b9017d6bfe47e6cf804915e83bfa20355355863196282e7c2b529f246\": container with ID starting with 4ba4539b9017d6bfe47e6cf804915e83bfa20355355863196282e7c2b529f246 not found: ID does not exist" containerID="4ba4539b9017d6bfe47e6cf804915e83bfa20355355863196282e7c2b529f246" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.191614 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ba4539b9017d6bfe47e6cf804915e83bfa20355355863196282e7c2b529f246"} err="failed to get container status \"4ba4539b9017d6bfe47e6cf804915e83bfa20355355863196282e7c2b529f246\": rpc error: code = NotFound desc = could not find container \"4ba4539b9017d6bfe47e6cf804915e83bfa20355355863196282e7c2b529f246\": container with ID starting with 4ba4539b9017d6bfe47e6cf804915e83bfa20355355863196282e7c2b529f246 not found: ID does not exist" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.191635 4723 scope.go:117] "RemoveContainer" containerID="fa779e47056a0ebaf99c16d74c22c99b6b8add0cec4df04e7ed1cf0c8b8ce442" Mar 09 13:23:45 crc kubenswrapper[4723]: E0309 13:23:45.194178 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa779e47056a0ebaf99c16d74c22c99b6b8add0cec4df04e7ed1cf0c8b8ce442\": container with ID starting with fa779e47056a0ebaf99c16d74c22c99b6b8add0cec4df04e7ed1cf0c8b8ce442 not found: ID does not exist" containerID="fa779e47056a0ebaf99c16d74c22c99b6b8add0cec4df04e7ed1cf0c8b8ce442" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.194252 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa779e47056a0ebaf99c16d74c22c99b6b8add0cec4df04e7ed1cf0c8b8ce442"} err="failed to get container status \"fa779e47056a0ebaf99c16d74c22c99b6b8add0cec4df04e7ed1cf0c8b8ce442\": rpc error: code = NotFound desc = could not find container \"fa779e47056a0ebaf99c16d74c22c99b6b8add0cec4df04e7ed1cf0c8b8ce442\": container with ID starting with fa779e47056a0ebaf99c16d74c22c99b6b8add0cec4df04e7ed1cf0c8b8ce442 not found: ID does not exist" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.194286 4723 scope.go:117] "RemoveContainer" containerID="e2e269e516560e901d6fa603389152c0f546bda7295e7d75ae3e244622069e61" Mar 09 13:23:45 crc kubenswrapper[4723]: E0309 13:23:45.197680 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2e269e516560e901d6fa603389152c0f546bda7295e7d75ae3e244622069e61\": container with ID starting with e2e269e516560e901d6fa603389152c0f546bda7295e7d75ae3e244622069e61 not found: ID does not exist" containerID="e2e269e516560e901d6fa603389152c0f546bda7295e7d75ae3e244622069e61" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.197720 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2e269e516560e901d6fa603389152c0f546bda7295e7d75ae3e244622069e61"} err="failed to get 
container status \"e2e269e516560e901d6fa603389152c0f546bda7295e7d75ae3e244622069e61\": rpc error: code = NotFound desc = could not find container \"e2e269e516560e901d6fa603389152c0f546bda7295e7d75ae3e244622069e61\": container with ID starting with e2e269e516560e901d6fa603389152c0f546bda7295e7d75ae3e244622069e61 not found: ID does not exist" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.197743 4723 scope.go:117] "RemoveContainer" containerID="85df1e22230c029105d30e0b99b6d8fd55c16180865e5e0cbec9110f3f0625a9" Mar 09 13:23:45 crc kubenswrapper[4723]: E0309 13:23:45.202027 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85df1e22230c029105d30e0b99b6d8fd55c16180865e5e0cbec9110f3f0625a9\": container with ID starting with 85df1e22230c029105d30e0b99b6d8fd55c16180865e5e0cbec9110f3f0625a9 not found: ID does not exist" containerID="85df1e22230c029105d30e0b99b6d8fd55c16180865e5e0cbec9110f3f0625a9" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.202066 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85df1e22230c029105d30e0b99b6d8fd55c16180865e5e0cbec9110f3f0625a9"} err="failed to get container status \"85df1e22230c029105d30e0b99b6d8fd55c16180865e5e0cbec9110f3f0625a9\": rpc error: code = NotFound desc = could not find container \"85df1e22230c029105d30e0b99b6d8fd55c16180865e5e0cbec9110f3f0625a9\": container with ID starting with 85df1e22230c029105d30e0b99b6d8fd55c16180865e5e0cbec9110f3f0625a9 not found: ID does not exist" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.212163 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.212220 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-scripts\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.212238 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-config-data\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.212292 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-log-httpd\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.212327 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmf5k\" (UniqueName: \"kubernetes.io/projected/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-kube-api-access-gmf5k\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.212360 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-run-httpd\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.212436 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.216657 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-log-httpd\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.220085 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-run-httpd\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.221812 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.224793 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-scripts\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.225833 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.226020 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-config-data\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.244759 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmf5k\" (UniqueName: \"kubernetes.io/projected/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-kube-api-access-gmf5k\") pod \"ceilometer-0\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.292773 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.392976 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.555000 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0637c672-5bcf-44ec-add1-638ce6065b6e","Type":"ContainerStarted","Data":"20562d9be0c7cdb6d497fe8af4504e9f7d9deb069057276c233776b1e8afa74c"} Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.674025 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:45 crc kubenswrapper[4723]: I0309 13:23:45.871344 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:46 crc kubenswrapper[4723]: I0309 13:23:46.568076 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba","Type":"ContainerStarted","Data":"15f9f01ad715422d3181f598dc4f1526a0edbbd1336af3b5f2cdb845b6bb4b0d"} Mar 09 13:23:46 crc kubenswrapper[4723]: I0309 13:23:46.571531 4723 generic.go:334] "Generic (PLEG): container finished" podID="435ec576-f731-4d62-9eeb-804d4ae4f52a" containerID="c3ce48aa40ab0a7d443453576bc9289d614a2e1a39312338de9fa7df423a9349" exitCode=137 Mar 09 13:23:46 crc kubenswrapper[4723]: I0309 13:23:46.571565 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"435ec576-f731-4d62-9eeb-804d4ae4f52a","Type":"ContainerDied","Data":"c3ce48aa40ab0a7d443453576bc9289d614a2e1a39312338de9fa7df423a9349"} Mar 09 13:23:46 crc kubenswrapper[4723]: I0309 13:23:46.924662 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ec76144-c358-4a57-bf5e-b3fd37531ae7" path="/var/lib/kubelet/pods/0ec76144-c358-4a57-bf5e-b3fd37531ae7/volumes" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.036979 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.163506 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/435ec576-f731-4d62-9eeb-804d4ae4f52a-config-data\") pod \"435ec576-f731-4d62-9eeb-804d4ae4f52a\" (UID: \"435ec576-f731-4d62-9eeb-804d4ae4f52a\") " Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.163665 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/435ec576-f731-4d62-9eeb-804d4ae4f52a-combined-ca-bundle\") pod \"435ec576-f731-4d62-9eeb-804d4ae4f52a\" (UID: \"435ec576-f731-4d62-9eeb-804d4ae4f52a\") " Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.163830 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpj8x\" (UniqueName: \"kubernetes.io/projected/435ec576-f731-4d62-9eeb-804d4ae4f52a-kube-api-access-mpj8x\") pod \"435ec576-f731-4d62-9eeb-804d4ae4f52a\" (UID: \"435ec576-f731-4d62-9eeb-804d4ae4f52a\") " Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.170601 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/435ec576-f731-4d62-9eeb-804d4ae4f52a-kube-api-access-mpj8x" (OuterVolumeSpecName: "kube-api-access-mpj8x") pod "435ec576-f731-4d62-9eeb-804d4ae4f52a" (UID: "435ec576-f731-4d62-9eeb-804d4ae4f52a"). InnerVolumeSpecName "kube-api-access-mpj8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.209538 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/435ec576-f731-4d62-9eeb-804d4ae4f52a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "435ec576-f731-4d62-9eeb-804d4ae4f52a" (UID: "435ec576-f731-4d62-9eeb-804d4ae4f52a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.224082 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/435ec576-f731-4d62-9eeb-804d4ae4f52a-config-data" (OuterVolumeSpecName: "config-data") pod "435ec576-f731-4d62-9eeb-804d4ae4f52a" (UID: "435ec576-f731-4d62-9eeb-804d4ae4f52a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.266972 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpj8x\" (UniqueName: \"kubernetes.io/projected/435ec576-f731-4d62-9eeb-804d4ae4f52a-kube-api-access-mpj8x\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.267022 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/435ec576-f731-4d62-9eeb-804d4ae4f52a-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.267037 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/435ec576-f731-4d62-9eeb-804d4ae4f52a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.460784 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.577844 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-config-data\") pod \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\" (UID: \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\") " Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.577985 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-combined-ca-bundle\") pod \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\" (UID: \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\") " Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.578478 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l9gc\" (UniqueName: \"kubernetes.io/projected/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-kube-api-access-7l9gc\") pod \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\" (UID: \"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4\") " Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.583233 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-kube-api-access-7l9gc" (OuterVolumeSpecName: "kube-api-access-7l9gc") pod "5014a76c-6fd4-44e5-9151-f07dcfb5f1d4" (UID: "5014a76c-6fd4-44e5-9151-f07dcfb5f1d4"). InnerVolumeSpecName "kube-api-access-7l9gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.588789 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba","Type":"ContainerStarted","Data":"c1d5076ef09bea7125d21403fdc851853ee1d67fa9518a737b9979f050819662"} Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.591295 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"435ec576-f731-4d62-9eeb-804d4ae4f52a","Type":"ContainerDied","Data":"223b764c8cfaddd4cf5cc631cf01997c2aeeb8b295280359c9669a0b5f1a7ea2"} Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.591367 4723 scope.go:117] "RemoveContainer" containerID="c3ce48aa40ab0a7d443453576bc9289d614a2e1a39312338de9fa7df423a9349" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.591304 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.597495 4723 generic.go:334] "Generic (PLEG): container finished" podID="5014a76c-6fd4-44e5-9151-f07dcfb5f1d4" containerID="d76142d3b9987ec272ef4164909218ec50cdf18d8bd433d5347585e908c93c1b" exitCode=137 Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.597592 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.597594 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4","Type":"ContainerDied","Data":"d76142d3b9987ec272ef4164909218ec50cdf18d8bd433d5347585e908c93c1b"} Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.597829 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5014a76c-6fd4-44e5-9151-f07dcfb5f1d4","Type":"ContainerDied","Data":"9eacfab77052b22c148588f266a26da7a0b98dac07572e911a0c5740f5fe6f56"} Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.598080 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.603819 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0637c672-5bcf-44ec-add1-638ce6065b6e","Type":"ContainerStarted","Data":"976238b04ee9ccad82abc3e653cc56b6e1908493cb78efe9690bc684d20643f8"} Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.604059 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-evaluator" containerID="cri-o://e7b95cc57d3bb1e0f5b53a049f881d06fc9bb4e3f45b3f635cbaf04da358c06c" gracePeriod=30 Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.603895 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-api" containerID="cri-o://98fd34ceecb8a13f7287803e9ecac6b659250dad38768f0624b41f08d6d156c9" gracePeriod=30 Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.603985 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-listener" containerID="cri-o://976238b04ee9ccad82abc3e653cc56b6e1908493cb78efe9690bc684d20643f8" gracePeriod=30 Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.603962 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-notifier" containerID="cri-o://20562d9be0c7cdb6d497fe8af4504e9f7d9deb069057276c233776b1e8afa74c" gracePeriod=30 Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.607803 4723 generic.go:334] "Generic (PLEG): container finished" podID="f8105baf-a986-4aba-a114-10e0b997f27c" containerID="4e920ad0222517ce6f2b8194793fb7169ed22f3ef4557f83f51a2df34eb61182" exitCode=137 Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.607841 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8105baf-a986-4aba-a114-10e0b997f27c","Type":"ContainerDied","Data":"4e920ad0222517ce6f2b8194793fb7169ed22f3ef4557f83f51a2df34eb61182"} Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.607928 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.614495 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-config-data" (OuterVolumeSpecName: "config-data") pod "5014a76c-6fd4-44e5-9151-f07dcfb5f1d4" (UID: "5014a76c-6fd4-44e5-9151-f07dcfb5f1d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.633847 4723 scope.go:117] "RemoveContainer" containerID="d76142d3b9987ec272ef4164909218ec50cdf18d8bd433d5347585e908c93c1b" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.650973 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5014a76c-6fd4-44e5-9151-f07dcfb5f1d4" (UID: "5014a76c-6fd4-44e5-9151-f07dcfb5f1d4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.681335 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8105baf-a986-4aba-a114-10e0b997f27c-config-data\") pod \"f8105baf-a986-4aba-a114-10e0b997f27c\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.681489 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8105baf-a986-4aba-a114-10e0b997f27c-logs\") pod \"f8105baf-a986-4aba-a114-10e0b997f27c\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.681612 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8105baf-a986-4aba-a114-10e0b997f27c-combined-ca-bundle\") pod \"f8105baf-a986-4aba-a114-10e0b997f27c\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.681634 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st24r\" (UniqueName: \"kubernetes.io/projected/f8105baf-a986-4aba-a114-10e0b997f27c-kube-api-access-st24r\") pod \"f8105baf-a986-4aba-a114-10e0b997f27c\" (UID: \"f8105baf-a986-4aba-a114-10e0b997f27c\") " Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.682152 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.682168 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.682181 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7l9gc\" (UniqueName: \"kubernetes.io/projected/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4-kube-api-access-7l9gc\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.686038 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8105baf-a986-4aba-a114-10e0b997f27c-logs" (OuterVolumeSpecName: "logs") pod "f8105baf-a986-4aba-a114-10e0b997f27c" (UID: "f8105baf-a986-4aba-a114-10e0b997f27c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.701265 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8105baf-a986-4aba-a114-10e0b997f27c-kube-api-access-st24r" (OuterVolumeSpecName: "kube-api-access-st24r") pod "f8105baf-a986-4aba-a114-10e0b997f27c" (UID: "f8105baf-a986-4aba-a114-10e0b997f27c"). InnerVolumeSpecName "kube-api-access-st24r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.708064 4723 scope.go:117] "RemoveContainer" containerID="d76142d3b9987ec272ef4164909218ec50cdf18d8bd433d5347585e908c93c1b" Mar 09 13:23:47 crc kubenswrapper[4723]: E0309 13:23:47.710918 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d76142d3b9987ec272ef4164909218ec50cdf18d8bd433d5347585e908c93c1b\": container with ID starting with d76142d3b9987ec272ef4164909218ec50cdf18d8bd433d5347585e908c93c1b not found: ID does not exist" containerID="d76142d3b9987ec272ef4164909218ec50cdf18d8bd433d5347585e908c93c1b" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.710978 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76142d3b9987ec272ef4164909218ec50cdf18d8bd433d5347585e908c93c1b"} err="failed to get container status \"d76142d3b9987ec272ef4164909218ec50cdf18d8bd433d5347585e908c93c1b\": rpc error: code = NotFound desc = could not find container \"d76142d3b9987ec272ef4164909218ec50cdf18d8bd433d5347585e908c93c1b\": container with ID starting with d76142d3b9987ec272ef4164909218ec50cdf18d8bd433d5347585e908c93c1b not found: ID does not exist" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.711006 4723 scope.go:117] "RemoveContainer" containerID="4e920ad0222517ce6f2b8194793fb7169ed22f3ef4557f83f51a2df34eb61182" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.730691 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.764609 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.782308 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8105baf-a986-4aba-a114-10e0b997f27c-config-data" (OuterVolumeSpecName: "config-data") pod "f8105baf-a986-4aba-a114-10e0b997f27c" (UID: "f8105baf-a986-4aba-a114-10e0b997f27c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.784393 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8105baf-a986-4aba-a114-10e0b997f27c-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.784432 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st24r\" (UniqueName: \"kubernetes.io/projected/f8105baf-a986-4aba-a114-10e0b997f27c-kube-api-access-st24r\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.784444 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8105baf-a986-4aba-a114-10e0b997f27c-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.793348 4723 scope.go:117] "RemoveContainer" containerID="5cc5b95b52319cf3fad4e4d63d8270110843673ba6b1cc1c712af8d9c5935160" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.793633 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:23:47 crc kubenswrapper[4723]: E0309 13:23:47.794200 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435ec576-f731-4d62-9eeb-804d4ae4f52a" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.794220 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="435ec576-f731-4d62-9eeb-804d4ae4f52a" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 13:23:47 crc kubenswrapper[4723]: E0309 13:23:47.794243 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8105baf-a986-4aba-a114-10e0b997f27c" containerName="nova-metadata-log" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.794254 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8105baf-a986-4aba-a114-10e0b997f27c" containerName="nova-metadata-log" Mar 09 13:23:47 crc kubenswrapper[4723]: E0309 13:23:47.794277 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8105baf-a986-4aba-a114-10e0b997f27c" containerName="nova-metadata-metadata" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.794285 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8105baf-a986-4aba-a114-10e0b997f27c" containerName="nova-metadata-metadata" Mar 09 13:23:47 crc kubenswrapper[4723]: E0309 13:23:47.794320 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5014a76c-6fd4-44e5-9151-f07dcfb5f1d4" containerName="nova-scheduler-scheduler" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.794328 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="5014a76c-6fd4-44e5-9151-f07dcfb5f1d4" containerName="nova-scheduler-scheduler" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.794562 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8105baf-a986-4aba-a114-10e0b997f27c" containerName="nova-metadata-metadata" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.794587 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="5014a76c-6fd4-44e5-9151-f07dcfb5f1d4" containerName="nova-scheduler-scheduler" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.794605 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8105baf-a986-4aba-a114-10e0b997f27c" containerName="nova-metadata-log" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.794617 4723 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="435ec576-f731-4d62-9eeb-804d4ae4f52a" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.796702 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.636577615 podStartE2EDuration="8.796682753s" podCreationTimestamp="2026-03-09 13:23:39 +0000 UTC" firstStartedPulling="2026-03-09 13:23:40.612961917 +0000 UTC m=+1494.627429457" lastFinishedPulling="2026-03-09 13:23:46.773067055 +0000 UTC m=+1500.787534595" observedRunningTime="2026-03-09 13:23:47.690291569 +0000 UTC m=+1501.704759129" watchObservedRunningTime="2026-03-09 13:23:47.796682753 +0000 UTC m=+1501.811150293" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.801626 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.814232 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.814459 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.816656 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8105baf-a986-4aba-a114-10e0b997f27c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8105baf-a986-4aba-a114-10e0b997f27c" (UID: "f8105baf-a986-4aba-a114-10e0b997f27c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.823998 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.857668 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.886309 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bd7e16-39c2-4847-8f6f-523ade24e8cb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"44bd7e16-39c2-4847-8f6f-523ade24e8cb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.886345 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8dng\" (UniqueName: \"kubernetes.io/projected/44bd7e16-39c2-4847-8f6f-523ade24e8cb-kube-api-access-j8dng\") pod \"nova-cell1-novncproxy-0\" (UID: \"44bd7e16-39c2-4847-8f6f-523ade24e8cb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.886400 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44bd7e16-39c2-4847-8f6f-523ade24e8cb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"44bd7e16-39c2-4847-8f6f-523ade24e8cb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.886487 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44bd7e16-39c2-4847-8f6f-523ade24e8cb-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"44bd7e16-39c2-4847-8f6f-523ade24e8cb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.886505 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bd7e16-39c2-4847-8f6f-523ade24e8cb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"44bd7e16-39c2-4847-8f6f-523ade24e8cb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:47 crc kubenswrapper[4723]: I0309 13:23:47.886659 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8105baf-a986-4aba-a114-10e0b997f27c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:47.999844 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.008509 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.012939 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44bd7e16-39c2-4847-8f6f-523ade24e8cb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"44bd7e16-39c2-4847-8f6f-523ade24e8cb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.013080 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44bd7e16-39c2-4847-8f6f-523ade24e8cb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"44bd7e16-39c2-4847-8f6f-523ade24e8cb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.013108 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bd7e16-39c2-4847-8f6f-523ade24e8cb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"44bd7e16-39c2-4847-8f6f-523ade24e8cb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.013256 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bd7e16-39c2-4847-8f6f-523ade24e8cb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"44bd7e16-39c2-4847-8f6f-523ade24e8cb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.013283 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8dng\" (UniqueName: \"kubernetes.io/projected/44bd7e16-39c2-4847-8f6f-523ade24e8cb-kube-api-access-j8dng\") pod \"nova-cell1-novncproxy-0\" (UID: \"44bd7e16-39c2-4847-8f6f-523ade24e8cb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.022412 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bd7e16-39c2-4847-8f6f-523ade24e8cb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"44bd7e16-39c2-4847-8f6f-523ade24e8cb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.022542 4723 operation_generator.go:637] "MountVolume.SetUp 
Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.029475 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/44bd7e16-39c2-4847-8f6f-523ade24e8cb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"44bd7e16-39c2-4847-8f6f-523ade24e8cb\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.062639 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44bd7e16-39c2-4847-8f6f-523ade24e8cb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"44bd7e16-39c2-4847-8f6f-523ade24e8cb\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.063807 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8dng\" (UniqueName: \"kubernetes.io/projected/44bd7e16-39c2-4847-8f6f-523ade24e8cb-kube-api-access-j8dng\") pod \"nova-cell1-novncproxy-0\" (UID: \"44bd7e16-39c2-4847-8f6f-523ade24e8cb\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.113216 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.146428 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.152490 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.179626 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.214853 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.228401 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.244422 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.255995 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.256604 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5e88473-555f-408b-917e-997969b8f48d-config-data\") pod \"nova-scheduler-0\" (UID: \"c5e88473-555f-408b-917e-997969b8f48d\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.256761 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e88473-555f-408b-917e-997969b8f48d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c5e88473-555f-408b-917e-997969b8f48d\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.257197 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz8c9\" (UniqueName: \"kubernetes.io/projected/c5e88473-555f-408b-917e-997969b8f48d-kube-api-access-cz8c9\") pod \"nova-scheduler-0\" (UID: \"c5e88473-555f-408b-917e-997969b8f48d\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.259769 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.260300 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.260575 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.260661 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.359575 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.359851 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-logs\") pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.359900 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz8c9\" (UniqueName: \"kubernetes.io/projected/c5e88473-555f-408b-917e-997969b8f48d-kube-api-access-cz8c9\") pod \"nova-scheduler-0\" (UID: \"c5e88473-555f-408b-917e-997969b8f48d\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.359946 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-config-data\") pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.360040 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c5e88473-555f-408b-917e-997969b8f48d-config-data\") pod \"nova-scheduler-0\" (UID: \"c5e88473-555f-408b-917e-997969b8f48d\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.360076 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn66v\" (UniqueName: \"kubernetes.io/projected/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-kube-api-access-wn66v\") pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.360094 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e88473-555f-408b-917e-997969b8f48d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c5e88473-555f-408b-917e-997969b8f48d\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.360180 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.364390 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e88473-555f-408b-917e-997969b8f48d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c5e88473-555f-408b-917e-997969b8f48d\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.369473 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5e88473-555f-408b-917e-997969b8f48d-config-data\") pod \"nova-scheduler-0\" (UID: \"c5e88473-555f-408b-917e-997969b8f48d\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.377500 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz8c9\" (UniqueName: \"kubernetes.io/projected/c5e88473-555f-408b-917e-997969b8f48d-kube-api-access-cz8c9\") pod \"nova-scheduler-0\" (UID: \"c5e88473-555f-408b-917e-997969b8f48d\") " pod="openstack/nova-scheduler-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.392957 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.462930 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn66v\" (UniqueName: \"kubernetes.io/projected/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-kube-api-access-wn66v\") pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.463557 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.463665 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.463726 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-logs\") pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.463810 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-config-data\") pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.467091 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-logs\") pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.489391 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.490064 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.490775 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-config-data\") pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.494103 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn66v\" (UniqueName: \"kubernetes.io/projected/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-kube-api-access-wn66v\") 
pod \"nova-metadata-0\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.632229 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba","Type":"ContainerStarted","Data":"f80feb668af0d6e7e3d7c89ca7582277f191814e9e1203e4f2f140a7de0dde27"} Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.638239 4723 generic.go:334] "Generic (PLEG): container finished" podID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerID="20562d9be0c7cdb6d497fe8af4504e9f7d9deb069057276c233776b1e8afa74c" exitCode=0 Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.638272 4723 generic.go:334] "Generic (PLEG): container finished" podID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerID="e7b95cc57d3bb1e0f5b53a049f881d06fc9bb4e3f45b3f635cbaf04da358c06c" exitCode=0 Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.638281 4723 generic.go:334] "Generic (PLEG): container finished" podID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerID="98fd34ceecb8a13f7287803e9ecac6b659250dad38768f0624b41f08d6d156c9" exitCode=0 Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.638333 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0637c672-5bcf-44ec-add1-638ce6065b6e","Type":"ContainerDied","Data":"20562d9be0c7cdb6d497fe8af4504e9f7d9deb069057276c233776b1e8afa74c"} Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.638358 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0637c672-5bcf-44ec-add1-638ce6065b6e","Type":"ContainerDied","Data":"e7b95cc57d3bb1e0f5b53a049f881d06fc9bb4e3f45b3f635cbaf04da358c06c"} Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.638372 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0637c672-5bcf-44ec-add1-638ce6065b6e","Type":"ContainerDied","Data":"98fd34ceecb8a13f7287803e9ecac6b659250dad38768f0624b41f08d6d156c9"} Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.728168 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.800472 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.904383 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="435ec576-f731-4d62-9eeb-804d4ae4f52a" path="/var/lib/kubelet/pods/435ec576-f731-4d62-9eeb-804d4ae4f52a/volumes" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.909184 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5014a76c-6fd4-44e5-9151-f07dcfb5f1d4" path="/var/lib/kubelet/pods/5014a76c-6fd4-44e5-9151-f07dcfb5f1d4/volumes" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.909879 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8105baf-a986-4aba-a114-10e0b997f27c" path="/var/lib/kubelet/pods/f8105baf-a986-4aba-a114-10e0b997f27c/volumes" Mar 09 13:23:48 crc kubenswrapper[4723]: I0309 13:23:48.954309 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:23:49 crc kubenswrapper[4723]: I0309 13:23:49.334928 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:23:49 crc kubenswrapper[4723]: W0309 13:23:49.364018 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaddb12d_2b02_4f85_b253_8b9ce0c6ed27.slice/crio-d51c5edb5031ed5be2592673e3a82fd6c8a22f29cbbd2817e76467620b62b70f WatchSource:0}: Error finding container d51c5edb5031ed5be2592673e3a82fd6c8a22f29cbbd2817e76467620b62b70f: Status 404 returned error can't find the container with id d51c5edb5031ed5be2592673e3a82fd6c8a22f29cbbd2817e76467620b62b70f Mar 09 13:23:49 crc kubenswrapper[4723]: I0309 13:23:49.692312 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"faddb12d-2b02-4f85-b253-8b9ce0c6ed27","Type":"ContainerStarted","Data":"9bc78ed950342851717b1a9f23160704ca61116e0e5760f5073bcea572ce0fc7"} Mar 09 13:23:49 crc kubenswrapper[4723]: I0309 13:23:49.692648 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"faddb12d-2b02-4f85-b253-8b9ce0c6ed27","Type":"ContainerStarted","Data":"d51c5edb5031ed5be2592673e3a82fd6c8a22f29cbbd2817e76467620b62b70f"} Mar 09 13:23:49 crc kubenswrapper[4723]: I0309 13:23:49.698350 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"44bd7e16-39c2-4847-8f6f-523ade24e8cb","Type":"ContainerStarted","Data":"14ecc59ec1a44ecb84c4783bd1fd7ce8142e2bd39e1d0531430fb38f42db1346"} Mar 09 13:23:49 crc kubenswrapper[4723]: I0309 13:23:49.698408 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"44bd7e16-39c2-4847-8f6f-523ade24e8cb","Type":"ContainerStarted","Data":"63574dae1c3d27f9c27ac4f2bbe3e963154fc8be857a18c53ece78ade02c9a73"} Mar 09 13:23:49 crc kubenswrapper[4723]: I0309 13:23:49.718224 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba","Type":"ContainerStarted","Data":"078c49fcd767f1744dd6c1805775876365c8a7e05ea75dcbceb40feaa278e0ae"} Mar 09 13:23:49 crc kubenswrapper[4723]: I0309 13:23:49.725087 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"c5e88473-555f-408b-917e-997969b8f48d","Type":"ContainerStarted","Data":"9331bf85019f77dffb9d38a982fcb9c3ea17697ba109e52e788c27965e489f54"} Mar 09 13:23:49 crc kubenswrapper[4723]: I0309 13:23:49.725131 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c5e88473-555f-408b-917e-997969b8f48d","Type":"ContainerStarted","Data":"4a9adaac78f885646800e3743affbcb8fcaf9186a08a923402a324f6bfdf8217"} Mar 09 13:23:49 crc kubenswrapper[4723]: I0309 13:23:49.727045 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.727022286 podStartE2EDuration="2.727022286s" podCreationTimestamp="2026-03-09 13:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:49.714364102 +0000 UTC m=+1503.728831642" watchObservedRunningTime="2026-03-09 13:23:49.727022286 +0000 UTC m=+1503.741489826" Mar 09 13:23:49 crc kubenswrapper[4723]: I0309 13:23:49.753931 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.753913205 podStartE2EDuration="2.753913205s" podCreationTimestamp="2026-03-09 13:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:49.750298139 +0000 UTC m=+1503.764765679" watchObservedRunningTime="2026-03-09 13:23:49.753913205 +0000 UTC m=+1503.768380745" Mar 09 13:23:50 crc kubenswrapper[4723]: I0309 13:23:50.737623 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"faddb12d-2b02-4f85-b253-8b9ce0c6ed27","Type":"ContainerStarted","Data":"037f538246ef982dda1da2494290f1ca3b27164494789f1720bebb5084dd4153"} Mar 09 13:23:50 crc kubenswrapper[4723]: I0309 13:23:50.764626 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.764602331 podStartE2EDuration="2.764602331s" podCreationTimestamp="2026-03-09 13:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:50.754805213 +0000 UTC m=+1504.769272753" watchObservedRunningTime="2026-03-09 13:23:50.764602331 +0000 UTC m=+1504.779069871" Mar 09 13:23:52 crc kubenswrapper[4723]: I0309 13:23:52.028526 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 13:23:52 crc kubenswrapper[4723]: I0309 13:23:52.029342 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 13:23:52 crc kubenswrapper[4723]: I0309 13:23:52.033146 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 13:23:52 crc kubenswrapper[4723]: I0309 13:23:52.033498 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 13:23:52 crc kubenswrapper[4723]: I0309 13:23:52.781372 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="ceilometer-central-agent" containerID="cri-o://c1d5076ef09bea7125d21403fdc851853ee1d67fa9518a737b9979f050819662" gracePeriod=30 Mar 09 13:23:52 crc kubenswrapper[4723]: I0309 13:23:52.781947 4723 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba","Type":"ContainerStarted","Data":"5ef91a72d4b940ecf1416c88fcb7be295f770c1a6e8f53f10ce0ae54b4425ca3"} Mar 09 13:23:52 crc kubenswrapper[4723]: I0309 13:23:52.781985 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 13:23:52 crc kubenswrapper[4723]: I0309 13:23:52.782013 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:23:52 crc kubenswrapper[4723]: I0309 13:23:52.782419 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="proxy-httpd" containerID="cri-o://5ef91a72d4b940ecf1416c88fcb7be295f770c1a6e8f53f10ce0ae54b4425ca3" gracePeriod=30 Mar 09 13:23:52 crc kubenswrapper[4723]: I0309 13:23:52.782484 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="sg-core" containerID="cri-o://078c49fcd767f1744dd6c1805775876365c8a7e05ea75dcbceb40feaa278e0ae" gracePeriod=30 Mar 09 13:23:52 crc kubenswrapper[4723]: I0309 13:23:52.782537 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="ceilometer-notification-agent" containerID="cri-o://f80feb668af0d6e7e3d7c89ca7582277f191814e9e1203e4f2f140a7de0dde27" gracePeriod=30 Mar 09 13:23:52 crc kubenswrapper[4723]: I0309 13:23:52.786791 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 13:23:52 crc kubenswrapper[4723]: I0309 13:23:52.810771 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.410485325 podStartE2EDuration="8.810752648s" podCreationTimestamp="2026-03-09 13:23:44 +0000 UTC" firstStartedPulling="2026-03-09 13:23:45.887033274 +0000 UTC m=+1499.901500804" lastFinishedPulling="2026-03-09 13:23:51.287300587 +0000 UTC m=+1505.301768127" observedRunningTime="2026-03-09 13:23:52.802942422 +0000 UTC m=+1506.817409962" watchObservedRunningTime="2026-03-09 13:23:52.810752648 +0000 UTC m=+1506.825220188" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.022992 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-kng4m"] Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.037408 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.056284 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-kng4m"] Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.097303 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.097345 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg288\" (UniqueName: \"kubernetes.io/projected/e1247b23-18d7-4343-90cb-e35826999ba9-kube-api-access-jg288\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.097374 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.097402 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.097422 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-config\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.097594 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.199056 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.199137 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.199160 4723 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jg288\" (UniqueName: \"kubernetes.io/projected/e1247b23-18d7-4343-90cb-e35826999ba9-kube-api-access-jg288\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.199186 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.199214 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.199234 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-config\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.200026 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.200092 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-config\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.200615 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.200831 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.201140 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.236428 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg288\" (UniqueName: 
\"kubernetes.io/projected/e1247b23-18d7-4343-90cb-e35826999ba9-kube-api-access-jg288\") pod \"dnsmasq-dns-79b5d74c8c-kng4m\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.260419 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.393292 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.415736 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.729766 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.730112 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.802534 4723 generic.go:334] "Generic (PLEG): container finished" podID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerID="5ef91a72d4b940ecf1416c88fcb7be295f770c1a6e8f53f10ce0ae54b4425ca3" exitCode=0 Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.802576 4723 generic.go:334] "Generic (PLEG): container finished" podID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerID="078c49fcd767f1744dd6c1805775876365c8a7e05ea75dcbceb40feaa278e0ae" exitCode=2 Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.802588 4723 generic.go:334] "Generic (PLEG): container finished" podID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerID="f80feb668af0d6e7e3d7c89ca7582277f191814e9e1203e4f2f140a7de0dde27" exitCode=0 Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.803068 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba","Type":"ContainerDied","Data":"5ef91a72d4b940ecf1416c88fcb7be295f770c1a6e8f53f10ce0ae54b4425ca3"} Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.803119 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba","Type":"ContainerDied","Data":"078c49fcd767f1744dd6c1805775876365c8a7e05ea75dcbceb40feaa278e0ae"} Mar 09 13:23:53 crc kubenswrapper[4723]: I0309 13:23:53.803131 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba","Type":"ContainerDied","Data":"f80feb668af0d6e7e3d7c89ca7582277f191814e9e1203e4f2f140a7de0dde27"} Mar 09 13:23:54 crc kubenswrapper[4723]: I0309 13:23:54.074090 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-kng4m"] Mar 09 13:23:54 crc kubenswrapper[4723]: I0309 13:23:54.814398 4723 generic.go:334] "Generic (PLEG): container finished" podID="e1247b23-18d7-4343-90cb-e35826999ba9" containerID="4149c338266a383a890ee2025b6f783bca7bdb9600bb31dc56c9dd1e756f1dc5" exitCode=0 Mar 09 13:23:54 crc kubenswrapper[4723]: I0309 13:23:54.814505 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" event={"ID":"e1247b23-18d7-4343-90cb-e35826999ba9","Type":"ContainerDied","Data":"4149c338266a383a890ee2025b6f783bca7bdb9600bb31dc56c9dd1e756f1dc5"} Mar 09 13:23:54 crc kubenswrapper[4723]: I0309 13:23:54.814734 4723 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" event={"ID":"e1247b23-18d7-4343-90cb-e35826999ba9","Type":"ContainerStarted","Data":"e6c59a0cf90fb6ca4dffd5aa650f7f80dcdb7d95bd4a19de6242e9103210d852"} Mar 09 13:23:55 crc kubenswrapper[4723]: I0309 13:23:55.785822 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:23:55 crc kubenswrapper[4723]: I0309 13:23:55.843348 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" event={"ID":"e1247b23-18d7-4343-90cb-e35826999ba9","Type":"ContainerStarted","Data":"a148f889e50bfd745321c33011dbca488fa39edc0bcdad45045f6df087a6b45d"} Mar 09 13:23:55 crc kubenswrapper[4723]: I0309 13:23:55.843618 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:23:55 crc kubenswrapper[4723]: I0309 13:23:55.852540 4723 generic.go:334] "Generic (PLEG): container finished" podID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerID="c1d5076ef09bea7125d21403fdc851853ee1d67fa9518a737b9979f050819662" exitCode=0 Mar 09 13:23:55 crc kubenswrapper[4723]: I0309 13:23:55.852949 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2938063f-bc95-4b78-8636-eded365a5f2c" containerName="nova-api-log" containerID="cri-o://18d1cd29d9a9d38d973f5f28934391a8eb60e638e99dc741611feb5303a88dda" gracePeriod=30 Mar 09 13:23:55 crc kubenswrapper[4723]: I0309 13:23:55.852995 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba","Type":"ContainerDied","Data":"c1d5076ef09bea7125d21403fdc851853ee1d67fa9518a737b9979f050819662"} Mar 09 13:23:55 crc kubenswrapper[4723]: I0309 13:23:55.853032 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2938063f-bc95-4b78-8636-eded365a5f2c" containerName="nova-api-api" containerID="cri-o://2b604c7c44840f5d4b27142f76394a30a5bcc51b7c6263488e159a907d976abd" gracePeriod=30 Mar 09 13:23:55 crc kubenswrapper[4723]: I0309 13:23:55.893445 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" podStartSLOduration=3.893427001 podStartE2EDuration="3.893427001s" podCreationTimestamp="2026-03-09 13:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:55.877510502 +0000 UTC m=+1509.891978042" watchObservedRunningTime="2026-03-09 13:23:55.893427001 +0000 UTC m=+1509.907894541" Mar 09 13:23:55 crc kubenswrapper[4723]: I0309 13:23:55.948604 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.065076 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-combined-ca-bundle\") pod \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.065390 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-scripts\") pod \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.065505 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-config-data\") pod \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.065587 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-log-httpd\") pod \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.065732 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-sg-core-conf-yaml\") pod \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.065858 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-run-httpd\") pod \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.066073 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmf5k\" (UniqueName: \"kubernetes.io/projected/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-kube-api-access-gmf5k\") pod \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\" (UID: \"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba\") " Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.066892 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" (UID: "1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.068204 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" (UID: "1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.073294 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-scripts" (OuterVolumeSpecName: "scripts") pod "1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" (UID: "1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.087195 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-kube-api-access-gmf5k" (OuterVolumeSpecName: "kube-api-access-gmf5k") pod "1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" (UID: "1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba"). InnerVolumeSpecName "kube-api-access-gmf5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.111978 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" (UID: "1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.169202 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmf5k\" (UniqueName: \"kubernetes.io/projected/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-kube-api-access-gmf5k\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.169233 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.169242 4723 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.169253 4723 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.169261 4723 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.182602 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" (UID: "1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.276968 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.346186 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-config-data" (OuterVolumeSpecName: "config-data") pod "1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" (UID: "1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.379001 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.870184 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba","Type":"ContainerDied","Data":"15f9f01ad715422d3181f598dc4f1526a0edbbd1336af3b5f2cdb845b6bb4b0d"} Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.870525 4723 scope.go:117] "RemoveContainer" containerID="5ef91a72d4b940ecf1416c88fcb7be295f770c1a6e8f53f10ce0ae54b4425ca3" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.870776 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.872801 4723 generic.go:334] "Generic (PLEG): container finished" podID="2938063f-bc95-4b78-8636-eded365a5f2c" containerID="18d1cd29d9a9d38d973f5f28934391a8eb60e638e99dc741611feb5303a88dda" exitCode=143 Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.873374 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2938063f-bc95-4b78-8636-eded365a5f2c","Type":"ContainerDied","Data":"18d1cd29d9a9d38d973f5f28934391a8eb60e638e99dc741611feb5303a88dda"} Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.897585 4723 scope.go:117] "RemoveContainer" containerID="078c49fcd767f1744dd6c1805775876365c8a7e05ea75dcbceb40feaa278e0ae" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.919512 4723 scope.go:117] "RemoveContainer" containerID="f80feb668af0d6e7e3d7c89ca7582277f191814e9e1203e4f2f140a7de0dde27" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.947232 4723 scope.go:117] "RemoveContainer" containerID="c1d5076ef09bea7125d21403fdc851853ee1d67fa9518a737b9979f050819662" Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.957366 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:56 crc kubenswrapper[4723]: I0309 13:23:56.991102 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.043573 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:57 crc kubenswrapper[4723]: E0309 13:23:57.044202 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="ceilometer-central-agent" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.044224 4723 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="ceilometer-central-agent" Mar 09 13:23:57 crc kubenswrapper[4723]: E0309 13:23:57.044241 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="ceilometer-notification-agent" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.044249 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="ceilometer-notification-agent" Mar 09 13:23:57 crc kubenswrapper[4723]: E0309 13:23:57.044271 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="proxy-httpd" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.044277 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="proxy-httpd" Mar 09 13:23:57 crc kubenswrapper[4723]: E0309 13:23:57.044299 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="sg-core" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.044305 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="sg-core" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.044543 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="ceilometer-notification-agent" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.044558 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="proxy-httpd" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.044573 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="sg-core" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.044583 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" containerName="ceilometer-central-agent" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.046837 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.049080 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.049375 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.062414 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.100875 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.101029 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-scripts\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.101089 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b188ad33-34c9-4d97-9a95-7f519c368868-log-httpd\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.101327 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b188ad33-34c9-4d97-9a95-7f519c368868-run-httpd\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.101463 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.101489 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vmh7\" (UniqueName: \"kubernetes.io/projected/b188ad33-34c9-4d97-9a95-7f519c368868-kube-api-access-6vmh7\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0" Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.101524 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-config-data\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0" Mar 09 13:23:57 crc kubenswrapper[4723]: E0309 13:23:57.115245 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b9a7211_38e1_47ad_a56c_7fdeb3e2e6ba.slice/crio-15f9f01ad715422d3181f598dc4f1526a0edbbd1336af3b5f2cdb845b6bb4b0d\": RecentStats: unable to find data in memory cache]" Mar 
09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.204826 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b188ad33-34c9-4d97-9a95-7f519c368868-log-httpd\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.205018 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b188ad33-34c9-4d97-9a95-7f519c368868-run-httpd\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.205129 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.205159 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vmh7\" (UniqueName: \"kubernetes.io/projected/b188ad33-34c9-4d97-9a95-7f519c368868-kube-api-access-6vmh7\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.205212 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-config-data\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.205307 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b188ad33-34c9-4d97-9a95-7f519c368868-log-httpd\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.205332 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.205417 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b188ad33-34c9-4d97-9a95-7f519c368868-run-httpd\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.205567 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-scripts\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.210535 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.211389 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-scripts\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.212806 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.213713 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-config-data\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.224985 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vmh7\" (UniqueName: \"kubernetes.io/projected/b188ad33-34c9-4d97-9a95-7f519c368868-kube-api-access-6vmh7\") pod \"ceilometer-0\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.371108 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:23:57 crc kubenswrapper[4723]: I0309 13:23:57.953840 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:23:57 crc kubenswrapper[4723]: W0309 13:23:57.964834 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb188ad33_34c9_4d97_9a95_7f519c368868.slice/crio-d0c973a5707330e7914c9ca2e2ffe7f3082375e6e3d4b78de4aff84c66d5f577 WatchSource:0}: Error finding container d0c973a5707330e7914c9ca2e2ffe7f3082375e6e3d4b78de4aff84c66d5f577: Status 404 returned error can't find the container with id d0c973a5707330e7914c9ca2e2ffe7f3082375e6e3d4b78de4aff84c66d5f577
Mar 09 13:23:58 crc kubenswrapper[4723]: I0309 13:23:58.260502 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:23:58 crc kubenswrapper[4723]: I0309 13:23:58.292402 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:23:58 crc kubenswrapper[4723]: I0309 13:23:58.393674 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 09 13:23:58 crc kubenswrapper[4723]: I0309 13:23:58.431294 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 09 13:23:58 crc kubenswrapper[4723]: I0309 13:23:58.608454 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:23:58 crc kubenswrapper[4723]: I0309 13:23:58.729060 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 09 13:23:58 crc kubenswrapper[4723]: I0309 13:23:58.729483 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 09 13:23:58 crc kubenswrapper[4723]: I0309 13:23:58.900805 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba" path="/var/lib/kubelet/pods/1b9a7211-38e1-47ad-a56c-7fdeb3e2e6ba/volumes"
Mar 09 13:23:58 crc kubenswrapper[4723]: I0309 13:23:58.901692 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b188ad33-34c9-4d97-9a95-7f519c368868","Type":"ContainerStarted","Data":"512f30fac78f8f10d424b1d98e98c157290edb4dc1f0c4e5a7151c0e22d02e9c"}
Mar 09 13:23:58 crc kubenswrapper[4723]: I0309 13:23:58.901722 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b188ad33-34c9-4d97-9a95-7f519c368868","Type":"ContainerStarted","Data":"d0c973a5707330e7914c9ca2e2ffe7f3082375e6e3d4b78de4aff84c66d5f577"}
Mar 09 13:23:58 crc kubenswrapper[4723]: I0309 13:23:58.925630 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 13:23:58 crc kubenswrapper[4723]: I0309 13:23:58.939246 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.162161 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-schhb"]
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.163766 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-schhb"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.166146 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.166444 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.178451 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-schhb"]
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.260852 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-config-data\") pod \"nova-cell1-cell-mapping-schhb\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " pod="openstack/nova-cell1-cell-mapping-schhb"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.261224 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-schhb\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " pod="openstack/nova-cell1-cell-mapping-schhb"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.261255 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tff26\" (UniqueName: \"kubernetes.io/projected/df4024d7-4807-416a-883e-b36bdc7945b7-kube-api-access-tff26\") pod \"nova-cell1-cell-mapping-schhb\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " pod="openstack/nova-cell1-cell-mapping-schhb"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.261408 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-scripts\") pod \"nova-cell1-cell-mapping-schhb\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " pod="openstack/nova-cell1-cell-mapping-schhb"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.363967 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-scripts\") pod \"nova-cell1-cell-mapping-schhb\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " pod="openstack/nova-cell1-cell-mapping-schhb"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.364104 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-config-data\") pod \"nova-cell1-cell-mapping-schhb\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " pod="openstack/nova-cell1-cell-mapping-schhb"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.364151 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-schhb\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " pod="openstack/nova-cell1-cell-mapping-schhb"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.364179 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tff26\" (UniqueName: \"kubernetes.io/projected/df4024d7-4807-416a-883e-b36bdc7945b7-kube-api-access-tff26\") pod \"nova-cell1-cell-mapping-schhb\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " pod="openstack/nova-cell1-cell-mapping-schhb"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.368566 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-schhb\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " pod="openstack/nova-cell1-cell-mapping-schhb"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.370464 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-config-data\") pod \"nova-cell1-cell-mapping-schhb\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " pod="openstack/nova-cell1-cell-mapping-schhb"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.376315 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-scripts\") pod \"nova-cell1-cell-mapping-schhb\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " pod="openstack/nova-cell1-cell-mapping-schhb"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.400152 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tff26\" (UniqueName: \"kubernetes.io/projected/df4024d7-4807-416a-883e-b36bdc7945b7-kube-api-access-tff26\") pod \"nova-cell1-cell-mapping-schhb\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " pod="openstack/nova-cell1-cell-mapping-schhb"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.658329 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-schhb"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.750007 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.7:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.749985 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.7:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.966114 4723 generic.go:334] "Generic (PLEG): container finished" podID="2938063f-bc95-4b78-8636-eded365a5f2c" containerID="2b604c7c44840f5d4b27142f76394a30a5bcc51b7c6263488e159a907d976abd" exitCode=0
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.966477 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2938063f-bc95-4b78-8636-eded365a5f2c","Type":"ContainerDied","Data":"2b604c7c44840f5d4b27142f76394a30a5bcc51b7c6263488e159a907d976abd"}
Mar 09 13:23:59 crc kubenswrapper[4723]: I0309 13:23:59.995994 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b188ad33-34c9-4d97-9a95-7f519c368868","Type":"ContainerStarted","Data":"803e9ffa7d030488d68054e0aa05509cda4918f9903de85a4cf61327d8544b09"}
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.238165 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551044-kr92d"]
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.239771 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-kr92d"
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.245385 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.245583 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.245698 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.255401 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-kr92d"]
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.305614 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59zgk\" (UniqueName: \"kubernetes.io/projected/66dc3748-8aa4-4a0d-8162-42a120d6233d-kube-api-access-59zgk\") pod \"auto-csr-approver-29551044-kr92d\" (UID: \"66dc3748-8aa4-4a0d-8162-42a120d6233d\") " pod="openshift-infra/auto-csr-approver-29551044-kr92d"
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.366654 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.407363 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59zgk\" (UniqueName: \"kubernetes.io/projected/66dc3748-8aa4-4a0d-8162-42a120d6233d-kube-api-access-59zgk\") pod \"auto-csr-approver-29551044-kr92d\" (UID: \"66dc3748-8aa4-4a0d-8162-42a120d6233d\") " pod="openshift-infra/auto-csr-approver-29551044-kr92d"
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.440148 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59zgk\" (UniqueName: \"kubernetes.io/projected/66dc3748-8aa4-4a0d-8162-42a120d6233d-kube-api-access-59zgk\") pod \"auto-csr-approver-29551044-kr92d\" (UID: \"66dc3748-8aa4-4a0d-8162-42a120d6233d\") " pod="openshift-infra/auto-csr-approver-29551044-kr92d"
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.509647 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mwcd\" (UniqueName: \"kubernetes.io/projected/2938063f-bc95-4b78-8636-eded365a5f2c-kube-api-access-9mwcd\") pod \"2938063f-bc95-4b78-8636-eded365a5f2c\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") "
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.510048 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2938063f-bc95-4b78-8636-eded365a5f2c-combined-ca-bundle\") pod \"2938063f-bc95-4b78-8636-eded365a5f2c\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") "
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.510225 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2938063f-bc95-4b78-8636-eded365a5f2c-logs\") pod \"2938063f-bc95-4b78-8636-eded365a5f2c\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") "
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.510387 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2938063f-bc95-4b78-8636-eded365a5f2c-config-data\") pod \"2938063f-bc95-4b78-8636-eded365a5f2c\" (UID: \"2938063f-bc95-4b78-8636-eded365a5f2c\") "
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.510826 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2938063f-bc95-4b78-8636-eded365a5f2c-logs" (OuterVolumeSpecName: "logs") pod "2938063f-bc95-4b78-8636-eded365a5f2c" (UID: "2938063f-bc95-4b78-8636-eded365a5f2c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.511391 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2938063f-bc95-4b78-8636-eded365a5f2c-logs\") on node \"crc\" DevicePath \"\""
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.513971 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2938063f-bc95-4b78-8636-eded365a5f2c-kube-api-access-9mwcd" (OuterVolumeSpecName: "kube-api-access-9mwcd") pod "2938063f-bc95-4b78-8636-eded365a5f2c" (UID: "2938063f-bc95-4b78-8636-eded365a5f2c"). InnerVolumeSpecName "kube-api-access-9mwcd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.585046 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2938063f-bc95-4b78-8636-eded365a5f2c-config-data" (OuterVolumeSpecName: "config-data") pod "2938063f-bc95-4b78-8636-eded365a5f2c" (UID: "2938063f-bc95-4b78-8636-eded365a5f2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.604996 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2938063f-bc95-4b78-8636-eded365a5f2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2938063f-bc95-4b78-8636-eded365a5f2c" (UID: "2938063f-bc95-4b78-8636-eded365a5f2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.614309 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2938063f-bc95-4b78-8636-eded365a5f2c-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.614343 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mwcd\" (UniqueName: \"kubernetes.io/projected/2938063f-bc95-4b78-8636-eded365a5f2c-kube-api-access-9mwcd\") on node \"crc\" DevicePath \"\""
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.614354 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2938063f-bc95-4b78-8636-eded365a5f2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.660446 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-kr92d"
Mar 09 13:24:00 crc kubenswrapper[4723]: I0309 13:24:00.718601 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-schhb"]
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.031523 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-schhb" event={"ID":"df4024d7-4807-416a-883e-b36bdc7945b7","Type":"ContainerStarted","Data":"bf75c11a6cf225aa39e3d068e41ba89c4ab3850731fb0ea06a28e818407f99dd"}
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.044087 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b188ad33-34c9-4d97-9a95-7f519c368868","Type":"ContainerStarted","Data":"290ccd7a9f329b889c65c69da32e6911ce00971140586fb13bd5f3cbef576d13"}
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.056271 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2938063f-bc95-4b78-8636-eded365a5f2c","Type":"ContainerDied","Data":"3f3816cac9b4c3afaf8027d64d1b71242c59fe7fca4012aee44c7598ade3d4cb"}
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.056525 4723 scope.go:117] "RemoveContainer" containerID="2b604c7c44840f5d4b27142f76394a30a5bcc51b7c6263488e159a907d976abd"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.056484 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.118036 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.119033 4723 scope.go:117] "RemoveContainer" containerID="18d1cd29d9a9d38d973f5f28934391a8eb60e638e99dc741611feb5303a88dda"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.139721 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.176678 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 09 13:24:01 crc kubenswrapper[4723]: E0309 13:24:01.177499 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2938063f-bc95-4b78-8636-eded365a5f2c" containerName="nova-api-log"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.177519 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="2938063f-bc95-4b78-8636-eded365a5f2c" containerName="nova-api-log"
Mar 09 13:24:01 crc kubenswrapper[4723]: E0309 13:24:01.177580 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2938063f-bc95-4b78-8636-eded365a5f2c" containerName="nova-api-api"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.177590 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="2938063f-bc95-4b78-8636-eded365a5f2c" containerName="nova-api-api"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.178074 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="2938063f-bc95-4b78-8636-eded365a5f2c" containerName="nova-api-api"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.178139 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="2938063f-bc95-4b78-8636-eded365a5f2c" containerName="nova-api-log"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.180831 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.185919 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.186163 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.188179 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.219684 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.237675 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-config-data\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.237807 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.238086 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-logs\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.238190 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9684r\" (UniqueName: \"kubernetes.io/projected/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-kube-api-access-9684r\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.238354 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-public-tls-certs\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.239001 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.241912 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-kr92d"]
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.341415 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-config-data\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.341522 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.341618 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-logs\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.341673 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9684r\" (UniqueName: \"kubernetes.io/projected/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-kube-api-access-9684r\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.341737 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-public-tls-certs\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.342793 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-logs\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.343082 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.349370 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-public-tls-certs\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.349646 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.349727 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.363410 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9684r\" (UniqueName: \"kubernetes.io/projected/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-kube-api-access-9684r\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.363600 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-config-data\") pod \"nova-api-0\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " pod="openstack/nova-api-0"
Mar 09 13:24:01 crc kubenswrapper[4723]: I0309 13:24:01.594638 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 13:24:02 crc kubenswrapper[4723]: I0309 13:24:02.068870 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-kr92d" event={"ID":"66dc3748-8aa4-4a0d-8162-42a120d6233d","Type":"ContainerStarted","Data":"a8ca5f09ac2492c71ab3f1a2bdb3fb4c3863fa55de3a46f6297b526522718b95"}
Mar 09 13:24:02 crc kubenswrapper[4723]: I0309 13:24:02.070620 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-schhb" event={"ID":"df4024d7-4807-416a-883e-b36bdc7945b7","Type":"ContainerStarted","Data":"2f856257f42c32ae4211493d61e0c0341c2a6e862631446bed630983f8fdcfbb"}
Mar 09 13:24:02 crc kubenswrapper[4723]: I0309 13:24:02.086344 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-schhb" podStartSLOduration=3.086326515 podStartE2EDuration="3.086326515s" podCreationTimestamp="2026-03-09 13:23:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:02.08536949 +0000 UTC m=+1516.099837040" watchObservedRunningTime="2026-03-09 13:24:02.086326515 +0000 UTC m=+1516.100794055"
Mar 09 13:24:02 crc kubenswrapper[4723]: I0309 13:24:02.229623 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 09 13:24:02 crc kubenswrapper[4723]: W0309 13:24:02.231060 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e2f3a8f_6de5_4830_8bc3_3d5e81fadf3d.slice/crio-9c25c7369cdae8752aa70c6e98a69538596e244bf0a578026420cd3a00ba12cf WatchSource:0}: Error finding container 9c25c7369cdae8752aa70c6e98a69538596e244bf0a578026420cd3a00ba12cf: Status 404 returned error can't find the container with id 9c25c7369cdae8752aa70c6e98a69538596e244bf0a578026420cd3a00ba12cf
Mar 09 13:24:02 crc kubenswrapper[4723]: I0309 13:24:02.895945 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2938063f-bc95-4b78-8636-eded365a5f2c" path="/var/lib/kubelet/pods/2938063f-bc95-4b78-8636-eded365a5f2c/volumes"
Mar 09 13:24:03 crc kubenswrapper[4723]: I0309 13:24:03.085335 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d","Type":"ContainerStarted","Data":"8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526"}
Mar 09 13:24:03 crc kubenswrapper[4723]: I0309 13:24:03.085707 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d","Type":"ContainerStarted","Data":"278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21"}
Mar 09 13:24:03 crc kubenswrapper[4723]: I0309 13:24:03.085720 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d","Type":"ContainerStarted","Data":"9c25c7369cdae8752aa70c6e98a69538596e244bf0a578026420cd3a00ba12cf"}
Mar 09 13:24:03 crc kubenswrapper[4723]: I0309 13:24:03.102891 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-kr92d" event={"ID":"66dc3748-8aa4-4a0d-8162-42a120d6233d","Type":"ContainerStarted","Data":"25cbf1b28d497c5a5e2adaae63fafcf501646f6ae727f3ba0a578919f989820a"}
pod="openshift-infra/auto-csr-approver-29551044-kr92d" event={"ID":"66dc3748-8aa4-4a0d-8162-42a120d6233d","Type":"ContainerStarted","Data":"25cbf1b28d497c5a5e2adaae63fafcf501646f6ae727f3ba0a578919f989820a"} Mar 09 13:24:03 crc kubenswrapper[4723]: I0309 13:24:03.126124 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.126103047 podStartE2EDuration="2.126103047s" podCreationTimestamp="2026-03-09 13:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:03.108410491 +0000 UTC m=+1517.122878031" watchObservedRunningTime="2026-03-09 13:24:03.126103047 +0000 UTC m=+1517.140570587" Mar 09 13:24:03 crc kubenswrapper[4723]: I0309 13:24:03.130444 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551044-kr92d" podStartSLOduration=2.21061631 podStartE2EDuration="3.130431722s" podCreationTimestamp="2026-03-09 13:24:00 +0000 UTC" firstStartedPulling="2026-03-09 13:24:01.203379025 +0000 UTC m=+1515.217846565" lastFinishedPulling="2026-03-09 13:24:02.123194437 +0000 UTC m=+1516.137661977" observedRunningTime="2026-03-09 13:24:03.127458133 +0000 UTC m=+1517.141925683" watchObservedRunningTime="2026-03-09 13:24:03.130431722 +0000 UTC m=+1517.144899262" Mar 09 13:24:03 crc kubenswrapper[4723]: I0309 13:24:03.418133 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:24:03 crc kubenswrapper[4723]: I0309 13:24:03.500161 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-mlq2z"] Mar 09 13:24:03 crc kubenswrapper[4723]: I0309 13:24:03.500410 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" podUID="330db7f4-8928-4101-b28f-e4a129b90227" containerName="dnsmasq-dns" containerID="cri-o://7942d56e2ba234419f069386784848fa3fa7a8bb97e4e04d5dbb94e9322d0277" gracePeriod=10 Mar 09 13:24:03 crc kubenswrapper[4723]: I0309 13:24:03.947095 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:24:03 crc kubenswrapper[4723]: I0309 13:24:03.947417 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.124485 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b188ad33-34c9-4d97-9a95-7f519c368868","Type":"ContainerStarted","Data":"88c868240d128088ba74e0f842208ae46ddd8185b28fd852c0ceaafdaa95abab"} Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.124636 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="ceilometer-central-agent" containerID="cri-o://512f30fac78f8f10d424b1d98e98c157290edb4dc1f0c4e5a7151c0e22d02e9c" gracePeriod=30 Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 
13:24:04.124962 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.124984 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="proxy-httpd" containerID="cri-o://88c868240d128088ba74e0f842208ae46ddd8185b28fd852c0ceaafdaa95abab" gracePeriod=30 Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.125027 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="sg-core" containerID="cri-o://290ccd7a9f329b889c65c69da32e6911ce00971140586fb13bd5f3cbef576d13" gracePeriod=30 Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.125059 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="ceilometer-notification-agent" containerID="cri-o://803e9ffa7d030488d68054e0aa05509cda4918f9903de85a4cf61327d8544b09" gracePeriod=30 Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.149205 4723 generic.go:334] "Generic (PLEG): container finished" podID="330db7f4-8928-4101-b28f-e4a129b90227" containerID="7942d56e2ba234419f069386784848fa3fa7a8bb97e4e04d5dbb94e9322d0277" exitCode=0 Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.150708 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" event={"ID":"330db7f4-8928-4101-b28f-e4a129b90227","Type":"ContainerDied","Data":"7942d56e2ba234419f069386784848fa3fa7a8bb97e4e04d5dbb94e9322d0277"} Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.150746 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" event={"ID":"330db7f4-8928-4101-b28f-e4a129b90227","Type":"ContainerDied","Data":"79edb7c88617db0eeec9de18a881a1f497e595c08408f78fb6f14ad5c189471c"} Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.150762 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79edb7c88617db0eeec9de18a881a1f497e595c08408f78fb6f14ad5c189471c" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.176014 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.361089472 podStartE2EDuration="8.175995347s" podCreationTimestamp="2026-03-09 13:23:56 +0000 UTC" firstStartedPulling="2026-03-09 13:23:57.977655152 +0000 UTC m=+1511.992122692" lastFinishedPulling="2026-03-09 13:24:02.792561027 +0000 UTC m=+1516.807028567" observedRunningTime="2026-03-09 13:24:04.15296169 +0000 UTC m=+1518.167429240" watchObservedRunningTime="2026-03-09 13:24:04.175995347 +0000 UTC m=+1518.190462887" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.201959 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.364808 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-dns-swift-storage-0\") pod \"330db7f4-8928-4101-b28f-e4a129b90227\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.364974 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-ovsdbserver-nb\") pod \"330db7f4-8928-4101-b28f-e4a129b90227\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.365044 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-config\") pod \"330db7f4-8928-4101-b28f-e4a129b90227\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.365167 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n962m\" (UniqueName: \"kubernetes.io/projected/330db7f4-8928-4101-b28f-e4a129b90227-kube-api-access-n962m\") pod \"330db7f4-8928-4101-b28f-e4a129b90227\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.365283 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-ovsdbserver-sb\") pod \"330db7f4-8928-4101-b28f-e4a129b90227\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.365348 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-dns-svc\") pod \"330db7f4-8928-4101-b28f-e4a129b90227\" (UID: \"330db7f4-8928-4101-b28f-e4a129b90227\") " Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.389308 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330db7f4-8928-4101-b28f-e4a129b90227-kube-api-access-n962m" (OuterVolumeSpecName: "kube-api-access-n962m") pod "330db7f4-8928-4101-b28f-e4a129b90227" (UID: "330db7f4-8928-4101-b28f-e4a129b90227"). InnerVolumeSpecName "kube-api-access-n962m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.454616 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "330db7f4-8928-4101-b28f-e4a129b90227" (UID: "330db7f4-8928-4101-b28f-e4a129b90227"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.468651 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n962m\" (UniqueName: \"kubernetes.io/projected/330db7f4-8928-4101-b28f-e4a129b90227-kube-api-access-n962m\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.468686 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.482335 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "330db7f4-8928-4101-b28f-e4a129b90227" (UID: "330db7f4-8928-4101-b28f-e4a129b90227"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.495316 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "330db7f4-8928-4101-b28f-e4a129b90227" (UID: "330db7f4-8928-4101-b28f-e4a129b90227"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.500368 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "330db7f4-8928-4101-b28f-e4a129b90227" (UID: "330db7f4-8928-4101-b28f-e4a129b90227"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.512912 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-config" (OuterVolumeSpecName: "config") pod "330db7f4-8928-4101-b28f-e4a129b90227" (UID: "330db7f4-8928-4101-b28f-e4a129b90227"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.570428 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.570455 4723 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.570469 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:04 crc kubenswrapper[4723]: I0309 13:24:04.570478 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330db7f4-8928-4101-b28f-e4a129b90227-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:05 crc kubenswrapper[4723]: I0309 13:24:05.163553 4723 generic.go:334] "Generic (PLEG): container finished" podID="66dc3748-8aa4-4a0d-8162-42a120d6233d" containerID="25cbf1b28d497c5a5e2adaae63fafcf501646f6ae727f3ba0a578919f989820a" exitCode=0 Mar 09 13:24:05 crc kubenswrapper[4723]: I0309 13:24:05.163625 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-kr92d" event={"ID":"66dc3748-8aa4-4a0d-8162-42a120d6233d","Type":"ContainerDied","Data":"25cbf1b28d497c5a5e2adaae63fafcf501646f6ae727f3ba0a578919f989820a"} Mar 09 13:24:05 crc kubenswrapper[4723]: I0309 13:24:05.167129 4723 generic.go:334] "Generic (PLEG): container finished" podID="b188ad33-34c9-4d97-9a95-7f519c368868" containerID="88c868240d128088ba74e0f842208ae46ddd8185b28fd852c0ceaafdaa95abab" exitCode=0 Mar 09 13:24:05 crc kubenswrapper[4723]: I0309 13:24:05.167157 4723 generic.go:334] "Generic (PLEG): container finished" podID="b188ad33-34c9-4d97-9a95-7f519c368868" containerID="290ccd7a9f329b889c65c69da32e6911ce00971140586fb13bd5f3cbef576d13" exitCode=2 Mar 09 13:24:05 crc kubenswrapper[4723]: I0309 13:24:05.167170 4723 generic.go:334] "Generic (PLEG): container finished" podID="b188ad33-34c9-4d97-9a95-7f519c368868" containerID="803e9ffa7d030488d68054e0aa05509cda4918f9903de85a4cf61327d8544b09" exitCode=0 Mar 09 13:24:05 crc kubenswrapper[4723]: I0309 13:24:05.167210 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b188ad33-34c9-4d97-9a95-7f519c368868","Type":"ContainerDied","Data":"88c868240d128088ba74e0f842208ae46ddd8185b28fd852c0ceaafdaa95abab"} Mar 09 13:24:05 crc kubenswrapper[4723]: I0309 13:24:05.167231 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-mlq2z" Mar 09 13:24:05 crc kubenswrapper[4723]: I0309 13:24:05.167238 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b188ad33-34c9-4d97-9a95-7f519c368868","Type":"ContainerDied","Data":"290ccd7a9f329b889c65c69da32e6911ce00971140586fb13bd5f3cbef576d13"} Mar 09 13:24:05 crc kubenswrapper[4723]: I0309 13:24:05.167248 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b188ad33-34c9-4d97-9a95-7f519c368868","Type":"ContainerDied","Data":"803e9ffa7d030488d68054e0aa05509cda4918f9903de85a4cf61327d8544b09"} Mar 09 13:24:05 crc kubenswrapper[4723]: I0309 13:24:05.205072 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-mlq2z"] Mar 09 13:24:05 crc kubenswrapper[4723]: I0309 13:24:05.218790 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-mlq2z"] Mar 09 13:24:06 crc kubenswrapper[4723]: I0309 13:24:06.681992 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-kr92d" Mar 09 13:24:06 crc kubenswrapper[4723]: I0309 13:24:06.832740 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59zgk\" (UniqueName: \"kubernetes.io/projected/66dc3748-8aa4-4a0d-8162-42a120d6233d-kube-api-access-59zgk\") pod \"66dc3748-8aa4-4a0d-8162-42a120d6233d\" (UID: \"66dc3748-8aa4-4a0d-8162-42a120d6233d\") " Mar 09 13:24:06 crc kubenswrapper[4723]: I0309 13:24:06.838010 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66dc3748-8aa4-4a0d-8162-42a120d6233d-kube-api-access-59zgk" (OuterVolumeSpecName: "kube-api-access-59zgk") pod "66dc3748-8aa4-4a0d-8162-42a120d6233d" (UID: "66dc3748-8aa4-4a0d-8162-42a120d6233d"). InnerVolumeSpecName "kube-api-access-59zgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:06 crc kubenswrapper[4723]: I0309 13:24:06.898290 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="330db7f4-8928-4101-b28f-e4a129b90227" path="/var/lib/kubelet/pods/330db7f4-8928-4101-b28f-e4a129b90227/volumes" Mar 09 13:24:06 crc kubenswrapper[4723]: I0309 13:24:06.936629 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59zgk\" (UniqueName: \"kubernetes.io/projected/66dc3748-8aa4-4a0d-8162-42a120d6233d-kube-api-access-59zgk\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.195798 4723 generic.go:334] "Generic (PLEG): container finished" podID="b188ad33-34c9-4d97-9a95-7f519c368868" containerID="512f30fac78f8f10d424b1d98e98c157290edb4dc1f0c4e5a7151c0e22d02e9c" exitCode=0 Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.195844 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b188ad33-34c9-4d97-9a95-7f519c368868","Type":"ContainerDied","Data":"512f30fac78f8f10d424b1d98e98c157290edb4dc1f0c4e5a7151c0e22d02e9c"} Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.205541 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-kr92d" event={"ID":"66dc3748-8aa4-4a0d-8162-42a120d6233d","Type":"ContainerDied","Data":"a8ca5f09ac2492c71ab3f1a2bdb3fb4c3863fa55de3a46f6297b526522718b95"} Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.205646 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8ca5f09ac2492c71ab3f1a2bdb3fb4c3863fa55de3a46f6297b526522718b95" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.205734 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-kr92d" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.289707 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551038-49mwf"] Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.316991 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551038-49mwf"] Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.490090 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.562838 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-config-data\") pod \"b188ad33-34c9-4d97-9a95-7f519c368868\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.563112 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-sg-core-conf-yaml\") pod \"b188ad33-34c9-4d97-9a95-7f519c368868\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.563169 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vmh7\" (UniqueName: \"kubernetes.io/projected/b188ad33-34c9-4d97-9a95-7f519c368868-kube-api-access-6vmh7\") pod \"b188ad33-34c9-4d97-9a95-7f519c368868\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.563228 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b188ad33-34c9-4d97-9a95-7f519c368868-run-httpd\") pod \"b188ad33-34c9-4d97-9a95-7f519c368868\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.563343 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b188ad33-34c9-4d97-9a95-7f519c368868-log-httpd\") pod \"b188ad33-34c9-4d97-9a95-7f519c368868\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.563382 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-scripts\") pod \"b188ad33-34c9-4d97-9a95-7f519c368868\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.563457 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-combined-ca-bundle\") pod \"b188ad33-34c9-4d97-9a95-7f519c368868\" (UID: \"b188ad33-34c9-4d97-9a95-7f519c368868\") " Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.563655 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b188ad33-34c9-4d97-9a95-7f519c368868-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b188ad33-34c9-4d97-9a95-7f519c368868" (UID: "b188ad33-34c9-4d97-9a95-7f519c368868"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.563998 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b188ad33-34c9-4d97-9a95-7f519c368868-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b188ad33-34c9-4d97-9a95-7f519c368868" (UID: "b188ad33-34c9-4d97-9a95-7f519c368868"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.564464 4723 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b188ad33-34c9-4d97-9a95-7f519c368868-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.564483 4723 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b188ad33-34c9-4d97-9a95-7f519c368868-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.569021 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-scripts" (OuterVolumeSpecName: "scripts") pod "b188ad33-34c9-4d97-9a95-7f519c368868" (UID: "b188ad33-34c9-4d97-9a95-7f519c368868"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.572096 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b188ad33-34c9-4d97-9a95-7f519c368868-kube-api-access-6vmh7" (OuterVolumeSpecName: "kube-api-access-6vmh7") pod "b188ad33-34c9-4d97-9a95-7f519c368868" (UID: "b188ad33-34c9-4d97-9a95-7f519c368868"). InnerVolumeSpecName "kube-api-access-6vmh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.614916 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b188ad33-34c9-4d97-9a95-7f519c368868" (UID: "b188ad33-34c9-4d97-9a95-7f519c368868"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.667998 4723 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.668036 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vmh7\" (UniqueName: \"kubernetes.io/projected/b188ad33-34c9-4d97-9a95-7f519c368868-kube-api-access-6vmh7\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.668049 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.672715 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b188ad33-34c9-4d97-9a95-7f519c368868" (UID: "b188ad33-34c9-4d97-9a95-7f519c368868"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.718132 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-config-data" (OuterVolumeSpecName: "config-data") pod "b188ad33-34c9-4d97-9a95-7f519c368868" (UID: "b188ad33-34c9-4d97-9a95-7f519c368868"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.771743 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:07 crc kubenswrapper[4723]: I0309 13:24:07.772002 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b188ad33-34c9-4d97-9a95-7f519c368868-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.219726 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b188ad33-34c9-4d97-9a95-7f519c368868","Type":"ContainerDied","Data":"d0c973a5707330e7914c9ca2e2ffe7f3082375e6e3d4b78de4aff84c66d5f577"} Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.219787 4723 scope.go:117] "RemoveContainer" containerID="88c868240d128088ba74e0f842208ae46ddd8185b28fd852c0ceaafdaa95abab" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.219960 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.225547 4723 generic.go:334] "Generic (PLEG): container finished" podID="df4024d7-4807-416a-883e-b36bdc7945b7" containerID="2f856257f42c32ae4211493d61e0c0341c2a6e862631446bed630983f8fdcfbb" exitCode=0 Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.225585 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-schhb" event={"ID":"df4024d7-4807-416a-883e-b36bdc7945b7","Type":"ContainerDied","Data":"2f856257f42c32ae4211493d61e0c0341c2a6e862631446bed630983f8fdcfbb"} Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.271029 4723 scope.go:117] "RemoveContainer" containerID="290ccd7a9f329b889c65c69da32e6911ce00971140586fb13bd5f3cbef576d13" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.304801 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.322516 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.326628 4723 scope.go:117] "RemoveContainer" containerID="803e9ffa7d030488d68054e0aa05509cda4918f9903de85a4cf61327d8544b09" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.335390 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:24:08 crc kubenswrapper[4723]: E0309 13:24:08.336141 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="ceilometer-notification-agent" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.336163 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="ceilometer-notification-agent" Mar 09 13:24:08 crc kubenswrapper[4723]: E0309 13:24:08.336189 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330db7f4-8928-4101-b28f-e4a129b90227" containerName="dnsmasq-dns" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.336203 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="330db7f4-8928-4101-b28f-e4a129b90227" containerName="dnsmasq-dns" Mar 09 13:24:08 crc kubenswrapper[4723]: E0309 13:24:08.336251 4723 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="66dc3748-8aa4-4a0d-8162-42a120d6233d" containerName="oc" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.336263 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="66dc3748-8aa4-4a0d-8162-42a120d6233d" containerName="oc" Mar 09 13:24:08 crc kubenswrapper[4723]: E0309 13:24:08.336285 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="sg-core" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.336296 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="sg-core" Mar 09 13:24:08 crc kubenswrapper[4723]: E0309 13:24:08.336310 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330db7f4-8928-4101-b28f-e4a129b90227" containerName="init" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.336320 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="330db7f4-8928-4101-b28f-e4a129b90227" containerName="init" Mar 09 13:24:08 crc kubenswrapper[4723]: E0309 13:24:08.336369 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="ceilometer-central-agent" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.336381 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="ceilometer-central-agent" Mar 09 13:24:08 crc kubenswrapper[4723]: E0309 13:24:08.336399 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="proxy-httpd" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.336409 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="proxy-httpd" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.336798 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="ceilometer-central-agent" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.336833 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="proxy-httpd" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.336883 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="66dc3748-8aa4-4a0d-8162-42a120d6233d" containerName="oc" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.336904 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="sg-core" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.336927 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="330db7f4-8928-4101-b28f-e4a129b90227" containerName="dnsmasq-dns" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.336966 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" containerName="ceilometer-notification-agent" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.345457 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.353712 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.355158 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.361276 4723 scope.go:117] "RemoveContainer" containerID="512f30fac78f8f10d424b1d98e98c157290edb4dc1f0c4e5a7151c0e22d02e9c" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.369039 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.487359 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.487771 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-config-data\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.487810 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crn4h\" (UniqueName: \"kubernetes.io/projected/224c4d37-323a-4d7c-9b7c-c284b931b6fd-kube-api-access-crn4h\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.487978 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.488001 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-scripts\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.488022 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/224c4d37-323a-4d7c-9b7c-c284b931b6fd-log-httpd\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.488045 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/224c4d37-323a-4d7c-9b7c-c284b931b6fd-run-httpd\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.589754 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crn4h\" (UniqueName: 
\"kubernetes.io/projected/224c4d37-323a-4d7c-9b7c-c284b931b6fd-kube-api-access-crn4h\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.589950 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.589978 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-scripts\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.590018 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/224c4d37-323a-4d7c-9b7c-c284b931b6fd-log-httpd\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.590049 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/224c4d37-323a-4d7c-9b7c-c284b931b6fd-run-httpd\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.590095 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.590189 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-config-data\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.590923 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/224c4d37-323a-4d7c-9b7c-c284b931b6fd-run-httpd\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.592270 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/224c4d37-323a-4d7c-9b7c-c284b931b6fd-log-httpd\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.596306 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.596456 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-scripts\") 
pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.596629 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.607127 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-config-data\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.621696 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crn4h\" (UniqueName: \"kubernetes.io/projected/224c4d37-323a-4d7c-9b7c-c284b931b6fd-kube-api-access-crn4h\") pod \"ceilometer-0\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.669275 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.746126 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.752665 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.754107 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.897175 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75404ce0-2053-477b-8324-d967c2dff0e9" path="/var/lib/kubelet/pods/75404ce0-2053-477b-8324-d967c2dff0e9/volumes" Mar 09 13:24:08 crc kubenswrapper[4723]: I0309 13:24:08.898531 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b188ad33-34c9-4d97-9a95-7f519c368868" path="/var/lib/kubelet/pods/b188ad33-34c9-4d97-9a95-7f519c368868/volumes" Mar 09 13:24:09 crc kubenswrapper[4723]: W0309 13:24:09.246321 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod224c4d37_323a_4d7c_9b7c_c284b931b6fd.slice/crio-ba678d7b397c0f90a7b39b35ba0c8033a808675a5529cb8eb07a76fd24ed8abf WatchSource:0}: Error finding container ba678d7b397c0f90a7b39b35ba0c8033a808675a5529cb8eb07a76fd24ed8abf: Status 404 returned error can't find the container with id ba678d7b397c0f90a7b39b35ba0c8033a808675a5529cb8eb07a76fd24ed8abf Mar 09 13:24:09 crc kubenswrapper[4723]: I0309 13:24:09.248578 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 13:24:09 crc kubenswrapper[4723]: I0309 13:24:09.252407 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:24:09 crc kubenswrapper[4723]: I0309 13:24:09.779660 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-schhb" Mar 09 13:24:09 crc kubenswrapper[4723]: I0309 13:24:09.940287 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-combined-ca-bundle\") pod \"df4024d7-4807-416a-883e-b36bdc7945b7\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " Mar 09 13:24:09 crc kubenswrapper[4723]: I0309 13:24:09.940818 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-scripts\") pod \"df4024d7-4807-416a-883e-b36bdc7945b7\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " Mar 09 13:24:09 crc kubenswrapper[4723]: I0309 13:24:09.940856 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tff26\" (UniqueName: \"kubernetes.io/projected/df4024d7-4807-416a-883e-b36bdc7945b7-kube-api-access-tff26\") pod \"df4024d7-4807-416a-883e-b36bdc7945b7\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " Mar 09 13:24:09 crc kubenswrapper[4723]: I0309 13:24:09.951992 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-config-data\") pod \"df4024d7-4807-416a-883e-b36bdc7945b7\" (UID: \"df4024d7-4807-416a-883e-b36bdc7945b7\") " Mar 09 13:24:09 crc kubenswrapper[4723]: I0309 13:24:09.968706 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-scripts" (OuterVolumeSpecName: "scripts") pod "df4024d7-4807-416a-883e-b36bdc7945b7" (UID: "df4024d7-4807-416a-883e-b36bdc7945b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:09 crc kubenswrapper[4723]: I0309 13:24:09.978087 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:09 crc kubenswrapper[4723]: I0309 13:24:09.988113 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df4024d7-4807-416a-883e-b36bdc7945b7-kube-api-access-tff26" (OuterVolumeSpecName: "kube-api-access-tff26") pod "df4024d7-4807-416a-883e-b36bdc7945b7" (UID: "df4024d7-4807-416a-883e-b36bdc7945b7"). InnerVolumeSpecName "kube-api-access-tff26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.031517 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-config-data" (OuterVolumeSpecName: "config-data") pod "df4024d7-4807-416a-883e-b36bdc7945b7" (UID: "df4024d7-4807-416a-883e-b36bdc7945b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.085135 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tff26\" (UniqueName: \"kubernetes.io/projected/df4024d7-4807-416a-883e-b36bdc7945b7-kube-api-access-tff26\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.088077 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.092000 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df4024d7-4807-416a-883e-b36bdc7945b7" (UID: "df4024d7-4807-416a-883e-b36bdc7945b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.191649 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4024d7-4807-416a-883e-b36bdc7945b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.254607 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-schhb" event={"ID":"df4024d7-4807-416a-883e-b36bdc7945b7","Type":"ContainerDied","Data":"bf75c11a6cf225aa39e3d068e41ba89c4ab3850731fb0ea06a28e818407f99dd"} Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.254647 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf75c11a6cf225aa39e3d068e41ba89c4ab3850731fb0ea06a28e818407f99dd" Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.254724 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-schhb" Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.256652 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"224c4d37-323a-4d7c-9b7c-c284b931b6fd","Type":"ContainerStarted","Data":"48fcaf846f4d2a86691b857cd5e78789c74342699162b59cf13aad047d5bfef3"} Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.256716 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"224c4d37-323a-4d7c-9b7c-c284b931b6fd","Type":"ContainerStarted","Data":"ba678d7b397c0f90a7b39b35ba0c8033a808675a5529cb8eb07a76fd24ed8abf"} Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.435733 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.436287 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c5e88473-555f-408b-917e-997969b8f48d" containerName="nova-scheduler-scheduler" containerID="cri-o://9331bf85019f77dffb9d38a982fcb9c3ea17697ba109e52e788c27965e489f54" gracePeriod=30 Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.448784 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.449123 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" containerName="nova-api-log" containerID="cri-o://278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21" gracePeriod=30 Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.449240 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" containerName="nova-api-api" containerID="cri-o://8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526" gracePeriod=30 Mar 09 13:24:10 crc kubenswrapper[4723]: I0309 13:24:10.534310 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.147002 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.273165 4723 generic.go:334] "Generic (PLEG): container finished" podID="1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" containerID="8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526" exitCode=0 Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.273486 4723 generic.go:334] "Generic (PLEG): container finished" podID="1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" containerID="278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21" exitCode=143 Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.273368 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d","Type":"ContainerDied","Data":"8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526"} Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.273454 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.273532 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d","Type":"ContainerDied","Data":"278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21"} Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.273545 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d","Type":"ContainerDied","Data":"9c25c7369cdae8752aa70c6e98a69538596e244bf0a578026420cd3a00ba12cf"} Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.273575 4723 scope.go:117] "RemoveContainer" containerID="8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.278136 4723 generic.go:334] "Generic (PLEG): container finished" podID="c5e88473-555f-408b-917e-997969b8f48d" containerID="9331bf85019f77dffb9d38a982fcb9c3ea17697ba109e52e788c27965e489f54" exitCode=0 Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.279086 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c5e88473-555f-408b-917e-997969b8f48d","Type":"ContainerDied","Data":"9331bf85019f77dffb9d38a982fcb9c3ea17697ba109e52e788c27965e489f54"} Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.319146 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-combined-ca-bundle\") pod \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.319225 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-config-data\") pod \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.319358 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9684r\" (UniqueName: \"kubernetes.io/projected/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-kube-api-access-9684r\") pod \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.319391 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-internal-tls-certs\") pod \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.319444 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-public-tls-certs\") pod \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.319563 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-logs\") pod \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\" (UID: \"1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d\") " Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 
13:24:11.322350 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-logs" (OuterVolumeSpecName: "logs") pod "1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" (UID: "1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.325444 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-kube-api-access-9684r" (OuterVolumeSpecName: "kube-api-access-9684r") pod "1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" (UID: "1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d"). InnerVolumeSpecName "kube-api-access-9684r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.353376 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" (UID: "1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.362288 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-config-data" (OuterVolumeSpecName: "config-data") pod "1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" (UID: "1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.381013 4723 scope.go:117] "RemoveContainer" containerID="278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.402359 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" (UID: "1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.405449 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" (UID: "1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.409347 4723 scope.go:117] "RemoveContainer" containerID="8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526" Mar 09 13:24:11 crc kubenswrapper[4723]: E0309 13:24:11.409744 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526\": container with ID starting with 8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526 not found: ID does not exist" containerID="8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.409773 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526"} err="failed to get container status \"8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526\": rpc error: code = NotFound desc = could not find container \"8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526\": container with ID starting with 8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526 not found: ID does not exist" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.409793 4723 scope.go:117] "RemoveContainer" containerID="278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21" Mar 09 13:24:11 crc kubenswrapper[4723]: E0309 13:24:11.410252 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21\": container with ID starting with 278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21 not found: ID does not exist" containerID="278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.410276 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21"} err="failed to get container status \"278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21\": rpc error: code = NotFound desc = could not find container \"278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21\": container with ID starting with 278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21 not found: ID does not exist" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.410290 4723 scope.go:117] "RemoveContainer" containerID="8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.410538 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526"} err="failed to get container status \"8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526\": rpc error: code = NotFound desc = could not find container \"8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526\": container with ID starting with 8632df57bd8060190400951999a1ba2614f2a94a50fe0f7f20b8598554c4a526 not found: ID does not exist" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.410555 4723 scope.go:117] "RemoveContainer" containerID="278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.410962 4723 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21"} err="failed to get container status \"278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21\": rpc error: code = NotFound desc = could not find container \"278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21\": container with ID starting with 278c9cff7d9c390beb6efa6f92cae7060d558df4a9770d3bc6ebce0c097deb21 not found: ID does not exist" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.422038 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.422067 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.422077 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.422087 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9684r\" (UniqueName: \"kubernetes.io/projected/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-kube-api-access-9684r\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.422097 4723 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.422106 4723 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.613772 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.636000 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.643927 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 13:24:11 crc kubenswrapper[4723]: E0309 13:24:11.644471 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" containerName="nova-api-log" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.644489 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" containerName="nova-api-log" Mar 09 13:24:11 crc kubenswrapper[4723]: E0309 13:24:11.644522 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" containerName="nova-api-api" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.644530 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" containerName="nova-api-api" Mar 09 13:24:11 crc kubenswrapper[4723]: E0309 13:24:11.644561 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4024d7-4807-416a-883e-b36bdc7945b7" containerName="nova-manage" Mar 09 
13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.644567 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4024d7-4807-416a-883e-b36bdc7945b7" containerName="nova-manage" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.644791 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4024d7-4807-416a-883e-b36bdc7945b7" containerName="nova-manage" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.644832 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" containerName="nova-api-log" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.644850 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" containerName="nova-api-api" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.646253 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.648774 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.648984 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.653938 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.665569 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.744932 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.745392 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9bgl\" (UniqueName: \"kubernetes.io/projected/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-kube-api-access-s9bgl\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.745631 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.745723 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-config-data\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.745807 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-public-tls-certs\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.745972 4723 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-logs\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.848117 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9bgl\" (UniqueName: \"kubernetes.io/projected/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-kube-api-access-s9bgl\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.848262 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.848308 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-config-data\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.848348 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-public-tls-certs\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.848416 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-logs\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.848499 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.849426 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-logs\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.851639 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-public-tls-certs\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.851710 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.852731 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.854152 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-config-data\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.870516 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9bgl\" (UniqueName: \"kubernetes.io/projected/f73e368c-e54c-4f8f-9e50-857e5e72f8ce-kube-api-access-s9bgl\") pod \"nova-api-0\" (UID: \"f73e368c-e54c-4f8f-9e50-857e5e72f8ce\") " pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.973032 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:24:11 crc kubenswrapper[4723]: I0309 13:24:11.990592 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.153997 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5e88473-555f-408b-917e-997969b8f48d-config-data\") pod \"c5e88473-555f-408b-917e-997969b8f48d\" (UID: \"c5e88473-555f-408b-917e-997969b8f48d\") " Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.154296 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e88473-555f-408b-917e-997969b8f48d-combined-ca-bundle\") pod \"c5e88473-555f-408b-917e-997969b8f48d\" (UID: \"c5e88473-555f-408b-917e-997969b8f48d\") " Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.154384 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz8c9\" (UniqueName: \"kubernetes.io/projected/c5e88473-555f-408b-917e-997969b8f48d-kube-api-access-cz8c9\") pod \"c5e88473-555f-408b-917e-997969b8f48d\" (UID: \"c5e88473-555f-408b-917e-997969b8f48d\") " Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.160372 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e88473-555f-408b-917e-997969b8f48d-kube-api-access-cz8c9" (OuterVolumeSpecName: "kube-api-access-cz8c9") pod "c5e88473-555f-408b-917e-997969b8f48d" (UID: "c5e88473-555f-408b-917e-997969b8f48d"). InnerVolumeSpecName "kube-api-access-cz8c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.199404 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e88473-555f-408b-917e-997969b8f48d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5e88473-555f-408b-917e-997969b8f48d" (UID: "c5e88473-555f-408b-917e-997969b8f48d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.202122 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e88473-555f-408b-917e-997969b8f48d-config-data" (OuterVolumeSpecName: "config-data") pod "c5e88473-555f-408b-917e-997969b8f48d" (UID: "c5e88473-555f-408b-917e-997969b8f48d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.256714 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz8c9\" (UniqueName: \"kubernetes.io/projected/c5e88473-555f-408b-917e-997969b8f48d-kube-api-access-cz8c9\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.256751 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5e88473-555f-408b-917e-997969b8f48d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.256761 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e88473-555f-408b-917e-997969b8f48d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.290615 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"224c4d37-323a-4d7c-9b7c-c284b931b6fd","Type":"ContainerStarted","Data":"487d97bd65f79a60ffd5e591cec1d6aa684f5f53f5c7f26507d8c39f3243625f"} Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.290657 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"224c4d37-323a-4d7c-9b7c-c284b931b6fd","Type":"ContainerStarted","Data":"d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb"} Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.292755 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.292757 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c5e88473-555f-408b-917e-997969b8f48d","Type":"ContainerDied","Data":"4a9adaac78f885646800e3743affbcb8fcaf9186a08a923402a324f6bfdf8217"} Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.292829 4723 scope.go:117] "RemoveContainer" containerID="9331bf85019f77dffb9d38a982fcb9c3ea17697ba109e52e788c27965e489f54" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.295415 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" containerName="nova-metadata-log" containerID="cri-o://9bc78ed950342851717b1a9f23160704ca61116e0e5760f5073bcea572ce0fc7" gracePeriod=30 Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.295459 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" containerName="nova-metadata-metadata" containerID="cri-o://037f538246ef982dda1da2494290f1ca3b27164494789f1720bebb5084dd4153" gracePeriod=30 Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.388982 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.436232 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.530389 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:24:12 crc kubenswrapper[4723]: E0309 13:24:12.531612 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e88473-555f-408b-917e-997969b8f48d" containerName="nova-scheduler-scheduler" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.531650 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e88473-555f-408b-917e-997969b8f48d" containerName="nova-scheduler-scheduler" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.532085 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e88473-555f-408b-917e-997969b8f48d" containerName="nova-scheduler-scheduler" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.533022 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.535313 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.554277 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.571640 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h7z2\" (UniqueName: \"kubernetes.io/projected/c19eac35-9f03-4dea-b1a2-72276e8a1074-kube-api-access-6h7z2\") pod \"nova-scheduler-0\" (UID: \"c19eac35-9f03-4dea-b1a2-72276e8a1074\") " pod="openstack/nova-scheduler-0" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.572288 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19eac35-9f03-4dea-b1a2-72276e8a1074-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c19eac35-9f03-4dea-b1a2-72276e8a1074\") " pod="openstack/nova-scheduler-0" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.572369 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19eac35-9f03-4dea-b1a2-72276e8a1074-config-data\") pod \"nova-scheduler-0\" (UID: \"c19eac35-9f03-4dea-b1a2-72276e8a1074\") " pod="openstack/nova-scheduler-0" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.600420 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.677604 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19eac35-9f03-4dea-b1a2-72276e8a1074-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c19eac35-9f03-4dea-b1a2-72276e8a1074\") " pod="openstack/nova-scheduler-0" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.677671 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19eac35-9f03-4dea-b1a2-72276e8a1074-config-data\") pod \"nova-scheduler-0\" (UID: \"c19eac35-9f03-4dea-b1a2-72276e8a1074\") " pod="openstack/nova-scheduler-0" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.677868 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h7z2\" (UniqueName: \"kubernetes.io/projected/c19eac35-9f03-4dea-b1a2-72276e8a1074-kube-api-access-6h7z2\") pod \"nova-scheduler-0\" (UID: \"c19eac35-9f03-4dea-b1a2-72276e8a1074\") " pod="openstack/nova-scheduler-0" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.684756 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19eac35-9f03-4dea-b1a2-72276e8a1074-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c19eac35-9f03-4dea-b1a2-72276e8a1074\") " pod="openstack/nova-scheduler-0" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.699319 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h7z2\" (UniqueName: \"kubernetes.io/projected/c19eac35-9f03-4dea-b1a2-72276e8a1074-kube-api-access-6h7z2\") pod \"nova-scheduler-0\" (UID: \"c19eac35-9f03-4dea-b1a2-72276e8a1074\") " pod="openstack/nova-scheduler-0" Mar 09 13:24:12 crc 
kubenswrapper[4723]: I0309 13:24:12.705623 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19eac35-9f03-4dea-b1a2-72276e8a1074-config-data\") pod \"nova-scheduler-0\" (UID: \"c19eac35-9f03-4dea-b1a2-72276e8a1074\") " pod="openstack/nova-scheduler-0" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.861465 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.902160 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d" path="/var/lib/kubelet/pods/1e2f3a8f-6de5-4830-8bc3-3d5e81fadf3d/volumes" Mar 09 13:24:12 crc kubenswrapper[4723]: I0309 13:24:12.903016 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5e88473-555f-408b-917e-997969b8f48d" path="/var/lib/kubelet/pods/c5e88473-555f-408b-917e-997969b8f48d/volumes" Mar 09 13:24:13 crc kubenswrapper[4723]: I0309 13:24:13.313912 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f73e368c-e54c-4f8f-9e50-857e5e72f8ce","Type":"ContainerStarted","Data":"b92dabdcb8c318541724e84fc7ad797401732c8075251119f05aa15ee2d9370d"} Mar 09 13:24:13 crc kubenswrapper[4723]: I0309 13:24:13.314281 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f73e368c-e54c-4f8f-9e50-857e5e72f8ce","Type":"ContainerStarted","Data":"d86457b2f0b7942906b43df7406e88cbf6458df0d6eab3f7b4955fdc06406248"} Mar 09 13:24:13 crc kubenswrapper[4723]: I0309 13:24:13.326481 4723 generic.go:334] "Generic (PLEG): container finished" podID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" containerID="9bc78ed950342851717b1a9f23160704ca61116e0e5760f5073bcea572ce0fc7" exitCode=143 Mar 09 13:24:13 crc kubenswrapper[4723]: I0309 13:24:13.326543 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"faddb12d-2b02-4f85-b253-8b9ce0c6ed27","Type":"ContainerDied","Data":"9bc78ed950342851717b1a9f23160704ca61116e0e5760f5073bcea572ce0fc7"} Mar 09 13:24:13 crc kubenswrapper[4723]: I0309 13:24:13.557395 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:24:14 crc kubenswrapper[4723]: I0309 13:24:14.347222 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f73e368c-e54c-4f8f-9e50-857e5e72f8ce","Type":"ContainerStarted","Data":"f491f1d096949497f6b851976726f59209011dd78a3760b4e8156ce01f22b034"} Mar 09 13:24:14 crc kubenswrapper[4723]: I0309 13:24:14.351314 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c19eac35-9f03-4dea-b1a2-72276e8a1074","Type":"ContainerStarted","Data":"b8359a333fd314e0a9fb2b7e84efbefc9d9857c4f83945a71027083f4a65ab10"} Mar 09 13:24:14 crc kubenswrapper[4723]: I0309 13:24:14.351355 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c19eac35-9f03-4dea-b1a2-72276e8a1074","Type":"ContainerStarted","Data":"a3cdc3fa574ec0fdf5b3d90ca16accfa54711ff04e4bd7a0875c5b4a9680182f"} Mar 09 13:24:14 crc kubenswrapper[4723]: I0309 13:24:14.370984 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.370959974 podStartE2EDuration="3.370959974s" podCreationTimestamp="2026-03-09 13:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:14.36739761 +0000 UTC m=+1528.381865150" watchObservedRunningTime="2026-03-09 13:24:14.370959974 +0000 UTC m=+1528.385427514" Mar 09 13:24:14 crc kubenswrapper[4723]: I0309 13:24:14.401727 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.401664133 podStartE2EDuration="2.401664133s" podCreationTimestamp="2026-03-09 13:24:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:14.384235353 +0000 UTC m=+1528.398702913" watchObservedRunningTime="2026-03-09 13:24:14.401664133 +0000 UTC m=+1528.416131673" Mar 09 13:24:15 crc kubenswrapper[4723]: I0309 13:24:15.373428 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"224c4d37-323a-4d7c-9b7c-c284b931b6fd","Type":"ContainerStarted","Data":"4d99288f2b5c2697b613232a388a07da3602e50c10f79e0e23a3611267b2e93e"} Mar 09 13:24:15 crc kubenswrapper[4723]: I0309 13:24:15.447396 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.7:8775/\": read tcp 10.217.0.2:50310->10.217.1.7:8775: read: connection reset by peer" Mar 09 13:24:15 crc kubenswrapper[4723]: I0309 13:24:15.447396 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.7:8775/\": read tcp 10.217.0.2:50322->10.217.1.7:8775: read: connection reset by peer" Mar 09 13:24:15 crc kubenswrapper[4723]: I0309 13:24:15.978052 4723 scope.go:117] "RemoveContainer" containerID="a6f50c1b5e999531bcc92924f162bdd30b9206002c69fcaff5e650142d092799" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.100209 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.131775 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.806695149 podStartE2EDuration="8.13175595s" podCreationTimestamp="2026-03-09 13:24:08 +0000 UTC" firstStartedPulling="2026-03-09 13:24:09.252696573 +0000 UTC m=+1523.267164123" lastFinishedPulling="2026-03-09 13:24:14.577757384 +0000 UTC m=+1528.592224924" observedRunningTime="2026-03-09 13:24:15.398810483 +0000 UTC m=+1529.413278023" watchObservedRunningTime="2026-03-09 13:24:16.13175595 +0000 UTC m=+1530.146223490" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.280327 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-config-data\") pod \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.283528 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn66v\" (UniqueName: \"kubernetes.io/projected/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-kube-api-access-wn66v\") pod \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.283644 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-logs\") pod \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.283815 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-combined-ca-bundle\") pod \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.284058 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-nova-metadata-tls-certs\") pod \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\" (UID: \"faddb12d-2b02-4f85-b253-8b9ce0c6ed27\") " Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.284462 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-logs" (OuterVolumeSpecName: "logs") pod "faddb12d-2b02-4f85-b253-8b9ce0c6ed27" (UID: "faddb12d-2b02-4f85-b253-8b9ce0c6ed27"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.285223 4723 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.301880 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-kube-api-access-wn66v" (OuterVolumeSpecName: "kube-api-access-wn66v") pod "faddb12d-2b02-4f85-b253-8b9ce0c6ed27" (UID: "faddb12d-2b02-4f85-b253-8b9ce0c6ed27"). InnerVolumeSpecName "kube-api-access-wn66v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.341616 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faddb12d-2b02-4f85-b253-8b9ce0c6ed27" (UID: "faddb12d-2b02-4f85-b253-8b9ce0c6ed27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.343808 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-config-data" (OuterVolumeSpecName: "config-data") pod "faddb12d-2b02-4f85-b253-8b9ce0c6ed27" (UID: "faddb12d-2b02-4f85-b253-8b9ce0c6ed27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.394007 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.394045 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn66v\" (UniqueName: \"kubernetes.io/projected/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-kube-api-access-wn66v\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.394058 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.402059 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "faddb12d-2b02-4f85-b253-8b9ce0c6ed27" (UID: "faddb12d-2b02-4f85-b253-8b9ce0c6ed27"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.421383 4723 generic.go:334] "Generic (PLEG): container finished" podID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" containerID="037f538246ef982dda1da2494290f1ca3b27164494789f1720bebb5084dd4153" exitCode=0 Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.421450 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"faddb12d-2b02-4f85-b253-8b9ce0c6ed27","Type":"ContainerDied","Data":"037f538246ef982dda1da2494290f1ca3b27164494789f1720bebb5084dd4153"} Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.421476 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"faddb12d-2b02-4f85-b253-8b9ce0c6ed27","Type":"ContainerDied","Data":"d51c5edb5031ed5be2592673e3a82fd6c8a22f29cbbd2817e76467620b62b70f"} Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.421493 4723 scope.go:117] "RemoveContainer" containerID="037f538246ef982dda1da2494290f1ca3b27164494789f1720bebb5084dd4153" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.421617 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.430241 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.482562 4723 scope.go:117] "RemoveContainer" containerID="9bc78ed950342851717b1a9f23160704ca61116e0e5760f5073bcea572ce0fc7" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.497944 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.499541 4723 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/faddb12d-2b02-4f85-b253-8b9ce0c6ed27-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.510499 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.527770 4723 scope.go:117] "RemoveContainer" containerID="037f538246ef982dda1da2494290f1ca3b27164494789f1720bebb5084dd4153" Mar 09 13:24:16 crc kubenswrapper[4723]: E0309 13:24:16.529007 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"037f538246ef982dda1da2494290f1ca3b27164494789f1720bebb5084dd4153\": container with ID starting with 037f538246ef982dda1da2494290f1ca3b27164494789f1720bebb5084dd4153 not found: ID does not exist" containerID="037f538246ef982dda1da2494290f1ca3b27164494789f1720bebb5084dd4153" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.529052 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"037f538246ef982dda1da2494290f1ca3b27164494789f1720bebb5084dd4153"} err="failed to get container status \"037f538246ef982dda1da2494290f1ca3b27164494789f1720bebb5084dd4153\": rpc error: code = NotFound desc = could not find container \"037f538246ef982dda1da2494290f1ca3b27164494789f1720bebb5084dd4153\": container with ID starting with 037f538246ef982dda1da2494290f1ca3b27164494789f1720bebb5084dd4153 not found: ID does not exist" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.529079 4723 scope.go:117] "RemoveContainer" containerID="9bc78ed950342851717b1a9f23160704ca61116e0e5760f5073bcea572ce0fc7" Mar 09 13:24:16 crc kubenswrapper[4723]: E0309 13:24:16.530327 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bc78ed950342851717b1a9f23160704ca61116e0e5760f5073bcea572ce0fc7\": container with ID starting with 9bc78ed950342851717b1a9f23160704ca61116e0e5760f5073bcea572ce0fc7 not found: ID does not exist" containerID="9bc78ed950342851717b1a9f23160704ca61116e0e5760f5073bcea572ce0fc7" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.530362 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bc78ed950342851717b1a9f23160704ca61116e0e5760f5073bcea572ce0fc7"} err="failed to get container status \"9bc78ed950342851717b1a9f23160704ca61116e0e5760f5073bcea572ce0fc7\": rpc error: code = NotFound desc = could not find container \"9bc78ed950342851717b1a9f23160704ca61116e0e5760f5073bcea572ce0fc7\": container with ID starting with 9bc78ed950342851717b1a9f23160704ca61116e0e5760f5073bcea572ce0fc7 not found: ID does not exist" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.548271 4723 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:24:16 crc kubenswrapper[4723]: E0309 13:24:16.548817 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" containerName="nova-metadata-metadata" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.548836 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" containerName="nova-metadata-metadata" Mar 09 13:24:16 crc kubenswrapper[4723]: E0309 13:24:16.548883 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" containerName="nova-metadata-log" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.548889 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" containerName="nova-metadata-log" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.549122 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" containerName="nova-metadata-log" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.549140 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" containerName="nova-metadata-metadata" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.550449 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.555371 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.555559 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.576413 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.705355 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdae6e1e-9f23-495b-aeef-2a457377db3a-config-data\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.706142 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdae6e1e-9f23-495b-aeef-2a457377db3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.706406 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wgtc\" (UniqueName: \"kubernetes.io/projected/fdae6e1e-9f23-495b-aeef-2a457377db3a-kube-api-access-7wgtc\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.706602 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdae6e1e-9f23-495b-aeef-2a457377db3a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 
13:24:16.706637 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdae6e1e-9f23-495b-aeef-2a457377db3a-logs\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.809502 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wgtc\" (UniqueName: \"kubernetes.io/projected/fdae6e1e-9f23-495b-aeef-2a457377db3a-kube-api-access-7wgtc\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.809626 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdae6e1e-9f23-495b-aeef-2a457377db3a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.809656 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdae6e1e-9f23-495b-aeef-2a457377db3a-logs\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.809762 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdae6e1e-9f23-495b-aeef-2a457377db3a-config-data\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.809814 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdae6e1e-9f23-495b-aeef-2a457377db3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.810829 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdae6e1e-9f23-495b-aeef-2a457377db3a-logs\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.813851 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdae6e1e-9f23-495b-aeef-2a457377db3a-config-data\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.814210 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdae6e1e-9f23-495b-aeef-2a457377db3a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.814213 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdae6e1e-9f23-495b-aeef-2a457377db3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 
13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.830856 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wgtc\" (UniqueName: \"kubernetes.io/projected/fdae6e1e-9f23-495b-aeef-2a457377db3a-kube-api-access-7wgtc\") pod \"nova-metadata-0\" (UID: \"fdae6e1e-9f23-495b-aeef-2a457377db3a\") " pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.870241 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:24:16 crc kubenswrapper[4723]: I0309 13:24:16.895135 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faddb12d-2b02-4f85-b253-8b9ce0c6ed27" path="/var/lib/kubelet/pods/faddb12d-2b02-4f85-b253-8b9ce0c6ed27/volumes" Mar 09 13:24:17 crc kubenswrapper[4723]: I0309 13:24:17.457249 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:24:17 crc kubenswrapper[4723]: W0309 13:24:17.467349 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdae6e1e_9f23_495b_aeef_2a457377db3a.slice/crio-838fbbd4fe2ec08c3763ebb56d021c48e4be75ce81b91dad262f122151775e02 WatchSource:0}: Error finding container 838fbbd4fe2ec08c3763ebb56d021c48e4be75ce81b91dad262f122151775e02: Status 404 returned error can't find the container with id 838fbbd4fe2ec08c3763ebb56d021c48e4be75ce81b91dad262f122151775e02 Mar 09 13:24:17 crc kubenswrapper[4723]: I0309 13:24:17.865348 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.207468 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.346257 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-config-data\") pod \"0637c672-5bcf-44ec-add1-638ce6065b6e\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.346369 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-scripts\") pod \"0637c672-5bcf-44ec-add1-638ce6065b6e\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.346439 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzlfs\" (UniqueName: \"kubernetes.io/projected/0637c672-5bcf-44ec-add1-638ce6065b6e-kube-api-access-jzlfs\") pod \"0637c672-5bcf-44ec-add1-638ce6065b6e\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.346555 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-combined-ca-bundle\") pod \"0637c672-5bcf-44ec-add1-638ce6065b6e\" (UID: \"0637c672-5bcf-44ec-add1-638ce6065b6e\") " Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.355203 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-scripts" (OuterVolumeSpecName: "scripts") pod "0637c672-5bcf-44ec-add1-638ce6065b6e" (UID: "0637c672-5bcf-44ec-add1-638ce6065b6e"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.355212 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0637c672-5bcf-44ec-add1-638ce6065b6e-kube-api-access-jzlfs" (OuterVolumeSpecName: "kube-api-access-jzlfs") pod "0637c672-5bcf-44ec-add1-638ce6065b6e" (UID: "0637c672-5bcf-44ec-add1-638ce6065b6e"). InnerVolumeSpecName "kube-api-access-jzlfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.449524 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.449557 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzlfs\" (UniqueName: \"kubernetes.io/projected/0637c672-5bcf-44ec-add1-638ce6065b6e-kube-api-access-jzlfs\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.484407 4723 generic.go:334] "Generic (PLEG): container finished" podID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerID="976238b04ee9ccad82abc3e653cc56b6e1908493cb78efe9690bc684d20643f8" exitCode=137 Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.484462 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0637c672-5bcf-44ec-add1-638ce6065b6e","Type":"ContainerDied","Data":"976238b04ee9ccad82abc3e653cc56b6e1908493cb78efe9690bc684d20643f8"} Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.484507 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0637c672-5bcf-44ec-add1-638ce6065b6e","Type":"ContainerDied","Data":"510b9a27bab141e2aa97c63779d76a99eec4168d719163a076701378a4c88253"} Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.484523 4723 scope.go:117] "RemoveContainer" containerID="976238b04ee9ccad82abc3e653cc56b6e1908493cb78efe9690bc684d20643f8" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.484521 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.489777 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fdae6e1e-9f23-495b-aeef-2a457377db3a","Type":"ContainerStarted","Data":"432d0d80bd169b168da4f98387789ff5f5199d225ea96cfd2dd65d8d0e2c5624"} Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.489821 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fdae6e1e-9f23-495b-aeef-2a457377db3a","Type":"ContainerStarted","Data":"e83678530203f583ed359cde32fe7e1a1751e3774bd896d3362d967fd1eea301"} Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.489832 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fdae6e1e-9f23-495b-aeef-2a457377db3a","Type":"ContainerStarted","Data":"838fbbd4fe2ec08c3763ebb56d021c48e4be75ce81b91dad262f122151775e02"} Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.519049 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.519025095 podStartE2EDuration="2.519025095s" podCreationTimestamp="2026-03-09 13:24:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:18.505949561 +0000 UTC m=+1532.520417111" watchObservedRunningTime="2026-03-09 13:24:18.519025095 +0000 UTC m=+1532.533492635" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.528896 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0637c672-5bcf-44ec-add1-638ce6065b6e" (UID: "0637c672-5bcf-44ec-add1-638ce6065b6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.539007 4723 scope.go:117] "RemoveContainer" containerID="20562d9be0c7cdb6d497fe8af4504e9f7d9deb069057276c233776b1e8afa74c" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.552960 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.561531 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-config-data" (OuterVolumeSpecName: "config-data") pod "0637c672-5bcf-44ec-add1-638ce6065b6e" (UID: "0637c672-5bcf-44ec-add1-638ce6065b6e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.570595 4723 scope.go:117] "RemoveContainer" containerID="e7b95cc57d3bb1e0f5b53a049f881d06fc9bb4e3f45b3f635cbaf04da358c06c" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.609798 4723 scope.go:117] "RemoveContainer" containerID="98fd34ceecb8a13f7287803e9ecac6b659250dad38768f0624b41f08d6d156c9" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.633184 4723 scope.go:117] "RemoveContainer" containerID="976238b04ee9ccad82abc3e653cc56b6e1908493cb78efe9690bc684d20643f8" Mar 09 13:24:18 crc kubenswrapper[4723]: E0309 13:24:18.633912 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"976238b04ee9ccad82abc3e653cc56b6e1908493cb78efe9690bc684d20643f8\": container with ID starting with 976238b04ee9ccad82abc3e653cc56b6e1908493cb78efe9690bc684d20643f8 not found: ID does not exist" containerID="976238b04ee9ccad82abc3e653cc56b6e1908493cb78efe9690bc684d20643f8" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.633946 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"976238b04ee9ccad82abc3e653cc56b6e1908493cb78efe9690bc684d20643f8"} err="failed to get container status \"976238b04ee9ccad82abc3e653cc56b6e1908493cb78efe9690bc684d20643f8\": rpc error: code = NotFound desc = could not find container \"976238b04ee9ccad82abc3e653cc56b6e1908493cb78efe9690bc684d20643f8\": container with ID starting with 976238b04ee9ccad82abc3e653cc56b6e1908493cb78efe9690bc684d20643f8 not found: ID does not exist" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.633972 4723 scope.go:117] "RemoveContainer" containerID="20562d9be0c7cdb6d497fe8af4504e9f7d9deb069057276c233776b1e8afa74c" Mar 09 13:24:18 crc kubenswrapper[4723]: E0309 13:24:18.634295 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20562d9be0c7cdb6d497fe8af4504e9f7d9deb069057276c233776b1e8afa74c\": container with ID starting with 20562d9be0c7cdb6d497fe8af4504e9f7d9deb069057276c233776b1e8afa74c not found: ID does not exist" containerID="20562d9be0c7cdb6d497fe8af4504e9f7d9deb069057276c233776b1e8afa74c" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.634393 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20562d9be0c7cdb6d497fe8af4504e9f7d9deb069057276c233776b1e8afa74c"} err="failed to get container status \"20562d9be0c7cdb6d497fe8af4504e9f7d9deb069057276c233776b1e8afa74c\": rpc error: code = NotFound desc = could not find container \"20562d9be0c7cdb6d497fe8af4504e9f7d9deb069057276c233776b1e8afa74c\": container with ID starting with 20562d9be0c7cdb6d497fe8af4504e9f7d9deb069057276c233776b1e8afa74c not found: ID does not exist" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.634477 4723 scope.go:117] "RemoveContainer" containerID="e7b95cc57d3bb1e0f5b53a049f881d06fc9bb4e3f45b3f635cbaf04da358c06c" Mar 09 13:24:18 crc kubenswrapper[4723]: E0309 13:24:18.634966 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7b95cc57d3bb1e0f5b53a049f881d06fc9bb4e3f45b3f635cbaf04da358c06c\": container with ID starting with e7b95cc57d3bb1e0f5b53a049f881d06fc9bb4e3f45b3f635cbaf04da358c06c not found: ID does not exist" containerID="e7b95cc57d3bb1e0f5b53a049f881d06fc9bb4e3f45b3f635cbaf04da358c06c" Mar 09 13:24:18 crc 
kubenswrapper[4723]: I0309 13:24:18.635082 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7b95cc57d3bb1e0f5b53a049f881d06fc9bb4e3f45b3f635cbaf04da358c06c"} err="failed to get container status \"e7b95cc57d3bb1e0f5b53a049f881d06fc9bb4e3f45b3f635cbaf04da358c06c\": rpc error: code = NotFound desc = could not find container \"e7b95cc57d3bb1e0f5b53a049f881d06fc9bb4e3f45b3f635cbaf04da358c06c\": container with ID starting with e7b95cc57d3bb1e0f5b53a049f881d06fc9bb4e3f45b3f635cbaf04da358c06c not found: ID does not exist" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.635169 4723 scope.go:117] "RemoveContainer" containerID="98fd34ceecb8a13f7287803e9ecac6b659250dad38768f0624b41f08d6d156c9" Mar 09 13:24:18 crc kubenswrapper[4723]: E0309 13:24:18.635525 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98fd34ceecb8a13f7287803e9ecac6b659250dad38768f0624b41f08d6d156c9\": container with ID starting with 98fd34ceecb8a13f7287803e9ecac6b659250dad38768f0624b41f08d6d156c9 not found: ID does not exist" containerID="98fd34ceecb8a13f7287803e9ecac6b659250dad38768f0624b41f08d6d156c9" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.635569 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98fd34ceecb8a13f7287803e9ecac6b659250dad38768f0624b41f08d6d156c9"} err="failed to get container status \"98fd34ceecb8a13f7287803e9ecac6b659250dad38768f0624b41f08d6d156c9\": rpc error: code = NotFound desc = could not find container \"98fd34ceecb8a13f7287803e9ecac6b659250dad38768f0624b41f08d6d156c9\": container with ID starting with 98fd34ceecb8a13f7287803e9ecac6b659250dad38768f0624b41f08d6d156c9 not found: ID does not exist" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.655538 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0637c672-5bcf-44ec-add1-638ce6065b6e-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.824759 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.840027 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.852742 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 09 13:24:18 crc kubenswrapper[4723]: E0309 13:24:18.853464 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-api" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.853490 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-api" Mar 09 13:24:18 crc kubenswrapper[4723]: E0309 13:24:18.853507 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-notifier" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.853515 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-notifier" Mar 09 13:24:18 crc kubenswrapper[4723]: E0309 13:24:18.853542 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-evaluator" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.853552 4723 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-evaluator" Mar 09 13:24:18 crc kubenswrapper[4723]: E0309 13:24:18.853566 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-listener" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.853573 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-listener" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.853848 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-listener" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.853904 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-evaluator" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.853917 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-api" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.853946 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" containerName="aodh-notifier" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.856439 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.860387 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.860510 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-97c45" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.860728 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.860904 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.861096 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.904343 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0637c672-5bcf-44ec-add1-638ce6065b6e" path="/var/lib/kubelet/pods/0637c672-5bcf-44ec-add1-638ce6065b6e/volumes" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.906535 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.962488 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-public-tls-certs\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.962568 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-internal-tls-certs\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.962625 4723 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-scripts\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.962937 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.962966 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lp4j\" (UniqueName: \"kubernetes.io/projected/449e6144-ad49-44a8-ad79-809de89fa5c6-kube-api-access-7lp4j\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:18 crc kubenswrapper[4723]: I0309 13:24:18.963026 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-config-data\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:19 crc kubenswrapper[4723]: I0309 13:24:19.064615 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-config-data\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:19 crc kubenswrapper[4723]: I0309 13:24:19.065089 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-public-tls-certs\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:19 crc kubenswrapper[4723]: I0309 13:24:19.065176 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-internal-tls-certs\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:19 crc kubenswrapper[4723]: I0309 13:24:19.065240 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-scripts\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:19 crc kubenswrapper[4723]: I0309 13:24:19.065528 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:19 crc kubenswrapper[4723]: I0309 13:24:19.065559 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lp4j\" (UniqueName: \"kubernetes.io/projected/449e6144-ad49-44a8-ad79-809de89fa5c6-kube-api-access-7lp4j\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:19 crc kubenswrapper[4723]: I0309 13:24:19.069751 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-internal-tls-certs\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:19 crc kubenswrapper[4723]: I0309 13:24:19.069790 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:19 crc kubenswrapper[4723]: I0309 13:24:19.073203 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-scripts\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:19 crc kubenswrapper[4723]: I0309 13:24:19.085154 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lp4j\" (UniqueName: \"kubernetes.io/projected/449e6144-ad49-44a8-ad79-809de89fa5c6-kube-api-access-7lp4j\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:19 crc kubenswrapper[4723]: I0309 13:24:19.086158 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-config-data\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:19 crc kubenswrapper[4723]: I0309 13:24:19.090634 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-public-tls-certs\") pod \"aodh-0\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " pod="openstack/aodh-0" Mar 09 13:24:19 crc kubenswrapper[4723]: I0309 13:24:19.197197 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 09 13:24:19 crc kubenswrapper[4723]: I0309 13:24:19.761328 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 09 13:24:20 crc kubenswrapper[4723]: I0309 13:24:20.525592 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"449e6144-ad49-44a8-ad79-809de89fa5c6","Type":"ContainerStarted","Data":"2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0"} Mar 09 13:24:20 crc kubenswrapper[4723]: I0309 13:24:20.526151 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"449e6144-ad49-44a8-ad79-809de89fa5c6","Type":"ContainerStarted","Data":"cdd137e6d9c30f707f9511233d98b502ae4971ba7f018e76318e524c8bf7e3f0"} Mar 09 13:24:21 crc kubenswrapper[4723]: I0309 13:24:21.547847 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"449e6144-ad49-44a8-ad79-809de89fa5c6","Type":"ContainerStarted","Data":"78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d"} Mar 09 13:24:21 crc kubenswrapper[4723]: I0309 13:24:21.871232 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 13:24:21 crc kubenswrapper[4723]: I0309 13:24:21.871295 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 13:24:21 crc kubenswrapper[4723]: I0309 13:24:21.973755 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:24:21 crc kubenswrapper[4723]: I0309 13:24:21.974135 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:24:22 crc kubenswrapper[4723]: I0309 13:24:22.562255 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"449e6144-ad49-44a8-ad79-809de89fa5c6","Type":"ContainerStarted","Data":"1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee"} Mar 09 13:24:22 crc kubenswrapper[4723]: I0309 13:24:22.861724 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 13:24:22 crc kubenswrapper[4723]: I0309 13:24:22.896396 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 13:24:22 crc kubenswrapper[4723]: I0309 13:24:22.987208 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f73e368c-e54c-4f8f-9e50-857e5e72f8ce" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.14:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 13:24:22 crc kubenswrapper[4723]: I0309 13:24:22.987431 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f73e368c-e54c-4f8f-9e50-857e5e72f8ce" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.14:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 13:24:23 crc kubenswrapper[4723]: I0309 13:24:23.577343 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"449e6144-ad49-44a8-ad79-809de89fa5c6","Type":"ContainerStarted","Data":"614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c"} Mar 09 13:24:23 crc kubenswrapper[4723]: I0309 13:24:23.618152 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" 
podStartSLOduration=3.028146752 podStartE2EDuration="5.618131321s" podCreationTimestamp="2026-03-09 13:24:18 +0000 UTC" firstStartedPulling="2026-03-09 13:24:19.747983544 +0000 UTC m=+1533.762451074" lastFinishedPulling="2026-03-09 13:24:22.337968093 +0000 UTC m=+1536.352435643" observedRunningTime="2026-03-09 13:24:23.600670101 +0000 UTC m=+1537.615137661" watchObservedRunningTime="2026-03-09 13:24:23.618131321 +0000 UTC m=+1537.632598871" Mar 09 13:24:23 crc kubenswrapper[4723]: I0309 13:24:23.635618 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 13:24:26 crc kubenswrapper[4723]: I0309 13:24:26.871444 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 13:24:26 crc kubenswrapper[4723]: I0309 13:24:26.872005 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 13:24:27 crc kubenswrapper[4723]: I0309 13:24:27.888322 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fdae6e1e-9f23-495b-aeef-2a457377db3a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.16:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 13:24:27 crc kubenswrapper[4723]: I0309 13:24:27.888519 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fdae6e1e-9f23-495b-aeef-2a457377db3a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.16:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 13:24:31 crc kubenswrapper[4723]: I0309 13:24:31.981807 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 13:24:31 crc kubenswrapper[4723]: I0309 13:24:31.982985 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 13:24:31 crc kubenswrapper[4723]: I0309 13:24:31.989292 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 13:24:31 crc kubenswrapper[4723]: I0309 13:24:31.991034 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 13:24:32 crc kubenswrapper[4723]: I0309 13:24:32.715421 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 13:24:32 crc kubenswrapper[4723]: I0309 13:24:32.720820 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 13:24:33 crc kubenswrapper[4723]: I0309 13:24:33.947097 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:24:33 crc kubenswrapper[4723]: I0309 13:24:33.947410 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:24:36 crc kubenswrapper[4723]: I0309 13:24:36.875002 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-metadata-0" Mar 09 13:24:36 crc kubenswrapper[4723]: I0309 13:24:36.878791 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 13:24:36 crc kubenswrapper[4723]: I0309 13:24:36.880513 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 13:24:37 crc kubenswrapper[4723]: I0309 13:24:37.793775 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 13:24:38 crc kubenswrapper[4723]: I0309 13:24:38.694843 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 13:24:42 crc kubenswrapper[4723]: I0309 13:24:42.966481 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:24:42 crc kubenswrapper[4723]: I0309 13:24:42.967329 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="85a79034-ec88-4dfb-9714-0630d9637c3b" containerName="kube-state-metrics" containerID="cri-o://741ecb501e21c591eab9cbe41217a7c4b863f0bfc7046701176cee2c13ed487e" gracePeriod=30 Mar 09 13:24:43 crc kubenswrapper[4723]: I0309 13:24:43.176187 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 13:24:43 crc kubenswrapper[4723]: I0309 13:24:43.176654 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="a9fc4bb5-d4e9-46f7-b213-a64914a27ee9" containerName="mysqld-exporter" containerID="cri-o://3564f49b4f404d76e803fe48113d380ef0fa7e1f2c45ba842d9ab8158d0e505e" gracePeriod=30 Mar 09 13:24:43 crc kubenswrapper[4723]: I0309 13:24:43.865976 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:24:43 crc kubenswrapper[4723]: I0309 13:24:43.870347 4723 generic.go:334] "Generic (PLEG): container finished" podID="a9fc4bb5-d4e9-46f7-b213-a64914a27ee9" containerID="3564f49b4f404d76e803fe48113d380ef0fa7e1f2c45ba842d9ab8158d0e505e" exitCode=2 Mar 09 13:24:43 crc kubenswrapper[4723]: I0309 13:24:43.870440 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9","Type":"ContainerDied","Data":"3564f49b4f404d76e803fe48113d380ef0fa7e1f2c45ba842d9ab8158d0e505e"} Mar 09 13:24:43 crc kubenswrapper[4723]: I0309 13:24:43.907156 4723 generic.go:334] "Generic (PLEG): container finished" podID="85a79034-ec88-4dfb-9714-0630d9637c3b" containerID="741ecb501e21c591eab9cbe41217a7c4b863f0bfc7046701176cee2c13ed487e" exitCode=2 Mar 09 13:24:43 crc kubenswrapper[4723]: I0309 13:24:43.907209 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"85a79034-ec88-4dfb-9714-0630d9637c3b","Type":"ContainerDied","Data":"741ecb501e21c591eab9cbe41217a7c4b863f0bfc7046701176cee2c13ed487e"} Mar 09 13:24:43 crc kubenswrapper[4723]: I0309 13:24:43.907258 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"85a79034-ec88-4dfb-9714-0630d9637c3b","Type":"ContainerDied","Data":"b9625e652dc29aa4132b44f35c3022ce612b2a17e125dce85970f3a343c7fdff"} Mar 09 13:24:43 crc kubenswrapper[4723]: I0309 13:24:43.907274 4723 scope.go:117] "RemoveContainer" containerID="741ecb501e21c591eab9cbe41217a7c4b863f0bfc7046701176cee2c13ed487e" Mar 09 13:24:43 crc kubenswrapper[4723]: I0309 13:24:43.907765 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.000171 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.004610 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9wk7\" (UniqueName: \"kubernetes.io/projected/85a79034-ec88-4dfb-9714-0630d9637c3b-kube-api-access-w9wk7\") pod \"85a79034-ec88-4dfb-9714-0630d9637c3b\" (UID: \"85a79034-ec88-4dfb-9714-0630d9637c3b\") " Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.012058 4723 scope.go:117] "RemoveContainer" containerID="741ecb501e21c591eab9cbe41217a7c4b863f0bfc7046701176cee2c13ed487e" Mar 09 13:24:44 crc kubenswrapper[4723]: E0309 13:24:44.012718 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"741ecb501e21c591eab9cbe41217a7c4b863f0bfc7046701176cee2c13ed487e\": container with ID starting with 741ecb501e21c591eab9cbe41217a7c4b863f0bfc7046701176cee2c13ed487e not found: ID does not exist" containerID="741ecb501e21c591eab9cbe41217a7c4b863f0bfc7046701176cee2c13ed487e" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.012757 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"741ecb501e21c591eab9cbe41217a7c4b863f0bfc7046701176cee2c13ed487e"} err="failed to get container status \"741ecb501e21c591eab9cbe41217a7c4b863f0bfc7046701176cee2c13ed487e\": rpc error: code = NotFound desc = could not find container \"741ecb501e21c591eab9cbe41217a7c4b863f0bfc7046701176cee2c13ed487e\": container with ID starting with 741ecb501e21c591eab9cbe41217a7c4b863f0bfc7046701176cee2c13ed487e not found: ID does not exist" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.017977 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85a79034-ec88-4dfb-9714-0630d9637c3b-kube-api-access-w9wk7" (OuterVolumeSpecName: "kube-api-access-w9wk7") pod "85a79034-ec88-4dfb-9714-0630d9637c3b" (UID: "85a79034-ec88-4dfb-9714-0630d9637c3b"). InnerVolumeSpecName "kube-api-access-w9wk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.107160 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnrkg\" (UniqueName: \"kubernetes.io/projected/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-kube-api-access-lnrkg\") pod \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\" (UID: \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\") " Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.107212 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-config-data\") pod \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\" (UID: \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\") " Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.107253 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-combined-ca-bundle\") pod \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\" (UID: \"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9\") " Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.107964 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9wk7\" (UniqueName: \"kubernetes.io/projected/85a79034-ec88-4dfb-9714-0630d9637c3b-kube-api-access-w9wk7\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.111182 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-kube-api-access-lnrkg" (OuterVolumeSpecName: "kube-api-access-lnrkg") pod "a9fc4bb5-d4e9-46f7-b213-a64914a27ee9" (UID: "a9fc4bb5-d4e9-46f7-b213-a64914a27ee9"). InnerVolumeSpecName "kube-api-access-lnrkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.152181 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9fc4bb5-d4e9-46f7-b213-a64914a27ee9" (UID: "a9fc4bb5-d4e9-46f7-b213-a64914a27ee9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.176427 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-config-data" (OuterVolumeSpecName: "config-data") pod "a9fc4bb5-d4e9-46f7-b213-a64914a27ee9" (UID: "a9fc4bb5-d4e9-46f7-b213-a64914a27ee9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.210568 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnrkg\" (UniqueName: \"kubernetes.io/projected/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-kube-api-access-lnrkg\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.211074 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.211087 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.264129 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.278346 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.291078 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:24:44 crc kubenswrapper[4723]: E0309 13:24:44.291727 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9fc4bb5-d4e9-46f7-b213-a64914a27ee9" containerName="mysqld-exporter" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.291751 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fc4bb5-d4e9-46f7-b213-a64914a27ee9" containerName="mysqld-exporter" Mar 09 13:24:44 crc kubenswrapper[4723]: E0309 13:24:44.291808 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a79034-ec88-4dfb-9714-0630d9637c3b" containerName="kube-state-metrics" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.291818 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a79034-ec88-4dfb-9714-0630d9637c3b" containerName="kube-state-metrics" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.292106 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="85a79034-ec88-4dfb-9714-0630d9637c3b" containerName="kube-state-metrics" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.292146 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9fc4bb5-d4e9-46f7-b213-a64914a27ee9" containerName="mysqld-exporter" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.293327 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.297168 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.297976 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.305490 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.415807 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9\") " pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.416026 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9\") " pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.416140 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96v6n\" (UniqueName: \"kubernetes.io/projected/7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9-kube-api-access-96v6n\") pod \"kube-state-metrics-0\" (UID: \"7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9\") " pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.416181 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9\") " pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.517874 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96v6n\" (UniqueName: \"kubernetes.io/projected/7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9-kube-api-access-96v6n\") pod \"kube-state-metrics-0\" (UID: \"7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9\") " pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.517961 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9\") " pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.518003 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9\") " pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.518142 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9\") " pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.522509 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9\") " pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.523097 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9\") " pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.525145 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9\") " pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.542250 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96v6n\" (UniqueName: \"kubernetes.io/projected/7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9-kube-api-access-96v6n\") pod \"kube-state-metrics-0\" (UID: \"7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9\") " pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.619372 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.904402 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85a79034-ec88-4dfb-9714-0630d9637c3b" path="/var/lib/kubelet/pods/85a79034-ec88-4dfb-9714-0630d9637c3b/volumes" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.921477 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"a9fc4bb5-d4e9-46f7-b213-a64914a27ee9","Type":"ContainerDied","Data":"251f1342a8675c3b964480b4e6450a82d9648bbc987af5ab0ff0fcf24f0e6189"} Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.921518 4723 scope.go:117] "RemoveContainer" containerID="3564f49b4f404d76e803fe48113d380ef0fa7e1f2c45ba842d9ab8158d0e505e" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.921763 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.948793 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.959655 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.973031 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.974656 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.977612 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.983177 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Mar 09 13:24:44 crc kubenswrapper[4723]: I0309 13:24:44.984164 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.038433 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/add32e59-a256-4ac8-9e96-8f340b9119de-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"add32e59-a256-4ac8-9e96-8f340b9119de\") " pod="openstack/mysqld-exporter-0" Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.038734 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dngg\" (UniqueName: \"kubernetes.io/projected/add32e59-a256-4ac8-9e96-8f340b9119de-kube-api-access-7dngg\") pod \"mysqld-exporter-0\" (UID: \"add32e59-a256-4ac8-9e96-8f340b9119de\") " pod="openstack/mysqld-exporter-0" Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.038843 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add32e59-a256-4ac8-9e96-8f340b9119de-config-data\") pod \"mysqld-exporter-0\" (UID: \"add32e59-a256-4ac8-9e96-8f340b9119de\") " pod="openstack/mysqld-exporter-0" Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.039168 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add32e59-a256-4ac8-9e96-8f340b9119de-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"add32e59-a256-4ac8-9e96-8f340b9119de\") " pod="openstack/mysqld-exporter-0" Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.095692 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.141999 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/add32e59-a256-4ac8-9e96-8f340b9119de-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"add32e59-a256-4ac8-9e96-8f340b9119de\") " pod="openstack/mysqld-exporter-0" Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.142428 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add32e59-a256-4ac8-9e96-8f340b9119de-config-data\") pod \"mysqld-exporter-0\" (UID: \"add32e59-a256-4ac8-9e96-8f340b9119de\") " pod="openstack/mysqld-exporter-0" Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.142452 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dngg\" (UniqueName: \"kubernetes.io/projected/add32e59-a256-4ac8-9e96-8f340b9119de-kube-api-access-7dngg\") pod \"mysqld-exporter-0\" (UID: \"add32e59-a256-4ac8-9e96-8f340b9119de\") " pod="openstack/mysqld-exporter-0" Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.142551 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add32e59-a256-4ac8-9e96-8f340b9119de-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"add32e59-a256-4ac8-9e96-8f340b9119de\") " pod="openstack/mysqld-exporter-0" Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.149082 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add32e59-a256-4ac8-9e96-8f340b9119de-config-data\") pod \"mysqld-exporter-0\" (UID: \"add32e59-a256-4ac8-9e96-8f340b9119de\") " pod="openstack/mysqld-exporter-0" Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.149141 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/add32e59-a256-4ac8-9e96-8f340b9119de-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"add32e59-a256-4ac8-9e96-8f340b9119de\") " pod="openstack/mysqld-exporter-0" Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.149305 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add32e59-a256-4ac8-9e96-8f340b9119de-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"add32e59-a256-4ac8-9e96-8f340b9119de\") " pod="openstack/mysqld-exporter-0" Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.167028 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dngg\" (UniqueName: \"kubernetes.io/projected/add32e59-a256-4ac8-9e96-8f340b9119de-kube-api-access-7dngg\") pod \"mysqld-exporter-0\" (UID: \"add32e59-a256-4ac8-9e96-8f340b9119de\") " pod="openstack/mysqld-exporter-0" Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.293368 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.772600 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.773270 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="ceilometer-central-agent" containerID="cri-o://48fcaf846f4d2a86691b857cd5e78789c74342699162b59cf13aad047d5bfef3" gracePeriod=30 Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.773849 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="proxy-httpd" containerID="cri-o://4d99288f2b5c2697b613232a388a07da3602e50c10f79e0e23a3611267b2e93e" gracePeriod=30 Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.773944 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="sg-core" containerID="cri-o://487d97bd65f79a60ffd5e591cec1d6aa684f5f53f5c7f26507d8c39f3243625f" gracePeriod=30 Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.773981 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="ceilometer-notification-agent" containerID="cri-o://d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb" gracePeriod=30 Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.828447 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.954445 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9","Type":"ContainerStarted","Data":"b5688457d51bfad56b87d4edeb884875657fc70d9c78ea253d8fcb5b626cbbec"} Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.954513 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9","Type":"ContainerStarted","Data":"a7c455b8510844b6b747bf7d294153fdfd43e3fdc69cdec5ce5fbed632e6d9d5"} Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.954540 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.959983 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"add32e59-a256-4ac8-9e96-8f340b9119de","Type":"ContainerStarted","Data":"450c0b57f19504176fa2013ba8e28baa59033419e258fda4cf5d4835d3ba6d1f"} Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.964500 4723 generic.go:334] "Generic (PLEG): container finished" podID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerID="487d97bd65f79a60ffd5e591cec1d6aa684f5f53f5c7f26507d8c39f3243625f" exitCode=2 Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.964570 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"224c4d37-323a-4d7c-9b7c-c284b931b6fd","Type":"ContainerDied","Data":"487d97bd65f79a60ffd5e591cec1d6aa684f5f53f5c7f26507d8c39f3243625f"} Mar 09 13:24:45 crc kubenswrapper[4723]: I0309 13:24:45.971531 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/kube-state-metrics-0" podStartSLOduration=1.5468235479999999 podStartE2EDuration="1.971512122s" podCreationTimestamp="2026-03-09 13:24:44 +0000 UTC" firstStartedPulling="2026-03-09 13:24:45.087179189 +0000 UTC m=+1559.101646729" lastFinishedPulling="2026-03-09 13:24:45.511867763 +0000 UTC m=+1559.526335303" observedRunningTime="2026-03-09 13:24:45.971461001 +0000 UTC m=+1559.985928551" watchObservedRunningTime="2026-03-09 13:24:45.971512122 +0000 UTC m=+1559.985979662" Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.004050 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9fc4bb5-d4e9-46f7-b213-a64914a27ee9" path="/var/lib/kubelet/pods/a9fc4bb5-d4e9-46f7-b213-a64914a27ee9/volumes" Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.013665 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"add32e59-a256-4ac8-9e96-8f340b9119de","Type":"ContainerStarted","Data":"18f51e3e19fe9ab06e185c60dc90c6b14184044d7a802dc7c8afd1809080845f"} Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.019947 4723 generic.go:334] "Generic (PLEG): container finished" podID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerID="4d99288f2b5c2697b613232a388a07da3602e50c10f79e0e23a3611267b2e93e" exitCode=0 Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.019986 4723 generic.go:334] "Generic (PLEG): container finished" podID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerID="48fcaf846f4d2a86691b857cd5e78789c74342699162b59cf13aad047d5bfef3" exitCode=0 Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.020865 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"224c4d37-323a-4d7c-9b7c-c284b931b6fd","Type":"ContainerDied","Data":"4d99288f2b5c2697b613232a388a07da3602e50c10f79e0e23a3611267b2e93e"} Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.020914 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"224c4d37-323a-4d7c-9b7c-c284b931b6fd","Type":"ContainerDied","Data":"48fcaf846f4d2a86691b857cd5e78789c74342699162b59cf13aad047d5bfef3"} Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.050626 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.475461843 podStartE2EDuration="3.050606005s" podCreationTimestamp="2026-03-09 13:24:44 +0000 UTC" firstStartedPulling="2026-03-09 13:24:45.832648132 +0000 UTC m=+1559.847115672" lastFinishedPulling="2026-03-09 13:24:46.407792294 +0000 UTC m=+1560.422259834" observedRunningTime="2026-03-09 13:24:47.035489455 +0000 UTC m=+1561.049956995" watchObservedRunningTime="2026-03-09 13:24:47.050606005 +0000 UTC m=+1561.065073545" Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.544499 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nb5mx"] Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.547257 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.561847 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb5mx"] Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.632765 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/959952f4-8104-4bc2-988d-906fe1ea3662-utilities\") pod \"redhat-marketplace-nb5mx\" (UID: \"959952f4-8104-4bc2-988d-906fe1ea3662\") " pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.632854 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrvbf\" (UniqueName: \"kubernetes.io/projected/959952f4-8104-4bc2-988d-906fe1ea3662-kube-api-access-hrvbf\") pod \"redhat-marketplace-nb5mx\" (UID: \"959952f4-8104-4bc2-988d-906fe1ea3662\") " pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.632933 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/959952f4-8104-4bc2-988d-906fe1ea3662-catalog-content\") pod \"redhat-marketplace-nb5mx\" (UID: \"959952f4-8104-4bc2-988d-906fe1ea3662\") " pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.735419 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/959952f4-8104-4bc2-988d-906fe1ea3662-utilities\") pod \"redhat-marketplace-nb5mx\" (UID: \"959952f4-8104-4bc2-988d-906fe1ea3662\") " pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.735487 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrvbf\" (UniqueName: \"kubernetes.io/projected/959952f4-8104-4bc2-988d-906fe1ea3662-kube-api-access-hrvbf\") pod \"redhat-marketplace-nb5mx\" (UID: \"959952f4-8104-4bc2-988d-906fe1ea3662\") " pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.735529 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/959952f4-8104-4bc2-988d-906fe1ea3662-catalog-content\") pod \"redhat-marketplace-nb5mx\" (UID: \"959952f4-8104-4bc2-988d-906fe1ea3662\") " pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.735981 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/959952f4-8104-4bc2-988d-906fe1ea3662-utilities\") pod \"redhat-marketplace-nb5mx\" (UID: \"959952f4-8104-4bc2-988d-906fe1ea3662\") " pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.736122 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/959952f4-8104-4bc2-988d-906fe1ea3662-catalog-content\") pod \"redhat-marketplace-nb5mx\" (UID: \"959952f4-8104-4bc2-988d-906fe1ea3662\") " pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.754053 4723 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hrvbf\" (UniqueName: \"kubernetes.io/projected/959952f4-8104-4bc2-988d-906fe1ea3662-kube-api-access-hrvbf\") pod \"redhat-marketplace-nb5mx\" (UID: \"959952f4-8104-4bc2-988d-906fe1ea3662\") " pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:24:47 crc kubenswrapper[4723]: I0309 13:24:47.870348 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:24:48 crc kubenswrapper[4723]: I0309 13:24:48.459281 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb5mx"] Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.051981 4723 generic.go:334] "Generic (PLEG): container finished" podID="959952f4-8104-4bc2-988d-906fe1ea3662" containerID="34c875e78833c8bfc74e76e13af608804248e22f883b5d05ad4d07e2af019b24" exitCode=0 Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.052159 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb5mx" event={"ID":"959952f4-8104-4bc2-988d-906fe1ea3662","Type":"ContainerDied","Data":"34c875e78833c8bfc74e76e13af608804248e22f883b5d05ad4d07e2af019b24"} Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.052865 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb5mx" event={"ID":"959952f4-8104-4bc2-988d-906fe1ea3662","Type":"ContainerStarted","Data":"69622cef8b65b42a782591fc8001bb42be25fbe651d28a3227cc3947e70a0542"} Mar 09 13:24:49 crc kubenswrapper[4723]: E0309 13:24:49.351714 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod224c4d37_323a_4d7c_9b7c_c284b931b6fd.slice/crio-conmon-d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod224c4d37_323a_4d7c_9b7c_c284b931b6fd.slice/crio-d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:24:49 crc kubenswrapper[4723]: E0309 13:24:49.351867 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod224c4d37_323a_4d7c_9b7c_c284b931b6fd.slice/crio-conmon-d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod224c4d37_323a_4d7c_9b7c_c284b931b6fd.slice/crio-d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.737925 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.800596 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crn4h\" (UniqueName: \"kubernetes.io/projected/224c4d37-323a-4d7c-9b7c-c284b931b6fd-kube-api-access-crn4h\") pod \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.800641 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/224c4d37-323a-4d7c-9b7c-c284b931b6fd-run-httpd\") pod \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.800688 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-config-data\") pod \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.800778 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-sg-core-conf-yaml\") pod \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.800838 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-combined-ca-bundle\") pod \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.800930 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-scripts\") pod \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.801017 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/224c4d37-323a-4d7c-9b7c-c284b931b6fd-log-httpd\") pod \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\" (UID: \"224c4d37-323a-4d7c-9b7c-c284b931b6fd\") " Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.803121 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/224c4d37-323a-4d7c-9b7c-c284b931b6fd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "224c4d37-323a-4d7c-9b7c-c284b931b6fd" (UID: "224c4d37-323a-4d7c-9b7c-c284b931b6fd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.807436 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/224c4d37-323a-4d7c-9b7c-c284b931b6fd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "224c4d37-323a-4d7c-9b7c-c284b931b6fd" (UID: "224c4d37-323a-4d7c-9b7c-c284b931b6fd"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.809300 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-scripts" (OuterVolumeSpecName: "scripts") pod "224c4d37-323a-4d7c-9b7c-c284b931b6fd" (UID: "224c4d37-323a-4d7c-9b7c-c284b931b6fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.825929 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/224c4d37-323a-4d7c-9b7c-c284b931b6fd-kube-api-access-crn4h" (OuterVolumeSpecName: "kube-api-access-crn4h") pod "224c4d37-323a-4d7c-9b7c-c284b931b6fd" (UID: "224c4d37-323a-4d7c-9b7c-c284b931b6fd"). InnerVolumeSpecName "kube-api-access-crn4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.856376 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "224c4d37-323a-4d7c-9b7c-c284b931b6fd" (UID: "224c4d37-323a-4d7c-9b7c-c284b931b6fd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.916037 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crn4h\" (UniqueName: \"kubernetes.io/projected/224c4d37-323a-4d7c-9b7c-c284b931b6fd-kube-api-access-crn4h\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.916165 4723 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/224c4d37-323a-4d7c-9b7c-c284b931b6fd-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.916676 4723 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.916691 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.916700 4723 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/224c4d37-323a-4d7c-9b7c-c284b931b6fd-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.948285 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-config-data" (OuterVolumeSpecName: "config-data") pod "224c4d37-323a-4d7c-9b7c-c284b931b6fd" (UID: "224c4d37-323a-4d7c-9b7c-c284b931b6fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:49 crc kubenswrapper[4723]: I0309 13:24:49.951553 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "224c4d37-323a-4d7c-9b7c-c284b931b6fd" (UID: "224c4d37-323a-4d7c-9b7c-c284b931b6fd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.019223 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.019259 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/224c4d37-323a-4d7c-9b7c-c284b931b6fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.068193 4723 generic.go:334] "Generic (PLEG): container finished" podID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerID="d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb" exitCode=0 Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.068282 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.068288 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"224c4d37-323a-4d7c-9b7c-c284b931b6fd","Type":"ContainerDied","Data":"d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb"} Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.068354 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"224c4d37-323a-4d7c-9b7c-c284b931b6fd","Type":"ContainerDied","Data":"ba678d7b397c0f90a7b39b35ba0c8033a808675a5529cb8eb07a76fd24ed8abf"} Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.068383 4723 scope.go:117] "RemoveContainer" containerID="4d99288f2b5c2697b613232a388a07da3602e50c10f79e0e23a3611267b2e93e" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.070847 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb5mx" event={"ID":"959952f4-8104-4bc2-988d-906fe1ea3662","Type":"ContainerStarted","Data":"4ae5c90b5639d2a29447ccddba7ce528eb13cb46ae1698834f463c210c0969c6"} Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.091113 4723 scope.go:117] "RemoveContainer" containerID="487d97bd65f79a60ffd5e591cec1d6aa684f5f53f5c7f26507d8c39f3243625f" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.124953 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.127103 4723 scope.go:117] "RemoveContainer" containerID="d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.134783 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.149452 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:24:50 crc kubenswrapper[4723]: E0309 13:24:50.150117 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="ceilometer-notification-agent" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.150168 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="ceilometer-notification-agent" Mar 09 13:24:50 crc kubenswrapper[4723]: E0309 13:24:50.150202 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="proxy-httpd" Mar 09 13:24:50 crc 
kubenswrapper[4723]: I0309 13:24:50.150211 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="proxy-httpd" Mar 09 13:24:50 crc kubenswrapper[4723]: E0309 13:24:50.150230 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="sg-core" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.150239 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="sg-core" Mar 09 13:24:50 crc kubenswrapper[4723]: E0309 13:24:50.150268 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="ceilometer-central-agent" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.150275 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="ceilometer-central-agent" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.150554 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="proxy-httpd" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.150595 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="ceilometer-central-agent" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.150611 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="ceilometer-notification-agent" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.150639 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" containerName="sg-core" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.153531 4723 util.go:30] "No sandbox for pod can be found. 
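The cpu_manager/state_mem/memory_manager burst above is the kubelet clearing per-container resource bookkeeping for the old ceilometer-0 UID before admitting its replacement: assignments are keyed by pod UID plus container name, so the stale UID's entries must be dropped even though a pod of the same name is about to return. An illustrative Go sketch of that keying (not kubelet code; the CPUSet values are made up):

    package main

    import "fmt"

    // assignmentKey mirrors how per-container state is addressed:
    // pod UID plus container name.
    type assignmentKey struct {
    	podUID    string
    	container string
    }

    func main() {
    	staleUID := "224c4d37-323a-4d7c-9b7c-c284b931b6fd"
    	cpuSets := map[assignmentKey]string{
    		{staleUID, "sg-core"}:     "shared pool",
    		{staleUID, "proxy-httpd"}: "shared pool",
    	}
    	// RemoveStaleState: the old UID no longer exists on the API server,
    	// so every entry under it is deleted before the new UID is admitted.
    	for k := range cpuSets {
    		if k.podUID == staleUID {
    			delete(cpuSets, k)
    		}
    	}
    	fmt.Println(len(cpuSets)) // 0
    }

That the "RemoveStaleState: removing container" lines log at error level (E0309) while the matching deletions log at info level appears to be a logging-severity quirk rather than an actual failure; the cleanup entries that follow show it completing normally.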
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.156714 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.156986 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.157166 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.165528 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.186563 4723 scope.go:117] "RemoveContainer" containerID="48fcaf846f4d2a86691b857cd5e78789c74342699162b59cf13aad047d5bfef3" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.215266 4723 scope.go:117] "RemoveContainer" containerID="4d99288f2b5c2697b613232a388a07da3602e50c10f79e0e23a3611267b2e93e" Mar 09 13:24:50 crc kubenswrapper[4723]: E0309 13:24:50.215773 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d99288f2b5c2697b613232a388a07da3602e50c10f79e0e23a3611267b2e93e\": container with ID starting with 4d99288f2b5c2697b613232a388a07da3602e50c10f79e0e23a3611267b2e93e not found: ID does not exist" containerID="4d99288f2b5c2697b613232a388a07da3602e50c10f79e0e23a3611267b2e93e" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.215818 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d99288f2b5c2697b613232a388a07da3602e50c10f79e0e23a3611267b2e93e"} err="failed to get container status \"4d99288f2b5c2697b613232a388a07da3602e50c10f79e0e23a3611267b2e93e\": rpc error: code = NotFound desc = could not find container \"4d99288f2b5c2697b613232a388a07da3602e50c10f79e0e23a3611267b2e93e\": container with ID starting with 4d99288f2b5c2697b613232a388a07da3602e50c10f79e0e23a3611267b2e93e not found: ID does not exist" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.215849 4723 scope.go:117] "RemoveContainer" containerID="487d97bd65f79a60ffd5e591cec1d6aa684f5f53f5c7f26507d8c39f3243625f" Mar 09 13:24:50 crc kubenswrapper[4723]: E0309 13:24:50.216355 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"487d97bd65f79a60ffd5e591cec1d6aa684f5f53f5c7f26507d8c39f3243625f\": container with ID starting with 487d97bd65f79a60ffd5e591cec1d6aa684f5f53f5c7f26507d8c39f3243625f not found: ID does not exist" containerID="487d97bd65f79a60ffd5e591cec1d6aa684f5f53f5c7f26507d8c39f3243625f" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.216388 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"487d97bd65f79a60ffd5e591cec1d6aa684f5f53f5c7f26507d8c39f3243625f"} err="failed to get container status \"487d97bd65f79a60ffd5e591cec1d6aa684f5f53f5c7f26507d8c39f3243625f\": rpc error: code = NotFound desc = could not find container \"487d97bd65f79a60ffd5e591cec1d6aa684f5f53f5c7f26507d8c39f3243625f\": container with ID starting with 487d97bd65f79a60ffd5e591cec1d6aa684f5f53f5c7f26507d8c39f3243625f not found: ID does not exist" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.216408 4723 scope.go:117] "RemoveContainer" containerID="d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb" Mar 09 13:24:50 
crc kubenswrapper[4723]: E0309 13:24:50.216973 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb\": container with ID starting with d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb not found: ID does not exist" containerID="d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.217006 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb"} err="failed to get container status \"d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb\": rpc error: code = NotFound desc = could not find container \"d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb\": container with ID starting with d7fc59ead6b37d6d2d496b6e1a04621a73f284489179e1e316ad2e27ff4f24bb not found: ID does not exist" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.217028 4723 scope.go:117] "RemoveContainer" containerID="48fcaf846f4d2a86691b857cd5e78789c74342699162b59cf13aad047d5bfef3" Mar 09 13:24:50 crc kubenswrapper[4723]: E0309 13:24:50.217220 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48fcaf846f4d2a86691b857cd5e78789c74342699162b59cf13aad047d5bfef3\": container with ID starting with 48fcaf846f4d2a86691b857cd5e78789c74342699162b59cf13aad047d5bfef3 not found: ID does not exist" containerID="48fcaf846f4d2a86691b857cd5e78789c74342699162b59cf13aad047d5bfef3" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.217244 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48fcaf846f4d2a86691b857cd5e78789c74342699162b59cf13aad047d5bfef3"} err="failed to get container status \"48fcaf846f4d2a86691b857cd5e78789c74342699162b59cf13aad047d5bfef3\": rpc error: code = NotFound desc = could not find container \"48fcaf846f4d2a86691b857cd5e78789c74342699162b59cf13aad047d5bfef3\": container with ID starting with 48fcaf846f4d2a86691b857cd5e78789c74342699162b59cf13aad047d5bfef3 not found: ID does not exist" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.331460 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-config-data\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.331564 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-log-httpd\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.331613 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.331650 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.331712 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.331786 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-scripts\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.331819 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghr7t\" (UniqueName: \"kubernetes.io/projected/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-kube-api-access-ghr7t\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.331842 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-run-httpd\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.433614 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-run-httpd\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.433812 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-config-data\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.433890 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-log-httpd\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.433927 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.433955 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.434005 4723 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.434055 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-scripts\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.434080 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghr7t\" (UniqueName: \"kubernetes.io/projected/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-kube-api-access-ghr7t\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.434177 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-run-httpd\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.435336 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-log-httpd\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.440284 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.440639 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.441586 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.441836 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-config-data\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.443384 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-scripts\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.453284 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-ghr7t\" (UniqueName: \"kubernetes.io/projected/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-kube-api-access-ghr7t\") pod \"ceilometer-0\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.479666 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:24:50 crc kubenswrapper[4723]: I0309 13:24:50.900399 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="224c4d37-323a-4d7c-9b7c-c284b931b6fd" path="/var/lib/kubelet/pods/224c4d37-323a-4d7c-9b7c-c284b931b6fd/volumes" Mar 09 13:24:51 crc kubenswrapper[4723]: I0309 13:24:51.083662 4723 generic.go:334] "Generic (PLEG): container finished" podID="959952f4-8104-4bc2-988d-906fe1ea3662" containerID="4ae5c90b5639d2a29447ccddba7ce528eb13cb46ae1698834f463c210c0969c6" exitCode=0 Mar 09 13:24:51 crc kubenswrapper[4723]: I0309 13:24:51.083699 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb5mx" event={"ID":"959952f4-8104-4bc2-988d-906fe1ea3662","Type":"ContainerDied","Data":"4ae5c90b5639d2a29447ccddba7ce528eb13cb46ae1698834f463c210c0969c6"} Mar 09 13:24:51 crc kubenswrapper[4723]: I0309 13:24:51.112454 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:24:51 crc kubenswrapper[4723]: W0309 13:24:51.128462 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ded56b6_9f9e_4a1d_8344_7f4a6d414da2.slice/crio-833f4666a2bdefe79a2de77132af3bcbc4ff6394ec2d4dd025a396e703ddd5fb WatchSource:0}: Error finding container 833f4666a2bdefe79a2de77132af3bcbc4ff6394ec2d4dd025a396e703ddd5fb: Status 404 returned error can't find the container with id 833f4666a2bdefe79a2de77132af3bcbc4ff6394ec2d4dd025a396e703ddd5fb Mar 09 13:24:51 crc kubenswrapper[4723]: I0309 13:24:51.767240 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-lf8bq"] Mar 09 13:24:51 crc kubenswrapper[4723]: I0309 13:24:51.785500 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-lf8bq"] Mar 09 13:24:51 crc kubenswrapper[4723]: I0309 13:24:51.881655 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-flx99"] Mar 09 13:24:51 crc kubenswrapper[4723]: I0309 13:24:51.883263 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-flx99" Mar 09 13:24:51 crc kubenswrapper[4723]: I0309 13:24:51.923263 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-flx99"] Mar 09 13:24:51 crc kubenswrapper[4723]: I0309 13:24:51.973092 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a0d87c-3e58-4839-95fb-7c964152ed7c-config-data\") pod \"heat-db-sync-flx99\" (UID: \"09a0d87c-3e58-4839-95fb-7c964152ed7c\") " pod="openstack/heat-db-sync-flx99" Mar 09 13:24:51 crc kubenswrapper[4723]: I0309 13:24:51.973454 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a0d87c-3e58-4839-95fb-7c964152ed7c-combined-ca-bundle\") pod \"heat-db-sync-flx99\" (UID: \"09a0d87c-3e58-4839-95fb-7c964152ed7c\") " pod="openstack/heat-db-sync-flx99" Mar 09 13:24:51 crc kubenswrapper[4723]: I0309 13:24:51.973546 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4crf\" (UniqueName: \"kubernetes.io/projected/09a0d87c-3e58-4839-95fb-7c964152ed7c-kube-api-access-c4crf\") pod \"heat-db-sync-flx99\" (UID: \"09a0d87c-3e58-4839-95fb-7c964152ed7c\") " pod="openstack/heat-db-sync-flx99" Mar 09 13:24:52 crc kubenswrapper[4723]: I0309 13:24:52.075833 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a0d87c-3e58-4839-95fb-7c964152ed7c-combined-ca-bundle\") pod \"heat-db-sync-flx99\" (UID: \"09a0d87c-3e58-4839-95fb-7c964152ed7c\") " pod="openstack/heat-db-sync-flx99" Mar 09 13:24:52 crc kubenswrapper[4723]: I0309 13:24:52.075930 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4crf\" (UniqueName: \"kubernetes.io/projected/09a0d87c-3e58-4839-95fb-7c964152ed7c-kube-api-access-c4crf\") pod \"heat-db-sync-flx99\" (UID: \"09a0d87c-3e58-4839-95fb-7c964152ed7c\") " pod="openstack/heat-db-sync-flx99" Mar 09 13:24:52 crc kubenswrapper[4723]: I0309 13:24:52.076098 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a0d87c-3e58-4839-95fb-7c964152ed7c-config-data\") pod \"heat-db-sync-flx99\" (UID: \"09a0d87c-3e58-4839-95fb-7c964152ed7c\") " pod="openstack/heat-db-sync-flx99" Mar 09 13:24:52 crc kubenswrapper[4723]: I0309 13:24:52.081882 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a0d87c-3e58-4839-95fb-7c964152ed7c-combined-ca-bundle\") pod \"heat-db-sync-flx99\" (UID: \"09a0d87c-3e58-4839-95fb-7c964152ed7c\") " pod="openstack/heat-db-sync-flx99" Mar 09 13:24:52 crc kubenswrapper[4723]: I0309 13:24:52.088982 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a0d87c-3e58-4839-95fb-7c964152ed7c-config-data\") pod \"heat-db-sync-flx99\" (UID: \"09a0d87c-3e58-4839-95fb-7c964152ed7c\") " pod="openstack/heat-db-sync-flx99" Mar 09 13:24:52 crc kubenswrapper[4723]: I0309 13:24:52.093104 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4crf\" (UniqueName: \"kubernetes.io/projected/09a0d87c-3e58-4839-95fb-7c964152ed7c-kube-api-access-c4crf\") pod \"heat-db-sync-flx99\" (UID: 
\"09a0d87c-3e58-4839-95fb-7c964152ed7c\") " pod="openstack/heat-db-sync-flx99" Mar 09 13:24:52 crc kubenswrapper[4723]: I0309 13:24:52.094980 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2","Type":"ContainerStarted","Data":"833f4666a2bdefe79a2de77132af3bcbc4ff6394ec2d4dd025a396e703ddd5fb"} Mar 09 13:24:52 crc kubenswrapper[4723]: I0309 13:24:52.205404 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-flx99" Mar 09 13:24:52 crc kubenswrapper[4723]: I0309 13:24:52.708936 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-flx99"] Mar 09 13:24:53 crc kubenswrapper[4723]: I0309 13:24:53.147300 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nb5mx" podStartSLOduration=3.222528034 podStartE2EDuration="6.14728253s" podCreationTimestamp="2026-03-09 13:24:47 +0000 UTC" firstStartedPulling="2026-03-09 13:24:49.054245365 +0000 UTC m=+1563.068712895" lastFinishedPulling="2026-03-09 13:24:51.978999851 +0000 UTC m=+1565.993467391" observedRunningTime="2026-03-09 13:24:53.134342508 +0000 UTC m=+1567.148810048" watchObservedRunningTime="2026-03-09 13:24:53.14728253 +0000 UTC m=+1567.161750070" Mar 09 13:24:53 crc kubenswrapper[4723]: I0309 13:24:53.179807 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90dea403-5a65-4824-ac5b-5c34c828d616" path="/var/lib/kubelet/pods/90dea403-5a65-4824-ac5b-5c34c828d616/volumes" Mar 09 13:24:53 crc kubenswrapper[4723]: I0309 13:24:53.226296 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2","Type":"ContainerStarted","Data":"674ef4261250890626ed068cd56de70f69bbe6476d52344252ccc30a4e2612b9"} Mar 09 13:24:53 crc kubenswrapper[4723]: I0309 13:24:53.226368 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-flx99" event={"ID":"09a0d87c-3e58-4839-95fb-7c964152ed7c","Type":"ContainerStarted","Data":"45fed542ca68f1a0952f67e059b3608c789ba2870c5844fc8d81cb1615c75a5f"} Mar 09 13:24:53 crc kubenswrapper[4723]: I0309 13:24:53.226407 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb5mx" event={"ID":"959952f4-8104-4bc2-988d-906fe1ea3662","Type":"ContainerStarted","Data":"aadab442fb27183d0f9a96ea5667285ac2c876d2e1b29d5d8d307d4fc32978de"} Mar 09 13:24:54 crc kubenswrapper[4723]: I0309 13:24:54.681595 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 09 13:24:54 crc kubenswrapper[4723]: I0309 13:24:54.799420 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 09 13:24:55 crc kubenswrapper[4723]: I0309 13:24:55.607804 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:24:55 crc kubenswrapper[4723]: I0309 13:24:55.869919 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:24:56 crc kubenswrapper[4723]: I0309 13:24:56.182518 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2","Type":"ContainerStarted","Data":"6c4f7025c883c3c5a5aba017bd7289bc78337de42ffa23268639b9f3c7261d00"} Mar 09 13:24:57 crc kubenswrapper[4723]: I0309 13:24:57.871034 4723 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:24:57 crc kubenswrapper[4723]: I0309 13:24:57.871738 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:24:58 crc kubenswrapper[4723]: I0309 13:24:58.235207 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2","Type":"ContainerStarted","Data":"2cea2fee01e6224a831d4f4bf7203591d3031d8950cd512479fadd9c6fcc6dd3"} Mar 09 13:24:59 crc kubenswrapper[4723]: I0309 13:24:59.015041 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-nb5mx" podUID="959952f4-8104-4bc2-988d-906fe1ea3662" containerName="registry-server" probeResult="failure" output=< Mar 09 13:24:59 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 13:24:59 crc kubenswrapper[4723]: > Mar 09 13:25:01 crc kubenswrapper[4723]: I0309 13:25:01.346128 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2","Type":"ContainerStarted","Data":"0a7c75ec59d20af361a4c913ae4e63d868a7a6ece77ba7efe7f6210dbb1af257"} Mar 09 13:25:01 crc kubenswrapper[4723]: I0309 13:25:01.347586 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="ceilometer-central-agent" containerID="cri-o://674ef4261250890626ed068cd56de70f69bbe6476d52344252ccc30a4e2612b9" gracePeriod=30 Mar 09 13:25:01 crc kubenswrapper[4723]: I0309 13:25:01.348210 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:25:01 crc kubenswrapper[4723]: I0309 13:25:01.349045 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="proxy-httpd" containerID="cri-o://0a7c75ec59d20af361a4c913ae4e63d868a7a6ece77ba7efe7f6210dbb1af257" gracePeriod=30 Mar 09 13:25:01 crc kubenswrapper[4723]: I0309 13:25:01.349125 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="sg-core" containerID="cri-o://2cea2fee01e6224a831d4f4bf7203591d3031d8950cd512479fadd9c6fcc6dd3" gracePeriod=30 Mar 09 13:25:01 crc kubenswrapper[4723]: I0309 13:25:01.349172 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="ceilometer-notification-agent" containerID="cri-o://6c4f7025c883c3c5a5aba017bd7289bc78337de42ffa23268639b9f3c7261d00" gracePeriod=30 Mar 09 13:25:01 crc kubenswrapper[4723]: I0309 13:25:01.421234 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.365333503 podStartE2EDuration="11.421211345s" podCreationTimestamp="2026-03-09 13:24:50 +0000 UTC" firstStartedPulling="2026-03-09 13:24:51.131854249 +0000 UTC m=+1565.146321789" lastFinishedPulling="2026-03-09 13:25:00.187732091 +0000 UTC m=+1574.202199631" observedRunningTime="2026-03-09 13:25:01.398388331 +0000 UTC m=+1575.412855881" watchObservedRunningTime="2026-03-09 13:25:01.421211345 +0000 UTC m=+1575.435678885" Mar 09 13:25:01 crc kubenswrapper[4723]: I0309 13:25:01.939846 4723 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" containerName="rabbitmq" containerID="cri-o://949cfd774a5ac85ffdc5516fd1299f2c3fc1e7abdb1a2335f187c11475bef008" gracePeriod=604794 Mar 09 13:25:02 crc kubenswrapper[4723]: I0309 13:25:02.268985 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-2" podUID="54210e7b-b34d-411d-93e1-e8cc3448c4b0" containerName="rabbitmq" containerID="cri-o://025b50af0a9f9816db745dfddd2c4f6971a2e8d89c088c14482796d31046a5b3" gracePeriod=604793 Mar 09 13:25:02 crc kubenswrapper[4723]: I0309 13:25:02.360906 4723 generic.go:334] "Generic (PLEG): container finished" podID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerID="2cea2fee01e6224a831d4f4bf7203591d3031d8950cd512479fadd9c6fcc6dd3" exitCode=2 Mar 09 13:25:02 crc kubenswrapper[4723]: I0309 13:25:02.360950 4723 generic.go:334] "Generic (PLEG): container finished" podID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerID="6c4f7025c883c3c5a5aba017bd7289bc78337de42ffa23268639b9f3c7261d00" exitCode=0 Mar 09 13:25:02 crc kubenswrapper[4723]: I0309 13:25:02.360962 4723 generic.go:334] "Generic (PLEG): container finished" podID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerID="674ef4261250890626ed068cd56de70f69bbe6476d52344252ccc30a4e2612b9" exitCode=0 Mar 09 13:25:02 crc kubenswrapper[4723]: I0309 13:25:02.360984 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2","Type":"ContainerDied","Data":"2cea2fee01e6224a831d4f4bf7203591d3031d8950cd512479fadd9c6fcc6dd3"} Mar 09 13:25:02 crc kubenswrapper[4723]: I0309 13:25:02.361013 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2","Type":"ContainerDied","Data":"6c4f7025c883c3c5a5aba017bd7289bc78337de42ffa23268639b9f3c7261d00"} Mar 09 13:25:02 crc kubenswrapper[4723]: I0309 13:25:02.361026 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2","Type":"ContainerDied","Data":"674ef4261250890626ed068cd56de70f69bbe6476d52344252ccc30a4e2612b9"} Mar 09 13:25:03 crc kubenswrapper[4723]: I0309 13:25:03.946689 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:25:03 crc kubenswrapper[4723]: I0309 13:25:03.947031 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:25:03 crc kubenswrapper[4723]: I0309 13:25:03.947088 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:25:03 crc kubenswrapper[4723]: I0309 13:25:03.948154 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ac6d2c984403d03e4d4370dd6ca12328beaf68b063a60d758d836e9ab8d0176"} 
pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:25:03 crc kubenswrapper[4723]: I0309 13:25:03.948216 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://6ac6d2c984403d03e4d4370dd6ca12328beaf68b063a60d758d836e9ab8d0176" gracePeriod=600 Mar 09 13:25:04 crc kubenswrapper[4723]: I0309 13:25:04.413657 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="6ac6d2c984403d03e4d4370dd6ca12328beaf68b063a60d758d836e9ab8d0176" exitCode=0 Mar 09 13:25:04 crc kubenswrapper[4723]: I0309 13:25:04.413738 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"6ac6d2c984403d03e4d4370dd6ca12328beaf68b063a60d758d836e9ab8d0176"} Mar 09 13:25:04 crc kubenswrapper[4723]: I0309 13:25:04.414054 4723 scope.go:117] "RemoveContainer" containerID="37a6af4ad4a336694755a90bae29c7dad0bac535fc07da2bdf95f50123da1b17" Mar 09 13:25:08 crc kubenswrapper[4723]: I0309 13:25:08.497143 4723 generic.go:334] "Generic (PLEG): container finished" podID="4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" containerID="949cfd774a5ac85ffdc5516fd1299f2c3fc1e7abdb1a2335f187c11475bef008" exitCode=0 Mar 09 13:25:08 crc kubenswrapper[4723]: I0309 13:25:08.497678 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb","Type":"ContainerDied","Data":"949cfd774a5ac85ffdc5516fd1299f2c3fc1e7abdb1a2335f187c11475bef008"} Mar 09 13:25:08 crc kubenswrapper[4723]: I0309 13:25:08.677746 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.132:5671: connect: connection refused" Mar 09 13:25:08 crc kubenswrapper[4723]: I0309 13:25:08.934247 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-nb5mx" podUID="959952f4-8104-4bc2-988d-906fe1ea3662" containerName="registry-server" probeResult="failure" output=< Mar 09 13:25:08 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 13:25:08 crc kubenswrapper[4723]: > Mar 09 13:25:09 crc kubenswrapper[4723]: I0309 13:25:09.262314 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="54210e7b-b34d-411d-93e1-e8cc3448c4b0" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 09 13:25:09 crc kubenswrapper[4723]: I0309 13:25:09.514953 4723 generic.go:334] "Generic (PLEG): container finished" podID="54210e7b-b34d-411d-93e1-e8cc3448c4b0" containerID="025b50af0a9f9816db745dfddd2c4f6971a2e8d89c088c14482796d31046a5b3" exitCode=0 Mar 09 13:25:09 crc kubenswrapper[4723]: I0309 13:25:09.515006 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"54210e7b-b34d-411d-93e1-e8cc3448c4b0","Type":"ContainerDied","Data":"025b50af0a9f9816db745dfddd2c4f6971a2e8d89c088c14482796d31046a5b3"} Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.414258 
4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.477076 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54210e7b-b34d-411d-93e1-e8cc3448c4b0-erlang-cookie-secret\") pod \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.477399 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-confd\") pod \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.477461 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-tls\") pod \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.477495 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-server-conf\") pod \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.477530 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-plugins-conf\") pod \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.477559 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22vct\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-kube-api-access-22vct\") pod \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.477599 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-erlang-cookie\") pod \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.480541 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\") pod \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.480596 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-config-data\") pod \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.480659 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/54210e7b-b34d-411d-93e1-e8cc3448c4b0-pod-info\") pod \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.480786 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-plugins\") pod \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\" (UID: \"54210e7b-b34d-411d-93e1-e8cc3448c4b0\") " Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.484136 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "54210e7b-b34d-411d-93e1-e8cc3448c4b0" (UID: "54210e7b-b34d-411d-93e1-e8cc3448c4b0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.485740 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "54210e7b-b34d-411d-93e1-e8cc3448c4b0" (UID: "54210e7b-b34d-411d-93e1-e8cc3448c4b0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.490532 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54210e7b-b34d-411d-93e1-e8cc3448c4b0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "54210e7b-b34d-411d-93e1-e8cc3448c4b0" (UID: "54210e7b-b34d-411d-93e1-e8cc3448c4b0"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.496417 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-kube-api-access-22vct" (OuterVolumeSpecName: "kube-api-access-22vct") pod "54210e7b-b34d-411d-93e1-e8cc3448c4b0" (UID: "54210e7b-b34d-411d-93e1-e8cc3448c4b0"). InnerVolumeSpecName "kube-api-access-22vct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.501680 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "54210e7b-b34d-411d-93e1-e8cc3448c4b0" (UID: "54210e7b-b34d-411d-93e1-e8cc3448c4b0"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.505652 4723 scope.go:117] "RemoveContainer" containerID="42687d97a92f3eec8eb044239bd85c0cb19dc311299573a1084789dac3e84d1d" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.515135 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/54210e7b-b34d-411d-93e1-e8cc3448c4b0-pod-info" (OuterVolumeSpecName: "pod-info") pod "54210e7b-b34d-411d-93e1-e8cc3448c4b0" (UID: "54210e7b-b34d-411d-93e1-e8cc3448c4b0"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.528599 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "54210e7b-b34d-411d-93e1-e8cc3448c4b0" (UID: "54210e7b-b34d-411d-93e1-e8cc3448c4b0"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.557986 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-config-data" (OuterVolumeSpecName: "config-data") pod "54210e7b-b34d-411d-93e1-e8cc3448c4b0" (UID: "54210e7b-b34d-411d-93e1-e8cc3448c4b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.585701 4723 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.585742 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22vct\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-kube-api-access-22vct\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.585786 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.585799 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.585809 4723 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/54210e7b-b34d-411d-93e1-e8cc3448c4b0-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.585819 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.585872 4723 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54210e7b-b34d-411d-93e1-e8cc3448c4b0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.585887 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.596678 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f" (OuterVolumeSpecName: "persistence") pod "54210e7b-b34d-411d-93e1-e8cc3448c4b0" (UID: "54210e7b-b34d-411d-93e1-e8cc3448c4b0"). InnerVolumeSpecName "pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.610199 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"54210e7b-b34d-411d-93e1-e8cc3448c4b0","Type":"ContainerDied","Data":"0e2903e72b9b2d42ff4a0150e30f0bb0c4aa44b0fec566d81c30e6a55f3d5c35"} Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.610566 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.649878 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-server-conf" (OuterVolumeSpecName: "server-conf") pod "54210e7b-b34d-411d-93e1-e8cc3448c4b0" (UID: "54210e7b-b34d-411d-93e1-e8cc3448c4b0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.688033 4723 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54210e7b-b34d-411d-93e1-e8cc3448c4b0-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.688097 4723 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\") on node \"crc\" " Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.714872 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "54210e7b-b34d-411d-93e1-e8cc3448c4b0" (UID: "54210e7b-b34d-411d-93e1-e8cc3448c4b0"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.723383 4723 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.723612 4723 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f") on node "crc"
Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.790105 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54210e7b-b34d-411d-93e1-e8cc3448c4b0-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.790137 4723 reconciler_common.go:293] "Volume detached for volume \"pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.915650 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-znk5w"]
Mar 09 13:25:16 crc kubenswrapper[4723]: E0309 13:25:16.916280 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54210e7b-b34d-411d-93e1-e8cc3448c4b0" containerName="setup-container"
Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.916298 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="54210e7b-b34d-411d-93e1-e8cc3448c4b0" containerName="setup-container"
Mar 09 13:25:16 crc kubenswrapper[4723]: E0309 13:25:16.916308 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54210e7b-b34d-411d-93e1-e8cc3448c4b0" containerName="rabbitmq"
Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.916316 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="54210e7b-b34d-411d-93e1-e8cc3448c4b0" containerName="rabbitmq"
Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.916555 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="54210e7b-b34d-411d-93e1-e8cc3448c4b0" containerName="rabbitmq"
Mar 09 13:25:16 crc kubenswrapper[4723]: I0309 13:25:16.949703 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-znk5w"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.004819 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9917a47a-64be-47d0-a329-7380b87ac154-catalog-content\") pod \"community-operators-znk5w\" (UID: \"9917a47a-64be-47d0-a329-7380b87ac154\") " pod="openshift-marketplace/community-operators-znk5w"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.005179 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bvms\" (UniqueName: \"kubernetes.io/projected/9917a47a-64be-47d0-a329-7380b87ac154-kube-api-access-2bvms\") pod \"community-operators-znk5w\" (UID: \"9917a47a-64be-47d0-a329-7380b87ac154\") " pod="openshift-marketplace/community-operators-znk5w"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.005336 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9917a47a-64be-47d0-a329-7380b87ac154-utilities\") pod \"community-operators-znk5w\" (UID: \"9917a47a-64be-47d0-a329-7380b87ac154\") " pod="openshift-marketplace/community-operators-znk5w"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.007512 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-znk5w"]
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.062446 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"]
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.068558 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-2"]
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.080148 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"]
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.082705 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.094756 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"]
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.116533 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e967475-660d-4ada-b409-bae77e4f6905-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.116628 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bvms\" (UniqueName: \"kubernetes.io/projected/9917a47a-64be-47d0-a329-7380b87ac154-kube-api-access-2bvms\") pod \"community-operators-znk5w\" (UID: \"9917a47a-64be-47d0-a329-7380b87ac154\") " pod="openshift-marketplace/community-operators-znk5w"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.116670 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.116694 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e967475-660d-4ada-b409-bae77e4f6905-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.116761 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e967475-660d-4ada-b409-bae77e4f6905-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.116803 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9917a47a-64be-47d0-a329-7380b87ac154-utilities\") pod \"community-operators-znk5w\" (UID: \"9917a47a-64be-47d0-a329-7380b87ac154\") " pod="openshift-marketplace/community-operators-znk5w"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.116844 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e967475-660d-4ada-b409-bae77e4f6905-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.116948 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dxc9\" (UniqueName: \"kubernetes.io/projected/2e967475-660d-4ada-b409-bae77e4f6905-kube-api-access-4dxc9\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.116977 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e967475-660d-4ada-b409-bae77e4f6905-config-data\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.117012 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9917a47a-64be-47d0-a329-7380b87ac154-catalog-content\") pod \"community-operators-znk5w\" (UID: \"9917a47a-64be-47d0-a329-7380b87ac154\") " pod="openshift-marketplace/community-operators-znk5w"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.117067 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e967475-660d-4ada-b409-bae77e4f6905-pod-info\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.117089 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e967475-660d-4ada-b409-bae77e4f6905-server-conf\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.117120 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2e967475-660d-4ada-b409-bae77e4f6905-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.117177 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e967475-660d-4ada-b409-bae77e4f6905-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.117641 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9917a47a-64be-47d0-a329-7380b87ac154-catalog-content\") pod \"community-operators-znk5w\" (UID: \"9917a47a-64be-47d0-a329-7380b87ac154\") " pod="openshift-marketplace/community-operators-znk5w"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.117970 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9917a47a-64be-47d0-a329-7380b87ac154-utilities\") pod \"community-operators-znk5w\" (UID: \"9917a47a-64be-47d0-a329-7380b87ac154\") " pod="openshift-marketplace/community-operators-znk5w"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.138690 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bvms\" (UniqueName: \"kubernetes.io/projected/9917a47a-64be-47d0-a329-7380b87ac154-kube-api-access-2bvms\") pod \"community-operators-znk5w\" (UID: \"9917a47a-64be-47d0-a329-7380b87ac154\") " pod="openshift-marketplace/community-operators-znk5w"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.198935 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68df85789f-6qr75"]
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.205094 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.207819 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.217371 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-6qr75"]
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.220238 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e967475-660d-4ada-b409-bae77e4f6905-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.220379 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e967475-660d-4ada-b409-bae77e4f6905-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.220458 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dxc9\" (UniqueName: \"kubernetes.io/projected/2e967475-660d-4ada-b409-bae77e4f6905-kube-api-access-4dxc9\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.220483 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e967475-660d-4ada-b409-bae77e4f6905-config-data\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.220539 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e967475-660d-4ada-b409-bae77e4f6905-pod-info\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.220554 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e967475-660d-4ada-b409-bae77e4f6905-server-conf\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.220575 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2e967475-660d-4ada-b409-bae77e4f6905-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.220619 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e967475-660d-4ada-b409-bae77e4f6905-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.220659 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e967475-660d-4ada-b409-bae77e4f6905-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.220713 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e967475-660d-4ada-b409-bae77e4f6905-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.220733 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.221383 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e967475-660d-4ada-b409-bae77e4f6905-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.221751 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2e967475-660d-4ada-b409-bae77e4f6905-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.225517 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e967475-660d-4ada-b409-bae77e4f6905-config-data\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.226682 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e967475-660d-4ada-b409-bae77e4f6905-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.227891 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.227927 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e967475-660d-4ada-b409-bae77e4f6905-server-conf\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.227927 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7ed97ef2ccf927dc4b78d1b5ddfee572a0deac37d7e58eca401e08357de5559e/globalmount\"" pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.232087 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e967475-660d-4ada-b409-bae77e4f6905-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.235628 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e967475-660d-4ada-b409-bae77e4f6905-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.236083 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e967475-660d-4ada-b409-bae77e4f6905-pod-info\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.242080 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e967475-660d-4ada-b409-bae77e4f6905-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.260703 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dxc9\" (UniqueName: \"kubernetes.io/projected/2e967475-660d-4ada-b409-bae77e4f6905-kube-api-access-4dxc9\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.283906 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-znk5w"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.322963 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.323207 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-dns-svc\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.323256 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.323306 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.323365 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-config\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.323389 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-ovsdbserver-sb\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.323470 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfjls\" (UniqueName: \"kubernetes.io/projected/bfa18767-ca55-45cd-bcc4-64e6e7572efe-kube-api-access-zfjls\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.425670 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.425764 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.425877 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-config\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.425913 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-ovsdbserver-sb\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.426010 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfjls\" (UniqueName: \"kubernetes.io/projected/bfa18767-ca55-45cd-bcc4-64e6e7572efe-kube-api-access-zfjls\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.426132 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.426154 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-dns-svc\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.426634 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ba0cc841-17e0-495d-9b6d-24d90fd9975f\") pod \"rabbitmq-server-2\" (UID: \"2e967475-660d-4ada-b409-bae77e4f6905\") " pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.426855 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.427000 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.427034 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-dns-svc\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.427478 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-ovsdbserver-sb\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.427518 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-config\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.427702 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.456060 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfjls\" (UniqueName: \"kubernetes.io/projected/bfa18767-ca55-45cd-bcc4-64e6e7572efe-kube-api-access-zfjls\") pod \"dnsmasq-dns-68df85789f-6qr75\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.527687 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-6qr75"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.542355 4723 scope.go:117] "RemoveContainer" containerID="025b50af0a9f9816db745dfddd2c4f6971a2e8d89c088c14482796d31046a5b3"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.635667 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb","Type":"ContainerDied","Data":"2eebd4ad662e6e17733e7e49525e73dcd42ab261dbfdc490140f01c21f8e4ec9"}
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.635718 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eebd4ad662e6e17733e7e49525e73dcd42ab261dbfdc490140f01c21f8e4ec9"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.693518 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.713446 4723 scope.go:117] "RemoveContainer" containerID="f600bd95ed92947bbe218dbf141750e32f7e39e37c298df8afa64d12dc276f50"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.730392 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2"
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.733596 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\") pod \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") "
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.733738 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-erlang-cookie\") pod \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") "
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.733779 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-pod-info\") pod \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") "
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.733900 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-config-data\") pod \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") "
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.734048 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-plugins\") pod \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") "
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.734111 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-server-conf\") pod \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") "
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.734131 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-confd\") pod \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") "
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.734188 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srpvs\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-kube-api-access-srpvs\") pod \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") "
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.734219 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-erlang-cookie-secret\") pod \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") "
Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.734298 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-tls\") pod
\"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.734315 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-plugins-conf\") pod \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\" (UID: \"4a39acc6-3d02-4b5a-957f-eb4e3d578aeb\") " Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.736913 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" (UID: "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.737976 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" (UID: "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.741974 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" (UID: "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.747361 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-kube-api-access-srpvs" (OuterVolumeSpecName: "kube-api-access-srpvs") pod "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" (UID: "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb"). InnerVolumeSpecName "kube-api-access-srpvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.754109 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" (UID: "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.754128 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" (UID: "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.759111 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-pod-info" (OuterVolumeSpecName: "pod-info") pod "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" (UID: "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.797884 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-config-data" (OuterVolumeSpecName: "config-data") pod "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" (UID: "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:17 crc kubenswrapper[4723]: E0309 13:25:17.824516 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 09 13:25:17 crc kubenswrapper[4723]: E0309 13:25:17.824586 4723 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 09 13:25:17 crc kubenswrapper[4723]: E0309 13:25:17.824710 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c4crf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-flx99_openstack(09a0d87c-3e58-4839-95fb-7c964152ed7c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:25:17 crc kubenswrapper[4723]: E0309 13:25:17.825973 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" 
with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-flx99" podUID="09a0d87c-3e58-4839-95fb-7c964152ed7c" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.837054 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.837090 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srpvs\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-kube-api-access-srpvs\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.837120 4723 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.837133 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.837144 4723 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.837156 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.837166 4723 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.837177 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.881543 4723 scope.go:117] "RemoveContainer" containerID="42687d97a92f3eec8eb044239bd85c0cb19dc311299573a1084789dac3e84d1d" Mar 09 13:25:17 crc kubenswrapper[4723]: E0309 13:25:17.894651 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42687d97a92f3eec8eb044239bd85c0cb19dc311299573a1084789dac3e84d1d\": container with ID starting with 42687d97a92f3eec8eb044239bd85c0cb19dc311299573a1084789dac3e84d1d not found: ID does not exist" containerID="42687d97a92f3eec8eb044239bd85c0cb19dc311299573a1084789dac3e84d1d" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.894698 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42687d97a92f3eec8eb044239bd85c0cb19dc311299573a1084789dac3e84d1d"} err="failed to get container status \"42687d97a92f3eec8eb044239bd85c0cb19dc311299573a1084789dac3e84d1d\": rpc error: code = NotFound desc = could not find container \"42687d97a92f3eec8eb044239bd85c0cb19dc311299573a1084789dac3e84d1d\": container with ID starting with 42687d97a92f3eec8eb044239bd85c0cb19dc311299573a1084789dac3e84d1d not found: 
ID does not exist" Mar 09 13:25:17 crc kubenswrapper[4723]: I0309 13:25:17.988243 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-server-conf" (OuterVolumeSpecName: "server-conf") pod "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" (UID: "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.048667 4723 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.304255 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319" (OuterVolumeSpecName: "persistence") pod "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" (UID: "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb"). InnerVolumeSpecName "pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.368153 4723 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\") on node \"crc\" " Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.459662 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" (UID: "4a39acc6-3d02-4b5a-957f-eb4e3d578aeb"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.467211 4723 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.467405 4723 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319") on node "crc" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.470586 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.470614 4723 reconciler_common.go:293] "Volume detached for volume \"pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.653733 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b"} Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.658346 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:18 crc kubenswrapper[4723]: E0309 13:25:18.661436 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-flx99" podUID="09a0d87c-3e58-4839-95fb-7c964152ed7c" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.760496 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.783168 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.797506 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:25:18 crc kubenswrapper[4723]: E0309 13:25:18.798065 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" containerName="rabbitmq" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.798093 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" containerName="rabbitmq" Mar 09 13:25:18 crc kubenswrapper[4723]: E0309 13:25:18.798125 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" containerName="setup-container" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.798132 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" containerName="setup-container" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.801178 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" containerName="rabbitmq" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.812483 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.820729 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.831794 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.832122 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.832152 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d6ws7" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.832929 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.833154 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.838894 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.839474 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.908466 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a39acc6-3d02-4b5a-957f-eb4e3d578aeb" path="/var/lib/kubelet/pods/4a39acc6-3d02-4b5a-957f-eb4e3d578aeb/volumes" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.914188 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54210e7b-b34d-411d-93e1-e8cc3448c4b0" path="/var/lib/kubelet/pods/54210e7b-b34d-411d-93e1-e8cc3448c4b0/volumes" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.929892 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-znk5w"] Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.984472 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.984524 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.984554 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.984599 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.984623 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.984651 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.984673 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjkrd\" (UniqueName: \"kubernetes.io/projected/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-kube-api-access-vjkrd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.984715 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.984738 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.984759 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:18 crc kubenswrapper[4723]: I0309 13:25:18.984789 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.043358 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-6qr75"] Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.068117 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.091827 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.091899 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.091926 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjkrd\" (UniqueName: \"kubernetes.io/projected/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-kube-api-access-vjkrd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.091983 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.092010 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.092033 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.092054 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.092158 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.092179 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.092198 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.092243 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.093014 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.093403 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.100830 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.102493 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.102539 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.115568 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.116669 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.117443 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" 
Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.119638 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.134646 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjkrd\" (UniqueName: \"kubernetes.io/projected/fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d-kube-api-access-vjkrd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.147686 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.147821 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9904cf58384ea95f97211cb12cec7ec77900b8743fdc44d4dd98a8f5e64d4499/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.317030 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f699bf-bb2b-4b1c-a36c-166b186b1319\") pod \"rabbitmq-cell1-server-0\" (UID: \"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.391839 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-nb5mx" podUID="959952f4-8104-4bc2-988d-906fe1ea3662" containerName="registry-server" probeResult="failure" output=< Mar 09 13:25:19 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 13:25:19 crc kubenswrapper[4723]: > Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.470804 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.754305 4723 generic.go:334] "Generic (PLEG): container finished" podID="9917a47a-64be-47d0-a329-7380b87ac154" containerID="45813ab33a1ef4c24cab8d3d96b5e9fa1a7afe25d6db8aefdb236843d5b3428c" exitCode=0 Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.754402 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-znk5w" event={"ID":"9917a47a-64be-47d0-a329-7380b87ac154","Type":"ContainerDied","Data":"45813ab33a1ef4c24cab8d3d96b5e9fa1a7afe25d6db8aefdb236843d5b3428c"} Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.754427 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-znk5w" event={"ID":"9917a47a-64be-47d0-a329-7380b87ac154","Type":"ContainerStarted","Data":"462a0c8afe6644f9d6cf0042d529f347eb7bc653f19b2417724a7bb535629b96"} Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.760384 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"2e967475-660d-4ada-b409-bae77e4f6905","Type":"ContainerStarted","Data":"204b31baa2f1723eae3b0be0bb6ff9b995b90e8f0d6e69730fe455e23eb2a417"} Mar 09 13:25:19 crc kubenswrapper[4723]: I0309 13:25:19.762497 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-6qr75" event={"ID":"bfa18767-ca55-45cd-bcc4-64e6e7572efe","Type":"ContainerStarted","Data":"140cdb61153c2719fd628e7e82fdfbd17fc247c38400eefd74a9e77f581628af"} Mar 09 13:25:20 crc kubenswrapper[4723]: I0309 13:25:20.229463 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:25:20 crc kubenswrapper[4723]: I0309 13:25:20.492243 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 09 13:25:20 crc kubenswrapper[4723]: I0309 13:25:20.795424 4723 generic.go:334] "Generic (PLEG): container finished" podID="bfa18767-ca55-45cd-bcc4-64e6e7572efe" containerID="43ef4f7e8b52c8a1edcc39bb5230605b6656c6dea9f355a6e4344b068563a4db" exitCode=0 Mar 09 13:25:20 crc kubenswrapper[4723]: I0309 13:25:20.795555 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-6qr75" event={"ID":"bfa18767-ca55-45cd-bcc4-64e6e7572efe","Type":"ContainerDied","Data":"43ef4f7e8b52c8a1edcc39bb5230605b6656c6dea9f355a6e4344b068563a4db"} Mar 09 13:25:20 crc kubenswrapper[4723]: I0309 13:25:20.804805 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d","Type":"ContainerStarted","Data":"576a496ab3bdd22d00d3bab58a3b872c8566b22489128748a96aa06ca00f403e"} Mar 09 13:25:21 crc kubenswrapper[4723]: I0309 13:25:21.816427 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-znk5w" event={"ID":"9917a47a-64be-47d0-a329-7380b87ac154","Type":"ContainerStarted","Data":"6dbcc4b77677882c200bb673a7f1a29d8bd1638a9239b8af8b2145ed4803c7a9"} Mar 09 13:25:21 crc kubenswrapper[4723]: I0309 13:25:21.818624 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"2e967475-660d-4ada-b409-bae77e4f6905","Type":"ContainerStarted","Data":"69686ee38e6c83423b412b76dcb2ac42567f3a941394103f109c7fe4803a283d"} Mar 09 
13:25:21 crc kubenswrapper[4723]: I0309 13:25:21.820657 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-6qr75" event={"ID":"bfa18767-ca55-45cd-bcc4-64e6e7572efe","Type":"ContainerStarted","Data":"adf856034772ecbc05fd5a4160867c585a5e82539c38e5c7b35afa1ecd573991"} Mar 09 13:25:21 crc kubenswrapper[4723]: I0309 13:25:21.820822 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68df85789f-6qr75" Mar 09 13:25:21 crc kubenswrapper[4723]: I0309 13:25:21.888891 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68df85789f-6qr75" podStartSLOduration=4.888854588 podStartE2EDuration="4.888854588s" podCreationTimestamp="2026-03-09 13:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:21.88289178 +0000 UTC m=+1595.897359340" watchObservedRunningTime="2026-03-09 13:25:21.888854588 +0000 UTC m=+1595.903322128" Mar 09 13:25:22 crc kubenswrapper[4723]: I0309 13:25:22.833473 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d","Type":"ContainerStarted","Data":"e8ab742b2d0e81a16863f2986110e7446d2af694a345fcba00cc2e44f5825c84"} Mar 09 13:25:23 crc kubenswrapper[4723]: I0309 13:25:23.845103 4723 generic.go:334] "Generic (PLEG): container finished" podID="9917a47a-64be-47d0-a329-7380b87ac154" containerID="6dbcc4b77677882c200bb673a7f1a29d8bd1638a9239b8af8b2145ed4803c7a9" exitCode=0 Mar 09 13:25:23 crc kubenswrapper[4723]: I0309 13:25:23.845189 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-znk5w" event={"ID":"9917a47a-64be-47d0-a329-7380b87ac154","Type":"ContainerDied","Data":"6dbcc4b77677882c200bb673a7f1a29d8bd1638a9239b8af8b2145ed4803c7a9"} Mar 09 13:25:24 crc kubenswrapper[4723]: I0309 13:25:24.861286 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-znk5w" event={"ID":"9917a47a-64be-47d0-a329-7380b87ac154","Type":"ContainerStarted","Data":"c7a671fe39a8559be5b523f88ce2c63e3968ea192ea3716cce6389438c3574a4"} Mar 09 13:25:24 crc kubenswrapper[4723]: I0309 13:25:24.895892 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-znk5w" podStartSLOduration=4.42406188 podStartE2EDuration="8.895852438s" podCreationTimestamp="2026-03-09 13:25:16 +0000 UTC" firstStartedPulling="2026-03-09 13:25:19.759441624 +0000 UTC m=+1593.773909164" lastFinishedPulling="2026-03-09 13:25:24.231232182 +0000 UTC m=+1598.245699722" observedRunningTime="2026-03-09 13:25:24.884718554 +0000 UTC m=+1598.899186124" watchObservedRunningTime="2026-03-09 13:25:24.895852438 +0000 UTC m=+1598.910319978" Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.284774 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-znk5w" Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.285119 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-znk5w" Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.339676 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-znk5w" Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.529041 4723 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68df85789f-6qr75" Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.590986 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-kng4m"] Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.591324 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" podUID="e1247b23-18d7-4343-90cb-e35826999ba9" containerName="dnsmasq-dns" containerID="cri-o://a148f889e50bfd745321c33011dbca488fa39edc0bcdad45045f6df087a6b45d" gracePeriod=10 Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.788743 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bb85b8995-q4x27"] Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.790949 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.856911 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bb85b8995-q4x27"] Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.893112 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-ovsdbserver-nb\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.893250 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-dns-swift-storage-0\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.893285 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-ovsdbserver-sb\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.893310 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmqzg\" (UniqueName: \"kubernetes.io/projected/73fbad24-6f18-4227-bc88-968635e92584-kube-api-access-kmqzg\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.893464 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-openstack-edpm-ipam\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.893526 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-config\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " 
pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.893586 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-dns-svc\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.942228 4723 generic.go:334] "Generic (PLEG): container finished" podID="e1247b23-18d7-4343-90cb-e35826999ba9" containerID="a148f889e50bfd745321c33011dbca488fa39edc0bcdad45045f6df087a6b45d" exitCode=0 Mar 09 13:25:27 crc kubenswrapper[4723]: I0309 13:25:27.944275 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" event={"ID":"e1247b23-18d7-4343-90cb-e35826999ba9","Type":"ContainerDied","Data":"a148f889e50bfd745321c33011dbca488fa39edc0bcdad45045f6df087a6b45d"} Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.005214 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-openstack-edpm-ipam\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.005308 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-config\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.005378 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-dns-svc\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.005501 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-ovsdbserver-nb\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.005640 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-dns-swift-storage-0\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.005694 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-ovsdbserver-sb\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.005716 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmqzg\" (UniqueName: 
\"kubernetes.io/projected/73fbad24-6f18-4227-bc88-968635e92584-kube-api-access-kmqzg\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.006657 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-dns-svc\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.007256 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-openstack-edpm-ipam\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.008919 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-config\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.009731 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-ovsdbserver-nb\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.011434 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-ovsdbserver-sb\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.011933 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73fbad24-6f18-4227-bc88-968635e92584-dns-swift-storage-0\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.088599 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.097203 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmqzg\" (UniqueName: \"kubernetes.io/projected/73fbad24-6f18-4227-bc88-968635e92584-kube-api-access-kmqzg\") pod \"dnsmasq-dns-bb85b8995-q4x27\" (UID: \"73fbad24-6f18-4227-bc88-968635e92584\") " pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.139477 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.339839 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.530741 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.597179 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb5mx"] Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.632192 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-dns-svc\") pod \"e1247b23-18d7-4343-90cb-e35826999ba9\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.632309 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-ovsdbserver-sb\") pod \"e1247b23-18d7-4343-90cb-e35826999ba9\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.632346 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg288\" (UniqueName: \"kubernetes.io/projected/e1247b23-18d7-4343-90cb-e35826999ba9-kube-api-access-jg288\") pod \"e1247b23-18d7-4343-90cb-e35826999ba9\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.632379 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-ovsdbserver-nb\") pod \"e1247b23-18d7-4343-90cb-e35826999ba9\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.632451 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-dns-swift-storage-0\") pod \"e1247b23-18d7-4343-90cb-e35826999ba9\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.632473 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-config\") pod \"e1247b23-18d7-4343-90cb-e35826999ba9\" (UID: \"e1247b23-18d7-4343-90cb-e35826999ba9\") " Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.638942 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1247b23-18d7-4343-90cb-e35826999ba9-kube-api-access-jg288" (OuterVolumeSpecName: "kube-api-access-jg288") pod "e1247b23-18d7-4343-90cb-e35826999ba9" (UID: "e1247b23-18d7-4343-90cb-e35826999ba9"). InnerVolumeSpecName "kube-api-access-jg288". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.712344 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1247b23-18d7-4343-90cb-e35826999ba9" (UID: "e1247b23-18d7-4343-90cb-e35826999ba9"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.721754 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-config" (OuterVolumeSpecName: "config") pod "e1247b23-18d7-4343-90cb-e35826999ba9" (UID: "e1247b23-18d7-4343-90cb-e35826999ba9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.729953 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1247b23-18d7-4343-90cb-e35826999ba9" (UID: "e1247b23-18d7-4343-90cb-e35826999ba9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.741928 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.741966 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg288\" (UniqueName: \"kubernetes.io/projected/e1247b23-18d7-4343-90cb-e35826999ba9-kube-api-access-jg288\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.741982 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.741996 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.742384 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1247b23-18d7-4343-90cb-e35826999ba9" (UID: "e1247b23-18d7-4343-90cb-e35826999ba9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.758460 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e1247b23-18d7-4343-90cb-e35826999ba9" (UID: "e1247b23-18d7-4343-90cb-e35826999ba9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.844022 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.844057 4723 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1247b23-18d7-4343-90cb-e35826999ba9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.955949 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.956911 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" event={"ID":"e1247b23-18d7-4343-90cb-e35826999ba9","Type":"ContainerDied","Data":"e6c59a0cf90fb6ca4dffd5aa650f7f80dcdb7d95bd4a19de6242e9103210d852"} Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.957081 4723 scope.go:117] "RemoveContainer" containerID="a148f889e50bfd745321c33011dbca488fa39edc0bcdad45045f6df087a6b45d" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.987181 4723 scope.go:117] "RemoveContainer" containerID="4149c338266a383a890ee2025b6f783bca7bdb9600bb31dc56c9dd1e756f1dc5" Mar 09 13:25:28 crc kubenswrapper[4723]: I0309 13:25:28.989875 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-kng4m"] Mar 09 13:25:29 crc kubenswrapper[4723]: I0309 13:25:29.004259 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-kng4m"] Mar 09 13:25:29 crc kubenswrapper[4723]: I0309 13:25:29.032501 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bb85b8995-q4x27"] Mar 09 13:25:29 crc kubenswrapper[4723]: I0309 13:25:29.972340 4723 generic.go:334] "Generic (PLEG): container finished" podID="73fbad24-6f18-4227-bc88-968635e92584" containerID="30e0690d64ebfcd0a5c8a4834304c0d3b2549dcccccdf90295379a40004aa0e4" exitCode=0 Mar 09 13:25:29 crc kubenswrapper[4723]: I0309 13:25:29.972376 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb85b8995-q4x27" event={"ID":"73fbad24-6f18-4227-bc88-968635e92584","Type":"ContainerDied","Data":"30e0690d64ebfcd0a5c8a4834304c0d3b2549dcccccdf90295379a40004aa0e4"} Mar 09 13:25:29 crc kubenswrapper[4723]: I0309 13:25:29.972835 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb85b8995-q4x27" event={"ID":"73fbad24-6f18-4227-bc88-968635e92584","Type":"ContainerStarted","Data":"660a59c3ee16519a8d137741caebe431e7186fe081b55939305bca5862d6e0e4"} Mar 09 13:25:29 crc kubenswrapper[4723]: I0309 13:25:29.975643 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nb5mx" podUID="959952f4-8104-4bc2-988d-906fe1ea3662" containerName="registry-server" containerID="cri-o://aadab442fb27183d0f9a96ea5667285ac2c876d2e1b29d5d8d307d4fc32978de" gracePeriod=2 Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.456837 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.583543 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/959952f4-8104-4bc2-988d-906fe1ea3662-utilities\") pod \"959952f4-8104-4bc2-988d-906fe1ea3662\" (UID: \"959952f4-8104-4bc2-988d-906fe1ea3662\") " Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.583907 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrvbf\" (UniqueName: \"kubernetes.io/projected/959952f4-8104-4bc2-988d-906fe1ea3662-kube-api-access-hrvbf\") pod \"959952f4-8104-4bc2-988d-906fe1ea3662\" (UID: \"959952f4-8104-4bc2-988d-906fe1ea3662\") " Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.584002 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/959952f4-8104-4bc2-988d-906fe1ea3662-catalog-content\") pod \"959952f4-8104-4bc2-988d-906fe1ea3662\" (UID: \"959952f4-8104-4bc2-988d-906fe1ea3662\") " Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.585108 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/959952f4-8104-4bc2-988d-906fe1ea3662-utilities" (OuterVolumeSpecName: "utilities") pod "959952f4-8104-4bc2-988d-906fe1ea3662" (UID: "959952f4-8104-4bc2-988d-906fe1ea3662"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.590178 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/959952f4-8104-4bc2-988d-906fe1ea3662-kube-api-access-hrvbf" (OuterVolumeSpecName: "kube-api-access-hrvbf") pod "959952f4-8104-4bc2-988d-906fe1ea3662" (UID: "959952f4-8104-4bc2-988d-906fe1ea3662"). InnerVolumeSpecName "kube-api-access-hrvbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.619062 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/959952f4-8104-4bc2-988d-906fe1ea3662-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "959952f4-8104-4bc2-988d-906fe1ea3662" (UID: "959952f4-8104-4bc2-988d-906fe1ea3662"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.687891 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/959952f4-8104-4bc2-988d-906fe1ea3662-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.687923 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrvbf\" (UniqueName: \"kubernetes.io/projected/959952f4-8104-4bc2-988d-906fe1ea3662-kube-api-access-hrvbf\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.687935 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/959952f4-8104-4bc2-988d-906fe1ea3662-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.894764 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1247b23-18d7-4343-90cb-e35826999ba9" path="/var/lib/kubelet/pods/e1247b23-18d7-4343-90cb-e35826999ba9/volumes" Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.986940 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-flx99" event={"ID":"09a0d87c-3e58-4839-95fb-7c964152ed7c","Type":"ContainerStarted","Data":"ddf76b174c171ce02f933683ef40712510e279024507bd1418f8e581a6808299"} Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.990046 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb85b8995-q4x27" event={"ID":"73fbad24-6f18-4227-bc88-968635e92584","Type":"ContainerStarted","Data":"8aab4f19ff86733483e8e05d5877ece049bbae605bfc81995c87e6b120e41ee8"} Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.990289 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.992118 4723 generic.go:334] "Generic (PLEG): container finished" podID="959952f4-8104-4bc2-988d-906fe1ea3662" containerID="aadab442fb27183d0f9a96ea5667285ac2c876d2e1b29d5d8d307d4fc32978de" exitCode=0 Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.992153 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb5mx" event={"ID":"959952f4-8104-4bc2-988d-906fe1ea3662","Type":"ContainerDied","Data":"aadab442fb27183d0f9a96ea5667285ac2c876d2e1b29d5d8d307d4fc32978de"} Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.992173 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb5mx" event={"ID":"959952f4-8104-4bc2-988d-906fe1ea3662","Type":"ContainerDied","Data":"69622cef8b65b42a782591fc8001bb42be25fbe651d28a3227cc3947e70a0542"} Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.992189 4723 scope.go:117] "RemoveContainer" containerID="aadab442fb27183d0f9a96ea5667285ac2c876d2e1b29d5d8d307d4fc32978de" Mar 09 13:25:30 crc kubenswrapper[4723]: I0309 13:25:30.992300 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nb5mx" Mar 09 13:25:31 crc kubenswrapper[4723]: I0309 13:25:31.012075 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-flx99" podStartSLOduration=2.615919967 podStartE2EDuration="40.01205894s" podCreationTimestamp="2026-03-09 13:24:51 +0000 UTC" firstStartedPulling="2026-03-09 13:24:52.718742533 +0000 UTC m=+1566.733210073" lastFinishedPulling="2026-03-09 13:25:30.114881506 +0000 UTC m=+1604.129349046" observedRunningTime="2026-03-09 13:25:31.001206213 +0000 UTC m=+1605.015673753" watchObservedRunningTime="2026-03-09 13:25:31.01205894 +0000 UTC m=+1605.026526480" Mar 09 13:25:31 crc kubenswrapper[4723]: I0309 13:25:31.034752 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bb85b8995-q4x27" podStartSLOduration=4.034732649 podStartE2EDuration="4.034732649s" podCreationTimestamp="2026-03-09 13:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:31.025701011 +0000 UTC m=+1605.040168561" watchObservedRunningTime="2026-03-09 13:25:31.034732649 +0000 UTC m=+1605.049200189" Mar 09 13:25:31 crc kubenswrapper[4723]: I0309 13:25:31.079514 4723 scope.go:117] "RemoveContainer" containerID="4ae5c90b5639d2a29447ccddba7ce528eb13cb46ae1698834f463c210c0969c6" Mar 09 13:25:31 crc kubenswrapper[4723]: I0309 13:25:31.081779 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb5mx"] Mar 09 13:25:31 crc kubenswrapper[4723]: I0309 13:25:31.094300 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb5mx"] Mar 09 13:25:31 crc kubenswrapper[4723]: I0309 13:25:31.108193 4723 scope.go:117] "RemoveContainer" containerID="34c875e78833c8bfc74e76e13af608804248e22f883b5d05ad4d07e2af019b24" Mar 09 13:25:31 crc kubenswrapper[4723]: I0309 13:25:31.160515 4723 scope.go:117] "RemoveContainer" containerID="aadab442fb27183d0f9a96ea5667285ac2c876d2e1b29d5d8d307d4fc32978de" Mar 09 13:25:31 crc kubenswrapper[4723]: E0309 13:25:31.161049 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aadab442fb27183d0f9a96ea5667285ac2c876d2e1b29d5d8d307d4fc32978de\": container with ID starting with aadab442fb27183d0f9a96ea5667285ac2c876d2e1b29d5d8d307d4fc32978de not found: ID does not exist" containerID="aadab442fb27183d0f9a96ea5667285ac2c876d2e1b29d5d8d307d4fc32978de" Mar 09 13:25:31 crc kubenswrapper[4723]: I0309 13:25:31.161091 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aadab442fb27183d0f9a96ea5667285ac2c876d2e1b29d5d8d307d4fc32978de"} err="failed to get container status \"aadab442fb27183d0f9a96ea5667285ac2c876d2e1b29d5d8d307d4fc32978de\": rpc error: code = NotFound desc = could not find container \"aadab442fb27183d0f9a96ea5667285ac2c876d2e1b29d5d8d307d4fc32978de\": container with ID starting with aadab442fb27183d0f9a96ea5667285ac2c876d2e1b29d5d8d307d4fc32978de not found: ID does not exist" Mar 09 13:25:31 crc kubenswrapper[4723]: I0309 13:25:31.161137 4723 scope.go:117] "RemoveContainer" containerID="4ae5c90b5639d2a29447ccddba7ce528eb13cb46ae1698834f463c210c0969c6" Mar 09 13:25:31 crc kubenswrapper[4723]: E0309 13:25:31.161553 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"4ae5c90b5639d2a29447ccddba7ce528eb13cb46ae1698834f463c210c0969c6\": container with ID starting with 4ae5c90b5639d2a29447ccddba7ce528eb13cb46ae1698834f463c210c0969c6 not found: ID does not exist" containerID="4ae5c90b5639d2a29447ccddba7ce528eb13cb46ae1698834f463c210c0969c6" Mar 09 13:25:31 crc kubenswrapper[4723]: I0309 13:25:31.161593 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae5c90b5639d2a29447ccddba7ce528eb13cb46ae1698834f463c210c0969c6"} err="failed to get container status \"4ae5c90b5639d2a29447ccddba7ce528eb13cb46ae1698834f463c210c0969c6\": rpc error: code = NotFound desc = could not find container \"4ae5c90b5639d2a29447ccddba7ce528eb13cb46ae1698834f463c210c0969c6\": container with ID starting with 4ae5c90b5639d2a29447ccddba7ce528eb13cb46ae1698834f463c210c0969c6 not found: ID does not exist" Mar 09 13:25:31 crc kubenswrapper[4723]: I0309 13:25:31.161608 4723 scope.go:117] "RemoveContainer" containerID="34c875e78833c8bfc74e76e13af608804248e22f883b5d05ad4d07e2af019b24" Mar 09 13:25:31 crc kubenswrapper[4723]: E0309 13:25:31.162048 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34c875e78833c8bfc74e76e13af608804248e22f883b5d05ad4d07e2af019b24\": container with ID starting with 34c875e78833c8bfc74e76e13af608804248e22f883b5d05ad4d07e2af019b24 not found: ID does not exist" containerID="34c875e78833c8bfc74e76e13af608804248e22f883b5d05ad4d07e2af019b24" Mar 09 13:25:31 crc kubenswrapper[4723]: I0309 13:25:31.162067 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34c875e78833c8bfc74e76e13af608804248e22f883b5d05ad4d07e2af019b24"} err="failed to get container status \"34c875e78833c8bfc74e76e13af608804248e22f883b5d05ad4d07e2af019b24\": rpc error: code = NotFound desc = could not find container \"34c875e78833c8bfc74e76e13af608804248e22f883b5d05ad4d07e2af019b24\": container with ID starting with 34c875e78833c8bfc74e76e13af608804248e22f883b5d05ad4d07e2af019b24 not found: ID does not exist" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.007049 4723 generic.go:334] "Generic (PLEG): container finished" podID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerID="0a7c75ec59d20af361a4c913ae4e63d868a7a6ece77ba7efe7f6210dbb1af257" exitCode=137 Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.007120 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2","Type":"ContainerDied","Data":"0a7c75ec59d20af361a4c913ae4e63d868a7a6ece77ba7efe7f6210dbb1af257"} Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.148115 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.279109 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-scripts\") pod \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.279209 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-sg-core-conf-yaml\") pod \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.279351 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-run-httpd\") pod \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.279538 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-config-data\") pod \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.279618 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghr7t\" (UniqueName: \"kubernetes.io/projected/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-kube-api-access-ghr7t\") pod \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.279783 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-ceilometer-tls-certs\") pod \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.279825 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-log-httpd\") pod \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.279900 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-combined-ca-bundle\") pod \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\" (UID: \"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2\") " Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.279997 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" (UID: "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.280278 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" (UID: "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.280778 4723 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.280809 4723 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.285042 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-kube-api-access-ghr7t" (OuterVolumeSpecName: "kube-api-access-ghr7t") pod "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" (UID: "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2"). InnerVolumeSpecName "kube-api-access-ghr7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.291241 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-scripts" (OuterVolumeSpecName: "scripts") pod "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" (UID: "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.314793 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" (UID: "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.359244 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" (UID: "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.385650 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghr7t\" (UniqueName: \"kubernetes.io/projected/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-kube-api-access-ghr7t\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.385698 4723 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.385711 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.385727 4723 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.408073 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" (UID: "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.414613 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-config-data" (OuterVolumeSpecName: "config-data") pod "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" (UID: "2ded56b6-9f9e-4a1d-8344-7f4a6d414da2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.487317 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.487600 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4723]: I0309 13:25:32.900343 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="959952f4-8104-4bc2-988d-906fe1ea3662" path="/var/lib/kubelet/pods/959952f4-8104-4bc2-988d-906fe1ea3662/volumes" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.042682 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ded56b6-9f9e-4a1d-8344-7f4a6d414da2","Type":"ContainerDied","Data":"833f4666a2bdefe79a2de77132af3bcbc4ff6394ec2d4dd025a396e703ddd5fb"} Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.042725 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.042740 4723 scope.go:117] "RemoveContainer" containerID="0a7c75ec59d20af361a4c913ae4e63d868a7a6ece77ba7efe7f6210dbb1af257" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.075516 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.093533 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.093803 4723 scope.go:117] "RemoveContainer" containerID="2cea2fee01e6224a831d4f4bf7203591d3031d8950cd512479fadd9c6fcc6dd3" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.113905 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:25:33 crc kubenswrapper[4723]: E0309 13:25:33.114580 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="proxy-httpd" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.114606 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="proxy-httpd" Mar 09 13:25:33 crc kubenswrapper[4723]: E0309 13:25:33.114625 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959952f4-8104-4bc2-988d-906fe1ea3662" containerName="registry-server" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.114634 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="959952f4-8104-4bc2-988d-906fe1ea3662" containerName="registry-server" Mar 09 13:25:33 crc kubenswrapper[4723]: E0309 13:25:33.114657 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959952f4-8104-4bc2-988d-906fe1ea3662" containerName="extract-utilities" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.114666 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="959952f4-8104-4bc2-988d-906fe1ea3662" containerName="extract-utilities" Mar 09 13:25:33 crc kubenswrapper[4723]: E0309 13:25:33.114677 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="ceilometer-notification-agent" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.114685 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="ceilometer-notification-agent" Mar 09 13:25:33 crc kubenswrapper[4723]: E0309 13:25:33.114695 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959952f4-8104-4bc2-988d-906fe1ea3662" containerName="extract-content" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.114706 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="959952f4-8104-4bc2-988d-906fe1ea3662" containerName="extract-content" Mar 09 13:25:33 crc kubenswrapper[4723]: E0309 13:25:33.114731 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="sg-core" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.114739 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="sg-core" Mar 09 13:25:33 crc kubenswrapper[4723]: E0309 13:25:33.114771 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="ceilometer-central-agent" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.114779 4723 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="ceilometer-central-agent" Mar 09 13:25:33 crc kubenswrapper[4723]: E0309 13:25:33.114797 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1247b23-18d7-4343-90cb-e35826999ba9" containerName="dnsmasq-dns" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.114807 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1247b23-18d7-4343-90cb-e35826999ba9" containerName="dnsmasq-dns" Mar 09 13:25:33 crc kubenswrapper[4723]: E0309 13:25:33.114820 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1247b23-18d7-4343-90cb-e35826999ba9" containerName="init" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.114827 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1247b23-18d7-4343-90cb-e35826999ba9" containerName="init" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.115130 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="ceilometer-notification-agent" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.115156 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1247b23-18d7-4343-90cb-e35826999ba9" containerName="dnsmasq-dns" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.115171 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="959952f4-8104-4bc2-988d-906fe1ea3662" containerName="registry-server" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.115185 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="sg-core" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.115193 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="proxy-httpd" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.115214 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" containerName="ceilometer-central-agent" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.117842 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.121529 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.122757 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.126188 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.126270 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.147643 4723 scope.go:117] "RemoveContainer" containerID="6c4f7025c883c3c5a5aba017bd7289bc78337de42ffa23268639b9f3c7261d00" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.188287 4723 scope.go:117] "RemoveContainer" containerID="674ef4261250890626ed068cd56de70f69bbe6476d52344252ccc30a4e2612b9" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.203886 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-config-data\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.203990 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df18bf19-d23a-471f-8074-2eaaa7c4aead-log-httpd\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.204050 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df18bf19-d23a-471f-8074-2eaaa7c4aead-run-httpd\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.204165 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm8gw\" (UniqueName: \"kubernetes.io/projected/df18bf19-d23a-471f-8074-2eaaa7c4aead-kube-api-access-fm8gw\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.204480 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.204672 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.204759 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-scripts\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.204895 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.307455 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-config-data\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.307539 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df18bf19-d23a-471f-8074-2eaaa7c4aead-log-httpd\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.307566 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df18bf19-d23a-471f-8074-2eaaa7c4aead-run-httpd\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.307623 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm8gw\" (UniqueName: \"kubernetes.io/projected/df18bf19-d23a-471f-8074-2eaaa7c4aead-kube-api-access-fm8gw\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.307727 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.307777 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.307803 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-scripts\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.307845 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.308057 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/df18bf19-d23a-471f-8074-2eaaa7c4aead-log-httpd\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.308113 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df18bf19-d23a-471f-8074-2eaaa7c4aead-run-httpd\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.313222 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.313298 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.314015 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-config-data\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.314852 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.315590 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df18bf19-d23a-471f-8074-2eaaa7c4aead-scripts\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.328118 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm8gw\" (UniqueName: \"kubernetes.io/projected/df18bf19-d23a-471f-8074-2eaaa7c4aead-kube-api-access-fm8gw\") pod \"ceilometer-0\" (UID: \"df18bf19-d23a-471f-8074-2eaaa7c4aead\") " pod="openstack/ceilometer-0" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.420058 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-79b5d74c8c-kng4m" podUID="e1247b23-18d7-4343-90cb-e35826999ba9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.8:5353: i/o timeout" Mar 09 13:25:33 crc kubenswrapper[4723]: I0309 13:25:33.454480 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:25:34 crc kubenswrapper[4723]: W0309 13:25:34.016297 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf18bf19_d23a_471f_8074_2eaaa7c4aead.slice/crio-86c4298ef4fe09b0008bacfa74a4c6e0d9aaa9b259ab25dafd4d4231055c2274 WatchSource:0}: Error finding container 86c4298ef4fe09b0008bacfa74a4c6e0d9aaa9b259ab25dafd4d4231055c2274: Status 404 returned error can't find the container with id 86c4298ef4fe09b0008bacfa74a4c6e0d9aaa9b259ab25dafd4d4231055c2274 Mar 09 13:25:34 crc kubenswrapper[4723]: I0309 13:25:34.030935 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:25:34 crc kubenswrapper[4723]: I0309 13:25:34.087476 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df18bf19-d23a-471f-8074-2eaaa7c4aead","Type":"ContainerStarted","Data":"86c4298ef4fe09b0008bacfa74a4c6e0d9aaa9b259ab25dafd4d4231055c2274"} Mar 09 13:25:34 crc kubenswrapper[4723]: I0309 13:25:34.900988 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ded56b6-9f9e-4a1d-8344-7f4a6d414da2" path="/var/lib/kubelet/pods/2ded56b6-9f9e-4a1d-8344-7f4a6d414da2/volumes" Mar 09 13:25:35 crc kubenswrapper[4723]: I0309 13:25:35.113396 4723 generic.go:334] "Generic (PLEG): container finished" podID="09a0d87c-3e58-4839-95fb-7c964152ed7c" containerID="ddf76b174c171ce02f933683ef40712510e279024507bd1418f8e581a6808299" exitCode=0 Mar 09 13:25:35 crc kubenswrapper[4723]: I0309 13:25:35.113439 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-flx99" event={"ID":"09a0d87c-3e58-4839-95fb-7c964152ed7c","Type":"ContainerDied","Data":"ddf76b174c171ce02f933683ef40712510e279024507bd1418f8e581a6808299"} Mar 09 13:25:37 crc kubenswrapper[4723]: I0309 13:25:37.355891 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-znk5w" Mar 09 13:25:37 crc kubenswrapper[4723]: I0309 13:25:37.439263 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-znk5w"] Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.025658 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-flx99" Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.141879 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bb85b8995-q4x27" Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.143301 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a0d87c-3e58-4839-95fb-7c964152ed7c-combined-ca-bundle\") pod \"09a0d87c-3e58-4839-95fb-7c964152ed7c\" (UID: \"09a0d87c-3e58-4839-95fb-7c964152ed7c\") " Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.143445 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a0d87c-3e58-4839-95fb-7c964152ed7c-config-data\") pod \"09a0d87c-3e58-4839-95fb-7c964152ed7c\" (UID: \"09a0d87c-3e58-4839-95fb-7c964152ed7c\") " Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.143630 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4crf\" (UniqueName: \"kubernetes.io/projected/09a0d87c-3e58-4839-95fb-7c964152ed7c-kube-api-access-c4crf\") pod \"09a0d87c-3e58-4839-95fb-7c964152ed7c\" (UID: \"09a0d87c-3e58-4839-95fb-7c964152ed7c\") " Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.149144 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a0d87c-3e58-4839-95fb-7c964152ed7c-kube-api-access-c4crf" (OuterVolumeSpecName: "kube-api-access-c4crf") pod "09a0d87c-3e58-4839-95fb-7c964152ed7c" (UID: "09a0d87c-3e58-4839-95fb-7c964152ed7c"). InnerVolumeSpecName "kube-api-access-c4crf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.217549 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a0d87c-3e58-4839-95fb-7c964152ed7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09a0d87c-3e58-4839-95fb-7c964152ed7c" (UID: "09a0d87c-3e58-4839-95fb-7c964152ed7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.220276 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df18bf19-d23a-471f-8074-2eaaa7c4aead","Type":"ContainerStarted","Data":"82ad22c202020bad5c2bc30629a84c341955f73333b0feea5d43b28474a7488e"} Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.230456 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-flx99" Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.229608 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-znk5w" podUID="9917a47a-64be-47d0-a329-7380b87ac154" containerName="registry-server" containerID="cri-o://c7a671fe39a8559be5b523f88ce2c63e3968ea192ea3716cce6389438c3574a4" gracePeriod=2 Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.231085 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-flx99" event={"ID":"09a0d87c-3e58-4839-95fb-7c964152ed7c","Type":"ContainerDied","Data":"45fed542ca68f1a0952f67e059b3608c789ba2870c5844fc8d81cb1615c75a5f"} Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.231135 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45fed542ca68f1a0952f67e059b3608c789ba2870c5844fc8d81cb1615c75a5f" Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.237088 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-6qr75"] Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.237422 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68df85789f-6qr75" podUID="bfa18767-ca55-45cd-bcc4-64e6e7572efe" containerName="dnsmasq-dns" containerID="cri-o://adf856034772ecbc05fd5a4160867c585a5e82539c38e5c7b35afa1ecd573991" gracePeriod=10 Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.246667 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a0d87c-3e58-4839-95fb-7c964152ed7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.247026 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4crf\" (UniqueName: \"kubernetes.io/projected/09a0d87c-3e58-4839-95fb-7c964152ed7c-kube-api-access-c4crf\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.309981 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a0d87c-3e58-4839-95fb-7c964152ed7c-config-data" (OuterVolumeSpecName: "config-data") pod "09a0d87c-3e58-4839-95fb-7c964152ed7c" (UID: "09a0d87c-3e58-4839-95fb-7c964152ed7c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:38 crc kubenswrapper[4723]: I0309 13:25:38.354473 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a0d87c-3e58-4839-95fb-7c964152ed7c-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.160672 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-znk5w" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.163956 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-6qr75" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.247085 4723 generic.go:334] "Generic (PLEG): container finished" podID="9917a47a-64be-47d0-a329-7380b87ac154" containerID="c7a671fe39a8559be5b523f88ce2c63e3968ea192ea3716cce6389438c3574a4" exitCode=0 Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.247155 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-znk5w" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.247174 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-znk5w" event={"ID":"9917a47a-64be-47d0-a329-7380b87ac154","Type":"ContainerDied","Data":"c7a671fe39a8559be5b523f88ce2c63e3968ea192ea3716cce6389438c3574a4"} Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.247550 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-znk5w" event={"ID":"9917a47a-64be-47d0-a329-7380b87ac154","Type":"ContainerDied","Data":"462a0c8afe6644f9d6cf0042d529f347eb7bc653f19b2417724a7bb535629b96"} Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.247974 4723 scope.go:117] "RemoveContainer" containerID="c7a671fe39a8559be5b523f88ce2c63e3968ea192ea3716cce6389438c3574a4" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.256163 4723 generic.go:334] "Generic (PLEG): container finished" podID="bfa18767-ca55-45cd-bcc4-64e6e7572efe" containerID="adf856034772ecbc05fd5a4160867c585a5e82539c38e5c7b35afa1ecd573991" exitCode=0 Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.256197 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-6qr75" event={"ID":"bfa18767-ca55-45cd-bcc4-64e6e7572efe","Type":"ContainerDied","Data":"adf856034772ecbc05fd5a4160867c585a5e82539c38e5c7b35afa1ecd573991"} Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.256217 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-6qr75" event={"ID":"bfa18767-ca55-45cd-bcc4-64e6e7572efe","Type":"ContainerDied","Data":"140cdb61153c2719fd628e7e82fdfbd17fc247c38400eefd74a9e77f581628af"} Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.256269 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-6qr75" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.269829 4723 scope.go:117] "RemoveContainer" containerID="6dbcc4b77677882c200bb673a7f1a29d8bd1638a9239b8af8b2145ed4803c7a9" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.294767 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9917a47a-64be-47d0-a329-7380b87ac154-catalog-content\") pod \"9917a47a-64be-47d0-a329-7380b87ac154\" (UID: \"9917a47a-64be-47d0-a329-7380b87ac154\") " Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.294825 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-dns-swift-storage-0\") pod \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.294903 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-dns-svc\") pod \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.294990 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-ovsdbserver-sb\") pod \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.295019 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bvms\" (UniqueName: \"kubernetes.io/projected/9917a47a-64be-47d0-a329-7380b87ac154-kube-api-access-2bvms\") pod \"9917a47a-64be-47d0-a329-7380b87ac154\" (UID: \"9917a47a-64be-47d0-a329-7380b87ac154\") " Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.295086 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-openstack-edpm-ipam\") pod \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.295115 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-config\") pod \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.295163 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9917a47a-64be-47d0-a329-7380b87ac154-utilities\") pod \"9917a47a-64be-47d0-a329-7380b87ac154\" (UID: \"9917a47a-64be-47d0-a329-7380b87ac154\") " Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.295284 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-ovsdbserver-nb\") pod \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.295331 4723 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zfjls\" (UniqueName: \"kubernetes.io/projected/bfa18767-ca55-45cd-bcc4-64e6e7572efe-kube-api-access-zfjls\") pod \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\" (UID: \"bfa18767-ca55-45cd-bcc4-64e6e7572efe\") " Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.296353 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9917a47a-64be-47d0-a329-7380b87ac154-utilities" (OuterVolumeSpecName: "utilities") pod "9917a47a-64be-47d0-a329-7380b87ac154" (UID: "9917a47a-64be-47d0-a329-7380b87ac154"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.300060 4723 scope.go:117] "RemoveContainer" containerID="45813ab33a1ef4c24cab8d3d96b5e9fa1a7afe25d6db8aefdb236843d5b3428c" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.304053 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfa18767-ca55-45cd-bcc4-64e6e7572efe-kube-api-access-zfjls" (OuterVolumeSpecName: "kube-api-access-zfjls") pod "bfa18767-ca55-45cd-bcc4-64e6e7572efe" (UID: "bfa18767-ca55-45cd-bcc4-64e6e7572efe"). InnerVolumeSpecName "kube-api-access-zfjls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.308826 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9917a47a-64be-47d0-a329-7380b87ac154-kube-api-access-2bvms" (OuterVolumeSpecName: "kube-api-access-2bvms") pod "9917a47a-64be-47d0-a329-7380b87ac154" (UID: "9917a47a-64be-47d0-a329-7380b87ac154"). InnerVolumeSpecName "kube-api-access-2bvms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.339506 4723 scope.go:117] "RemoveContainer" containerID="c7a671fe39a8559be5b523f88ce2c63e3968ea192ea3716cce6389438c3574a4" Mar 09 13:25:39 crc kubenswrapper[4723]: E0309 13:25:39.340420 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7a671fe39a8559be5b523f88ce2c63e3968ea192ea3716cce6389438c3574a4\": container with ID starting with c7a671fe39a8559be5b523f88ce2c63e3968ea192ea3716cce6389438c3574a4 not found: ID does not exist" containerID="c7a671fe39a8559be5b523f88ce2c63e3968ea192ea3716cce6389438c3574a4" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.340456 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7a671fe39a8559be5b523f88ce2c63e3968ea192ea3716cce6389438c3574a4"} err="failed to get container status \"c7a671fe39a8559be5b523f88ce2c63e3968ea192ea3716cce6389438c3574a4\": rpc error: code = NotFound desc = could not find container \"c7a671fe39a8559be5b523f88ce2c63e3968ea192ea3716cce6389438c3574a4\": container with ID starting with c7a671fe39a8559be5b523f88ce2c63e3968ea192ea3716cce6389438c3574a4 not found: ID does not exist" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.340498 4723 scope.go:117] "RemoveContainer" containerID="6dbcc4b77677882c200bb673a7f1a29d8bd1638a9239b8af8b2145ed4803c7a9" Mar 09 13:25:39 crc kubenswrapper[4723]: E0309 13:25:39.341460 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dbcc4b77677882c200bb673a7f1a29d8bd1638a9239b8af8b2145ed4803c7a9\": container with ID starting with 
6dbcc4b77677882c200bb673a7f1a29d8bd1638a9239b8af8b2145ed4803c7a9 not found: ID does not exist" containerID="6dbcc4b77677882c200bb673a7f1a29d8bd1638a9239b8af8b2145ed4803c7a9" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.341499 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dbcc4b77677882c200bb673a7f1a29d8bd1638a9239b8af8b2145ed4803c7a9"} err="failed to get container status \"6dbcc4b77677882c200bb673a7f1a29d8bd1638a9239b8af8b2145ed4803c7a9\": rpc error: code = NotFound desc = could not find container \"6dbcc4b77677882c200bb673a7f1a29d8bd1638a9239b8af8b2145ed4803c7a9\": container with ID starting with 6dbcc4b77677882c200bb673a7f1a29d8bd1638a9239b8af8b2145ed4803c7a9 not found: ID does not exist" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.341522 4723 scope.go:117] "RemoveContainer" containerID="45813ab33a1ef4c24cab8d3d96b5e9fa1a7afe25d6db8aefdb236843d5b3428c" Mar 09 13:25:39 crc kubenswrapper[4723]: E0309 13:25:39.341770 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45813ab33a1ef4c24cab8d3d96b5e9fa1a7afe25d6db8aefdb236843d5b3428c\": container with ID starting with 45813ab33a1ef4c24cab8d3d96b5e9fa1a7afe25d6db8aefdb236843d5b3428c not found: ID does not exist" containerID="45813ab33a1ef4c24cab8d3d96b5e9fa1a7afe25d6db8aefdb236843d5b3428c" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.341811 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45813ab33a1ef4c24cab8d3d96b5e9fa1a7afe25d6db8aefdb236843d5b3428c"} err="failed to get container status \"45813ab33a1ef4c24cab8d3d96b5e9fa1a7afe25d6db8aefdb236843d5b3428c\": rpc error: code = NotFound desc = could not find container \"45813ab33a1ef4c24cab8d3d96b5e9fa1a7afe25d6db8aefdb236843d5b3428c\": container with ID starting with 45813ab33a1ef4c24cab8d3d96b5e9fa1a7afe25d6db8aefdb236843d5b3428c not found: ID does not exist" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.341826 4723 scope.go:117] "RemoveContainer" containerID="adf856034772ecbc05fd5a4160867c585a5e82539c38e5c7b35afa1ecd573991" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.378062 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9917a47a-64be-47d0-a329-7380b87ac154-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9917a47a-64be-47d0-a329-7380b87ac154" (UID: "9917a47a-64be-47d0-a329-7380b87ac154"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.394236 4723 scope.go:117] "RemoveContainer" containerID="43ef4f7e8b52c8a1edcc39bb5230605b6656c6dea9f355a6e4344b068563a4db" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.397315 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-config" (OuterVolumeSpecName: "config") pod "bfa18767-ca55-45cd-bcc4-64e6e7572efe" (UID: "bfa18767-ca55-45cd-bcc4-64e6e7572efe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.397602 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfjls\" (UniqueName: \"kubernetes.io/projected/bfa18767-ca55-45cd-bcc4-64e6e7572efe-kube-api-access-zfjls\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.397627 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9917a47a-64be-47d0-a329-7380b87ac154-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.397637 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bvms\" (UniqueName: \"kubernetes.io/projected/9917a47a-64be-47d0-a329-7380b87ac154-kube-api-access-2bvms\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.397645 4723 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.397654 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9917a47a-64be-47d0-a329-7380b87ac154-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.402423 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bfa18767-ca55-45cd-bcc4-64e6e7572efe" (UID: "bfa18767-ca55-45cd-bcc4-64e6e7572efe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.420966 4723 scope.go:117] "RemoveContainer" containerID="adf856034772ecbc05fd5a4160867c585a5e82539c38e5c7b35afa1ecd573991" Mar 09 13:25:39 crc kubenswrapper[4723]: E0309 13:25:39.421524 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adf856034772ecbc05fd5a4160867c585a5e82539c38e5c7b35afa1ecd573991\": container with ID starting with adf856034772ecbc05fd5a4160867c585a5e82539c38e5c7b35afa1ecd573991 not found: ID does not exist" containerID="adf856034772ecbc05fd5a4160867c585a5e82539c38e5c7b35afa1ecd573991" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.421562 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf856034772ecbc05fd5a4160867c585a5e82539c38e5c7b35afa1ecd573991"} err="failed to get container status \"adf856034772ecbc05fd5a4160867c585a5e82539c38e5c7b35afa1ecd573991\": rpc error: code = NotFound desc = could not find container \"adf856034772ecbc05fd5a4160867c585a5e82539c38e5c7b35afa1ecd573991\": container with ID starting with adf856034772ecbc05fd5a4160867c585a5e82539c38e5c7b35afa1ecd573991 not found: ID does not exist" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.421583 4723 scope.go:117] "RemoveContainer" containerID="43ef4f7e8b52c8a1edcc39bb5230605b6656c6dea9f355a6e4344b068563a4db" Mar 09 13:25:39 crc kubenswrapper[4723]: E0309 13:25:39.422169 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ef4f7e8b52c8a1edcc39bb5230605b6656c6dea9f355a6e4344b068563a4db\": container with ID starting with 
43ef4f7e8b52c8a1edcc39bb5230605b6656c6dea9f355a6e4344b068563a4db not found: ID does not exist" containerID="43ef4f7e8b52c8a1edcc39bb5230605b6656c6dea9f355a6e4344b068563a4db" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.422216 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ef4f7e8b52c8a1edcc39bb5230605b6656c6dea9f355a6e4344b068563a4db"} err="failed to get container status \"43ef4f7e8b52c8a1edcc39bb5230605b6656c6dea9f355a6e4344b068563a4db\": rpc error: code = NotFound desc = could not find container \"43ef4f7e8b52c8a1edcc39bb5230605b6656c6dea9f355a6e4344b068563a4db\": container with ID starting with 43ef4f7e8b52c8a1edcc39bb5230605b6656c6dea9f355a6e4344b068563a4db not found: ID does not exist" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.430939 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bfa18767-ca55-45cd-bcc4-64e6e7572efe" (UID: "bfa18767-ca55-45cd-bcc4-64e6e7572efe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.432554 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bfa18767-ca55-45cd-bcc4-64e6e7572efe" (UID: "bfa18767-ca55-45cd-bcc4-64e6e7572efe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.442138 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bfa18767-ca55-45cd-bcc4-64e6e7572efe" (UID: "bfa18767-ca55-45cd-bcc4-64e6e7572efe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.456086 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "bfa18767-ca55-45cd-bcc4-64e6e7572efe" (UID: "bfa18767-ca55-45cd-bcc4-64e6e7572efe"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.500290 4723 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.500328 4723 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.500342 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.500354 4723 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.500367 4723 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa18767-ca55-45cd-bcc4-64e6e7572efe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.604962 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-znk5w"] Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.622815 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-znk5w"] Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.634437 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-6qr75"] Mar 09 13:25:39 crc kubenswrapper[4723]: I0309 13:25:39.646602 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-6qr75"] Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.271544 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df18bf19-d23a-471f-8074-2eaaa7c4aead","Type":"ContainerStarted","Data":"0d55dea8828b081d25249319534de8d0f9c9cf5705270b5351aa3833d425ddc8"} Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.271939 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df18bf19-d23a-471f-8074-2eaaa7c4aead","Type":"ContainerStarted","Data":"c6a3d9b337c6fb2fe0ae818452ff1c102da1eeddfb79b6a7a090eaffaa3c1bdf"} Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.377840 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-cd9b85f6c-jhcds"] Mar 09 13:25:40 crc kubenswrapper[4723]: E0309 13:25:40.379213 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9917a47a-64be-47d0-a329-7380b87ac154" containerName="registry-server" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.379242 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9917a47a-64be-47d0-a329-7380b87ac154" containerName="registry-server" Mar 09 13:25:40 crc kubenswrapper[4723]: E0309 13:25:40.379291 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa18767-ca55-45cd-bcc4-64e6e7572efe" containerName="init" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.379300 4723 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bfa18767-ca55-45cd-bcc4-64e6e7572efe" containerName="init" Mar 09 13:25:40 crc kubenswrapper[4723]: E0309 13:25:40.379315 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a0d87c-3e58-4839-95fb-7c964152ed7c" containerName="heat-db-sync" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.379323 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a0d87c-3e58-4839-95fb-7c964152ed7c" containerName="heat-db-sync" Mar 09 13:25:40 crc kubenswrapper[4723]: E0309 13:25:40.379349 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9917a47a-64be-47d0-a329-7380b87ac154" containerName="extract-content" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.379356 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9917a47a-64be-47d0-a329-7380b87ac154" containerName="extract-content" Mar 09 13:25:40 crc kubenswrapper[4723]: E0309 13:25:40.379397 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9917a47a-64be-47d0-a329-7380b87ac154" containerName="extract-utilities" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.379405 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9917a47a-64be-47d0-a329-7380b87ac154" containerName="extract-utilities" Mar 09 13:25:40 crc kubenswrapper[4723]: E0309 13:25:40.379434 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa18767-ca55-45cd-bcc4-64e6e7572efe" containerName="dnsmasq-dns" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.379445 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa18767-ca55-45cd-bcc4-64e6e7572efe" containerName="dnsmasq-dns" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.393441 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="09a0d87c-3e58-4839-95fb-7c964152ed7c" containerName="heat-db-sync" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.393502 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa18767-ca55-45cd-bcc4-64e6e7572efe" containerName="dnsmasq-dns" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.393559 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="9917a47a-64be-47d0-a329-7380b87ac154" containerName="registry-server" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.397653 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.428354 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/655446a7-da12-4146-bf48-9a7a74765c5c-config-data\") pod \"heat-engine-cd9b85f6c-jhcds\" (UID: \"655446a7-da12-4146-bf48-9a7a74765c5c\") " pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.428451 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/655446a7-da12-4146-bf48-9a7a74765c5c-config-data-custom\") pod \"heat-engine-cd9b85f6c-jhcds\" (UID: \"655446a7-da12-4146-bf48-9a7a74765c5c\") " pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.428498 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zx87\" (UniqueName: \"kubernetes.io/projected/655446a7-da12-4146-bf48-9a7a74765c5c-kube-api-access-7zx87\") pod \"heat-engine-cd9b85f6c-jhcds\" (UID: \"655446a7-da12-4146-bf48-9a7a74765c5c\") " pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.428664 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655446a7-da12-4146-bf48-9a7a74765c5c-combined-ca-bundle\") pod \"heat-engine-cd9b85f6c-jhcds\" (UID: \"655446a7-da12-4146-bf48-9a7a74765c5c\") " pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.457907 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-cd9b85f6c-jhcds"] Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.482938 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6c4d4cb58-tqwpm"] Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.485083 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.508299 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6c4d4cb58-tqwpm"] Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.522231 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-85fdddbb7f-wkltg"] Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.524077 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.530386 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zx87\" (UniqueName: \"kubernetes.io/projected/655446a7-da12-4146-bf48-9a7a74765c5c-kube-api-access-7zx87\") pod \"heat-engine-cd9b85f6c-jhcds\" (UID: \"655446a7-da12-4146-bf48-9a7a74765c5c\") " pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.530446 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-combined-ca-bundle\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.530513 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-public-tls-certs\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.530589 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-config-data-custom\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.530608 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-internal-tls-certs\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.530633 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-config-data\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.530711 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655446a7-da12-4146-bf48-9a7a74765c5c-combined-ca-bundle\") pod \"heat-engine-cd9b85f6c-jhcds\" (UID: \"655446a7-da12-4146-bf48-9a7a74765c5c\") " pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.530753 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/655446a7-da12-4146-bf48-9a7a74765c5c-config-data\") pod \"heat-engine-cd9b85f6c-jhcds\" (UID: \"655446a7-da12-4146-bf48-9a7a74765c5c\") " pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.530792 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/655446a7-da12-4146-bf48-9a7a74765c5c-config-data-custom\") pod \"heat-engine-cd9b85f6c-jhcds\" (UID: 
\"655446a7-da12-4146-bf48-9a7a74765c5c\") " pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.530818 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-799t7\" (UniqueName: \"kubernetes.io/projected/e28e2823-9dea-4b47-9489-47442b2cfa08-kube-api-access-799t7\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.533598 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-85fdddbb7f-wkltg"] Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.536134 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/655446a7-da12-4146-bf48-9a7a74765c5c-config-data-custom\") pod \"heat-engine-cd9b85f6c-jhcds\" (UID: \"655446a7-da12-4146-bf48-9a7a74765c5c\") " pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.536927 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655446a7-da12-4146-bf48-9a7a74765c5c-combined-ca-bundle\") pod \"heat-engine-cd9b85f6c-jhcds\" (UID: \"655446a7-da12-4146-bf48-9a7a74765c5c\") " pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.538940 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/655446a7-da12-4146-bf48-9a7a74765c5c-config-data\") pod \"heat-engine-cd9b85f6c-jhcds\" (UID: \"655446a7-da12-4146-bf48-9a7a74765c5c\") " pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.546024 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zx87\" (UniqueName: \"kubernetes.io/projected/655446a7-da12-4146-bf48-9a7a74765c5c-kube-api-access-7zx87\") pod \"heat-engine-cd9b85f6c-jhcds\" (UID: \"655446a7-da12-4146-bf48-9a7a74765c5c\") " pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.632453 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-public-tls-certs\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.632538 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-internal-tls-certs\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.632576 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-config-data-custom\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.632599 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-internal-tls-certs\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.632622 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-config-data\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.632657 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-public-tls-certs\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.632691 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-config-data-custom\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.632739 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrprc\" (UniqueName: \"kubernetes.io/projected/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-kube-api-access-rrprc\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.632764 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-combined-ca-bundle\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.632813 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-799t7\" (UniqueName: \"kubernetes.io/projected/e28e2823-9dea-4b47-9489-47442b2cfa08-kube-api-access-799t7\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.632841 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-config-data\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.632889 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-combined-ca-bundle\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.635846 4723 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-public-tls-certs\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.636563 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-combined-ca-bundle\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.639366 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-config-data\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.640069 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-internal-tls-certs\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.640761 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e28e2823-9dea-4b47-9489-47442b2cfa08-config-data-custom\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.647664 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-799t7\" (UniqueName: \"kubernetes.io/projected/e28e2823-9dea-4b47-9489-47442b2cfa08-kube-api-access-799t7\") pod \"heat-api-6c4d4cb58-tqwpm\" (UID: \"e28e2823-9dea-4b47-9489-47442b2cfa08\") " pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.720010 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.734945 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-internal-tls-certs\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.735008 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-public-tls-certs\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.735043 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-config-data-custom\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.735092 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrprc\" (UniqueName: \"kubernetes.io/projected/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-kube-api-access-rrprc\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.735119 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-combined-ca-bundle\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.735179 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-config-data\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.739254 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-combined-ca-bundle\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.739601 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-internal-tls-certs\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.743741 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-config-data\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " 
pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.746780 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-config-data-custom\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.746894 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-public-tls-certs\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.758199 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrprc\" (UniqueName: \"kubernetes.io/projected/3bc1a36b-8585-4de5-aca1-32edc3ef3b8d-kube-api-access-rrprc\") pod \"heat-cfnapi-85fdddbb7f-wkltg\" (UID: \"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d\") " pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.816826 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.892434 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.918091 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9917a47a-64be-47d0-a329-7380b87ac154" path="/var/lib/kubelet/pods/9917a47a-64be-47d0-a329-7380b87ac154/volumes" Mar 09 13:25:40 crc kubenswrapper[4723]: I0309 13:25:40.919090 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfa18767-ca55-45cd-bcc4-64e6e7572efe" path="/var/lib/kubelet/pods/bfa18767-ca55-45cd-bcc4-64e6e7572efe/volumes" Mar 09 13:25:42 crc kubenswrapper[4723]: I0309 13:25:41.268256 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-cd9b85f6c-jhcds"] Mar 09 13:25:42 crc kubenswrapper[4723]: I0309 13:25:41.295064 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-cd9b85f6c-jhcds" event={"ID":"655446a7-da12-4146-bf48-9a7a74765c5c","Type":"ContainerStarted","Data":"829ba90b17cfb5336f29f4281d34e444232dbb99b242918caa411b45d894aada"} Mar 09 13:25:42 crc kubenswrapper[4723]: I0309 13:25:42.155308 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6c4d4cb58-tqwpm"] Mar 09 13:25:42 crc kubenswrapper[4723]: W0309 13:25:42.155826 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode28e2823_9dea_4b47_9489_47442b2cfa08.slice/crio-49566a76318e49e2fe7298ecb7e61246222aaa3131ef3adddea0df62a7e01518 WatchSource:0}: Error finding container 49566a76318e49e2fe7298ecb7e61246222aaa3131ef3adddea0df62a7e01518: Status 404 returned error can't find the container with id 49566a76318e49e2fe7298ecb7e61246222aaa3131ef3adddea0df62a7e01518 Mar 09 13:25:42 crc kubenswrapper[4723]: I0309 13:25:42.167199 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-85fdddbb7f-wkltg"] Mar 09 13:25:42 crc kubenswrapper[4723]: I0309 13:25:42.310737 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-engine-cd9b85f6c-jhcds" event={"ID":"655446a7-da12-4146-bf48-9a7a74765c5c","Type":"ContainerStarted","Data":"56b12060e5132d232912696eed6118b8d1a3e74b0846bc86980a0538950f76fe"} Mar 09 13:25:42 crc kubenswrapper[4723]: I0309 13:25:42.310814 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:25:42 crc kubenswrapper[4723]: I0309 13:25:42.313750 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6c4d4cb58-tqwpm" event={"ID":"e28e2823-9dea-4b47-9489-47442b2cfa08","Type":"ContainerStarted","Data":"49566a76318e49e2fe7298ecb7e61246222aaa3131ef3adddea0df62a7e01518"} Mar 09 13:25:42 crc kubenswrapper[4723]: I0309 13:25:42.315718 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" event={"ID":"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d","Type":"ContainerStarted","Data":"9da3cb58581bb8b77f5342307b1a3f90759ef9fe676e7aeaada20e6f3b26648d"} Mar 09 13:25:42 crc kubenswrapper[4723]: I0309 13:25:42.340853 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-cd9b85f6c-jhcds" podStartSLOduration=2.3408320590000002 podStartE2EDuration="2.340832059s" podCreationTimestamp="2026-03-09 13:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:42.324674412 +0000 UTC m=+1616.339141962" watchObservedRunningTime="2026-03-09 13:25:42.340832059 +0000 UTC m=+1616.355299599" Mar 09 13:25:43 crc kubenswrapper[4723]: I0309 13:25:43.362066 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df18bf19-d23a-471f-8074-2eaaa7c4aead","Type":"ContainerStarted","Data":"6919a69720391ba285fbb945f6b483d4bb4f0c3cd5b88f0c5c553f1e799963f7"} Mar 09 13:25:43 crc kubenswrapper[4723]: I0309 13:25:43.362621 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:25:43 crc kubenswrapper[4723]: I0309 13:25:43.389242 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.470203135 podStartE2EDuration="10.389224129s" podCreationTimestamp="2026-03-09 13:25:33 +0000 UTC" firstStartedPulling="2026-03-09 13:25:34.018605348 +0000 UTC m=+1608.033072888" lastFinishedPulling="2026-03-09 13:25:41.937626332 +0000 UTC m=+1615.952093882" observedRunningTime="2026-03-09 13:25:43.386559109 +0000 UTC m=+1617.401026649" watchObservedRunningTime="2026-03-09 13:25:43.389224129 +0000 UTC m=+1617.403691679" Mar 09 13:25:45 crc kubenswrapper[4723]: I0309 13:25:45.381735 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6c4d4cb58-tqwpm" event={"ID":"e28e2823-9dea-4b47-9489-47442b2cfa08","Type":"ContainerStarted","Data":"a23d32438ba937d34a6cc3ba2770eccc11c29ffce5a569998f6097eda8ac074b"} Mar 09 13:25:45 crc kubenswrapper[4723]: I0309 13:25:45.382295 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:45 crc kubenswrapper[4723]: I0309 13:25:45.383837 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" event={"ID":"3bc1a36b-8585-4de5-aca1-32edc3ef3b8d","Type":"ContainerStarted","Data":"38cdb3d840c48ea01a76b632f017f50c2c247d25c6b1b91b3228fba12485410d"} Mar 09 13:25:45 crc kubenswrapper[4723]: I0309 13:25:45.383964 4723 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:45 crc kubenswrapper[4723]: I0309 13:25:45.416739 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6c4d4cb58-tqwpm" podStartSLOduration=3.452182504 podStartE2EDuration="5.41672131s" podCreationTimestamp="2026-03-09 13:25:40 +0000 UTC" firstStartedPulling="2026-03-09 13:25:42.159674301 +0000 UTC m=+1616.174141851" lastFinishedPulling="2026-03-09 13:25:44.124213117 +0000 UTC m=+1618.138680657" observedRunningTime="2026-03-09 13:25:45.401085157 +0000 UTC m=+1619.415552717" watchObservedRunningTime="2026-03-09 13:25:45.41672131 +0000 UTC m=+1619.431188850" Mar 09 13:25:45 crc kubenswrapper[4723]: I0309 13:25:45.445478 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" podStartSLOduration=3.48412766 podStartE2EDuration="5.44545388s" podCreationTimestamp="2026-03-09 13:25:40 +0000 UTC" firstStartedPulling="2026-03-09 13:25:42.161206072 +0000 UTC m=+1616.175673612" lastFinishedPulling="2026-03-09 13:25:44.122532302 +0000 UTC m=+1618.136999832" observedRunningTime="2026-03-09 13:25:45.428737438 +0000 UTC m=+1619.443205018" watchObservedRunningTime="2026-03-09 13:25:45.44545388 +0000 UTC m=+1619.459921430" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.096645 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh"] Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.098892 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.102878 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.103102 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.103546 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.103672 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.139874 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh"] Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.161260 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.161380 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 
13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.161430 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.161456 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2kds\" (UniqueName: \"kubernetes.io/projected/6a2886a0-218f-4284-aacd-19614f6f602f-kube-api-access-n2kds\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.263076 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.263171 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.263211 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.263233 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2kds\" (UniqueName: \"kubernetes.io/projected/6a2886a0-218f-4284-aacd-19614f6f602f-kube-api-access-n2kds\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.279505 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.280932 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh\" (UID: 
\"6a2886a0-218f-4284-aacd-19614f6f602f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.284292 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.305671 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2kds\" (UniqueName: \"kubernetes.io/projected/6a2886a0-218f-4284-aacd-19614f6f602f-kube-api-access-n2kds\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:25:53 crc kubenswrapper[4723]: I0309 13:25:53.418559 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:25:54 crc kubenswrapper[4723]: I0309 13:25:54.503774 4723 generic.go:334] "Generic (PLEG): container finished" podID="2e967475-660d-4ada-b409-bae77e4f6905" containerID="69686ee38e6c83423b412b76dcb2ac42567f3a941394103f109c7fe4803a283d" exitCode=0 Mar 09 13:25:54 crc kubenswrapper[4723]: I0309 13:25:54.503871 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"2e967475-660d-4ada-b409-bae77e4f6905","Type":"ContainerDied","Data":"69686ee38e6c83423b412b76dcb2ac42567f3a941394103f109c7fe4803a283d"} Mar 09 13:25:54 crc kubenswrapper[4723]: I0309 13:25:54.506813 4723 generic.go:334] "Generic (PLEG): container finished" podID="fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d" containerID="e8ab742b2d0e81a16863f2986110e7446d2af694a345fcba00cc2e44f5825c84" exitCode=0 Mar 09 13:25:54 crc kubenswrapper[4723]: I0309 13:25:54.506899 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d","Type":"ContainerDied","Data":"e8ab742b2d0e81a16863f2986110e7446d2af694a345fcba00cc2e44f5825c84"} Mar 09 13:25:55 crc kubenswrapper[4723]: I0309 13:25:55.036792 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh"] Mar 09 13:25:55 crc kubenswrapper[4723]: I0309 13:25:55.482351 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6c4d4cb58-tqwpm" Mar 09 13:25:55 crc kubenswrapper[4723]: I0309 13:25:55.529554 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d","Type":"ContainerStarted","Data":"a8688fc669559759464fa3593d4b8254cdef48dfae939710988501a772cdf688"} Mar 09 13:25:55 crc kubenswrapper[4723]: I0309 13:25:55.530039 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:25:55 crc kubenswrapper[4723]: I0309 13:25:55.532502 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" event={"ID":"6a2886a0-218f-4284-aacd-19614f6f602f","Type":"ContainerStarted","Data":"68c2b6eeea7cd07d1618e16eef1808b774b90c7b0408be0e86193885c05defa1"} Mar 09 13:25:55 crc kubenswrapper[4723]: I0309 13:25:55.562227 4723 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"2e967475-660d-4ada-b409-bae77e4f6905","Type":"ContainerStarted","Data":"23b21764fb54da6e8cfa2a245bc9a1257817589cca78cdaa7039f77646e77e97"} Mar 09 13:25:55 crc kubenswrapper[4723]: I0309 13:25:55.563420 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 09 13:25:55 crc kubenswrapper[4723]: I0309 13:25:55.590002 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-68d98b8999-qqz47"] Mar 09 13:25:55 crc kubenswrapper[4723]: I0309 13:25:55.590272 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-68d98b8999-qqz47" podUID="a4841a92-8277-45f9-b366-8913a20ec8ad" containerName="heat-api" containerID="cri-o://39f1a1293b39599f2405d7759e1a23a7dee3c5c94e095be6f36696d10beb6dee" gracePeriod=60 Mar 09 13:25:55 crc kubenswrapper[4723]: I0309 13:25:55.599518 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.599493537 podStartE2EDuration="37.599493537s" podCreationTimestamp="2026-03-09 13:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:55.586269838 +0000 UTC m=+1629.600737378" watchObservedRunningTime="2026-03-09 13:25:55.599493537 +0000 UTC m=+1629.613961077" Mar 09 13:25:55 crc kubenswrapper[4723]: I0309 13:25:55.653972 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=38.653952247 podStartE2EDuration="38.653952247s" podCreationTimestamp="2026-03-09 13:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:55.630847886 +0000 UTC m=+1629.645315446" watchObservedRunningTime="2026-03-09 13:25:55.653952247 +0000 UTC m=+1629.668419797" Mar 09 13:25:56 crc kubenswrapper[4723]: I0309 13:25:56.358915 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-85fdddbb7f-wkltg" Mar 09 13:25:56 crc kubenswrapper[4723]: I0309 13:25:56.442527 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6578b64f7d-9cxnx"] Mar 09 13:25:56 crc kubenswrapper[4723]: I0309 13:25:56.442817 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" podUID="9b314084-941d-4d00-bae6-6fdce2dc24db" containerName="heat-cfnapi" containerID="cri-o://1848579cb3ae52f7204c15281b7aa115c813e778cf2db883f38374cd706d1d90" gracePeriod=60 Mar 09 13:25:58 crc kubenswrapper[4723]: I0309 13:25:58.836492 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-68d98b8999-qqz47" podUID="a4841a92-8277-45f9-b366-8913a20ec8ad" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.237:8004/healthcheck\": read tcp 10.217.0.2:55942->10.217.0.237:8004: read: connection reset by peer" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.479540 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.500439 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-internal-tls-certs\") pod \"a4841a92-8277-45f9-b366-8913a20ec8ad\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.500679 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-combined-ca-bundle\") pod \"a4841a92-8277-45f9-b366-8913a20ec8ad\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.500739 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-config-data-custom\") pod \"a4841a92-8277-45f9-b366-8913a20ec8ad\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.500762 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-public-tls-certs\") pod \"a4841a92-8277-45f9-b366-8913a20ec8ad\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.500782 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbs6l\" (UniqueName: \"kubernetes.io/projected/a4841a92-8277-45f9-b366-8913a20ec8ad-kube-api-access-vbs6l\") pod \"a4841a92-8277-45f9-b366-8913a20ec8ad\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.500801 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-config-data\") pod \"a4841a92-8277-45f9-b366-8913a20ec8ad\" (UID: \"a4841a92-8277-45f9-b366-8913a20ec8ad\") " Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.514932 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a4841a92-8277-45f9-b366-8913a20ec8ad" (UID: "a4841a92-8277-45f9-b366-8913a20ec8ad"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.515609 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4841a92-8277-45f9-b366-8913a20ec8ad-kube-api-access-vbs6l" (OuterVolumeSpecName: "kube-api-access-vbs6l") pod "a4841a92-8277-45f9-b366-8913a20ec8ad" (UID: "a4841a92-8277-45f9-b366-8913a20ec8ad"). InnerVolumeSpecName "kube-api-access-vbs6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.584962 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4841a92-8277-45f9-b366-8913a20ec8ad" (UID: "a4841a92-8277-45f9-b366-8913a20ec8ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.611248 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.611288 4723 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.611302 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbs6l\" (UniqueName: \"kubernetes.io/projected/a4841a92-8277-45f9-b366-8913a20ec8ad-kube-api-access-vbs6l\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.632766 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-config-data" (OuterVolumeSpecName: "config-data") pod "a4841a92-8277-45f9-b366-8913a20ec8ad" (UID: "a4841a92-8277-45f9-b366-8913a20ec8ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.671527 4723 generic.go:334] "Generic (PLEG): container finished" podID="a4841a92-8277-45f9-b366-8913a20ec8ad" containerID="39f1a1293b39599f2405d7759e1a23a7dee3c5c94e095be6f36696d10beb6dee" exitCode=0 Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.671571 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-68d98b8999-qqz47" event={"ID":"a4841a92-8277-45f9-b366-8913a20ec8ad","Type":"ContainerDied","Data":"39f1a1293b39599f2405d7759e1a23a7dee3c5c94e095be6f36696d10beb6dee"} Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.671595 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-68d98b8999-qqz47" event={"ID":"a4841a92-8277-45f9-b366-8913a20ec8ad","Type":"ContainerDied","Data":"397c9ac834a8653b4e8065ceeef8bd7bd7300bf37d394d257680654a0671cba1"} Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.671612 4723 scope.go:117] "RemoveContainer" containerID="39f1a1293b39599f2405d7759e1a23a7dee3c5c94e095be6f36696d10beb6dee" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.671690 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-68d98b8999-qqz47" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.715643 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.719051 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a4841a92-8277-45f9-b366-8913a20ec8ad" (UID: "a4841a92-8277-45f9-b366-8913a20ec8ad"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.724461 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a4841a92-8277-45f9-b366-8913a20ec8ad" (UID: "a4841a92-8277-45f9-b366-8913a20ec8ad"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.738662 4723 scope.go:117] "RemoveContainer" containerID="39f1a1293b39599f2405d7759e1a23a7dee3c5c94e095be6f36696d10beb6dee" Mar 09 13:25:59 crc kubenswrapper[4723]: E0309 13:25:59.741921 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39f1a1293b39599f2405d7759e1a23a7dee3c5c94e095be6f36696d10beb6dee\": container with ID starting with 39f1a1293b39599f2405d7759e1a23a7dee3c5c94e095be6f36696d10beb6dee not found: ID does not exist" containerID="39f1a1293b39599f2405d7759e1a23a7dee3c5c94e095be6f36696d10beb6dee" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.741974 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39f1a1293b39599f2405d7759e1a23a7dee3c5c94e095be6f36696d10beb6dee"} err="failed to get container status \"39f1a1293b39599f2405d7759e1a23a7dee3c5c94e095be6f36696d10beb6dee\": rpc error: code = NotFound desc = could not find container \"39f1a1293b39599f2405d7759e1a23a7dee3c5c94e095be6f36696d10beb6dee\": container with ID starting with 39f1a1293b39599f2405d7759e1a23a7dee3c5c94e095be6f36696d10beb6dee not found: ID does not exist" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.818452 4723 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4723]: I0309 13:25:59.818483 4723 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4841a92-8277-45f9-b366-8913a20ec8ad-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.033622 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-68d98b8999-qqz47"] Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.056650 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-68d98b8999-qqz47"] Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.175824 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551046-t2lf4"] Mar 09 13:26:00 crc kubenswrapper[4723]: E0309 13:26:00.176566 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4841a92-8277-45f9-b366-8913a20ec8ad" containerName="heat-api" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.176728 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4841a92-8277-45f9-b366-8913a20ec8ad" containerName="heat-api" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.177103 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4841a92-8277-45f9-b366-8913a20ec8ad" containerName="heat-api" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.178003 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-t2lf4" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.185562 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.185800 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.185939 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.189669 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-t2lf4"] Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.233869 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfq85\" (UniqueName: \"kubernetes.io/projected/f1680395-afcc-4923-a2e8-dcdc08604cda-kube-api-access-dfq85\") pod \"auto-csr-approver-29551046-t2lf4\" (UID: \"f1680395-afcc-4923-a2e8-dcdc08604cda\") " pod="openshift-infra/auto-csr-approver-29551046-t2lf4" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.335991 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfq85\" (UniqueName: \"kubernetes.io/projected/f1680395-afcc-4923-a2e8-dcdc08604cda-kube-api-access-dfq85\") pod \"auto-csr-approver-29551046-t2lf4\" (UID: \"f1680395-afcc-4923-a2e8-dcdc08604cda\") " pod="openshift-infra/auto-csr-approver-29551046-t2lf4" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.355030 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfq85\" (UniqueName: \"kubernetes.io/projected/f1680395-afcc-4923-a2e8-dcdc08604cda-kube-api-access-dfq85\") pod \"auto-csr-approver-29551046-t2lf4\" (UID: \"f1680395-afcc-4923-a2e8-dcdc08604cda\") " pod="openshift-infra/auto-csr-approver-29551046-t2lf4" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.464475 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.515497 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-t2lf4" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.541505 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-combined-ca-bundle\") pod \"9b314084-941d-4d00-bae6-6fdce2dc24db\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.541546 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-internal-tls-certs\") pod \"9b314084-941d-4d00-bae6-6fdce2dc24db\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.541580 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-config-data\") pod \"9b314084-941d-4d00-bae6-6fdce2dc24db\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.541613 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-config-data-custom\") pod \"9b314084-941d-4d00-bae6-6fdce2dc24db\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.541628 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-public-tls-certs\") pod \"9b314084-941d-4d00-bae6-6fdce2dc24db\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.541652 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqxmp\" (UniqueName: \"kubernetes.io/projected/9b314084-941d-4d00-bae6-6fdce2dc24db-kube-api-access-rqxmp\") pod \"9b314084-941d-4d00-bae6-6fdce2dc24db\" (UID: \"9b314084-941d-4d00-bae6-6fdce2dc24db\") " Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.582596 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9b314084-941d-4d00-bae6-6fdce2dc24db" (UID: "9b314084-941d-4d00-bae6-6fdce2dc24db"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.583687 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b314084-941d-4d00-bae6-6fdce2dc24db-kube-api-access-rqxmp" (OuterVolumeSpecName: "kube-api-access-rqxmp") pod "9b314084-941d-4d00-bae6-6fdce2dc24db" (UID: "9b314084-941d-4d00-bae6-6fdce2dc24db"). InnerVolumeSpecName "kube-api-access-rqxmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.593040 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b314084-941d-4d00-bae6-6fdce2dc24db" (UID: "9b314084-941d-4d00-bae6-6fdce2dc24db"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.634426 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-config-data" (OuterVolumeSpecName: "config-data") pod "9b314084-941d-4d00-bae6-6fdce2dc24db" (UID: "9b314084-941d-4d00-bae6-6fdce2dc24db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.645094 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.645133 4723 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.645148 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqxmp\" (UniqueName: \"kubernetes.io/projected/9b314084-941d-4d00-bae6-6fdce2dc24db-kube-api-access-rqxmp\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.645160 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.656837 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9b314084-941d-4d00-bae6-6fdce2dc24db" (UID: "9b314084-941d-4d00-bae6-6fdce2dc24db"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.671464 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9b314084-941d-4d00-bae6-6fdce2dc24db" (UID: "9b314084-941d-4d00-bae6-6fdce2dc24db"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.698733 4723 generic.go:334] "Generic (PLEG): container finished" podID="9b314084-941d-4d00-bae6-6fdce2dc24db" containerID="1848579cb3ae52f7204c15281b7aa115c813e778cf2db883f38374cd706d1d90" exitCode=0 Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.698776 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" event={"ID":"9b314084-941d-4d00-bae6-6fdce2dc24db","Type":"ContainerDied","Data":"1848579cb3ae52f7204c15281b7aa115c813e778cf2db883f38374cd706d1d90"} Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.698802 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" event={"ID":"9b314084-941d-4d00-bae6-6fdce2dc24db","Type":"ContainerDied","Data":"4c88b4129ecc293ac1b2d7276595947520931a6f4a60127f8b69ca4b33cbd0eb"} Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.698817 4723 scope.go:117] "RemoveContainer" containerID="1848579cb3ae52f7204c15281b7aa115c813e778cf2db883f38374cd706d1d90" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.699017 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6578b64f7d-9cxnx" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.747435 4723 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.747462 4723 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b314084-941d-4d00-bae6-6fdce2dc24db-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.747486 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6578b64f7d-9cxnx"] Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.775828 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6578b64f7d-9cxnx"] Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.777534 4723 scope.go:117] "RemoveContainer" containerID="1848579cb3ae52f7204c15281b7aa115c813e778cf2db883f38374cd706d1d90" Mar 09 13:26:00 crc kubenswrapper[4723]: E0309 13:26:00.779700 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1848579cb3ae52f7204c15281b7aa115c813e778cf2db883f38374cd706d1d90\": container with ID starting with 1848579cb3ae52f7204c15281b7aa115c813e778cf2db883f38374cd706d1d90 not found: ID does not exist" containerID="1848579cb3ae52f7204c15281b7aa115c813e778cf2db883f38374cd706d1d90" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.779748 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1848579cb3ae52f7204c15281b7aa115c813e778cf2db883f38374cd706d1d90"} err="failed to get container status \"1848579cb3ae52f7204c15281b7aa115c813e778cf2db883f38374cd706d1d90\": rpc error: code = NotFound desc = could not find container \"1848579cb3ae52f7204c15281b7aa115c813e778cf2db883f38374cd706d1d90\": container with ID starting with 1848579cb3ae52f7204c15281b7aa115c813e778cf2db883f38374cd706d1d90 not found: ID does not exist" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.785455 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/heat-engine-cd9b85f6c-jhcds" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.904675 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b314084-941d-4d00-bae6-6fdce2dc24db" path="/var/lib/kubelet/pods/9b314084-941d-4d00-bae6-6fdce2dc24db/volumes" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.906607 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4841a92-8277-45f9-b366-8913a20ec8ad" path="/var/lib/kubelet/pods/a4841a92-8277-45f9-b366-8913a20ec8ad/volumes" Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.941845 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5d4f94b9d4-2l2jj"] Mar 09 13:26:00 crc kubenswrapper[4723]: I0309 13:26:00.942068 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-5d4f94b9d4-2l2jj" podUID="227cded8-49e9-4484-94a3-5ffebb8e4e47" containerName="heat-engine" containerID="cri-o://233f93a7e129cda557de9345227cb5c5f963726f0bf1ef8d1bfb537e3258e03d" gracePeriod=60 Mar 09 13:26:01 crc kubenswrapper[4723]: I0309 13:26:01.066049 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-t2lf4"] Mar 09 13:26:01 crc kubenswrapper[4723]: I0309 13:26:01.709966 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551046-t2lf4" event={"ID":"f1680395-afcc-4923-a2e8-dcdc08604cda","Type":"ContainerStarted","Data":"c5e997c460e3d267ef6f8c75faeced188739a48fbbf0132d7d77a119b88c9dc6"} Mar 09 13:26:03 crc kubenswrapper[4723]: I0309 13:26:03.470038 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 13:26:07 crc kubenswrapper[4723]: E0309 13:26:07.629959 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="233f93a7e129cda557de9345227cb5c5f963726f0bf1ef8d1bfb537e3258e03d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 13:26:07 crc kubenswrapper[4723]: E0309 13:26:07.631811 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="233f93a7e129cda557de9345227cb5c5f963726f0bf1ef8d1bfb537e3258e03d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 13:26:07 crc kubenswrapper[4723]: E0309 13:26:07.633963 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="233f93a7e129cda557de9345227cb5c5f963726f0bf1ef8d1bfb537e3258e03d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 13:26:07 crc kubenswrapper[4723]: E0309 13:26:07.634013 4723 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5d4f94b9d4-2l2jj" podUID="227cded8-49e9-4484-94a3-5ffebb8e4e47" containerName="heat-engine" Mar 09 13:26:07 crc kubenswrapper[4723]: I0309 13:26:07.733173 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="2e967475-660d-4ada-b409-bae77e4f6905" containerName="rabbitmq" probeResult="failure" output="dial tcp 
10.217.1.24:5671: connect: connection refused" Mar 09 13:26:09 crc kubenswrapper[4723]: I0309 13:26:09.475064 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:26:10 crc kubenswrapper[4723]: I0309 13:26:10.853668 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" podStartSLOduration=2.388944672 podStartE2EDuration="17.853647571s" podCreationTimestamp="2026-03-09 13:25:53 +0000 UTC" firstStartedPulling="2026-03-09 13:25:55.044295002 +0000 UTC m=+1629.058762542" lastFinishedPulling="2026-03-09 13:26:10.508997881 +0000 UTC m=+1644.523465441" observedRunningTime="2026-03-09 13:26:10.848363621 +0000 UTC m=+1644.862831171" watchObservedRunningTime="2026-03-09 13:26:10.853647571 +0000 UTC m=+1644.868115111" Mar 09 13:26:11 crc kubenswrapper[4723]: I0309 13:26:11.862350 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" event={"ID":"6a2886a0-218f-4284-aacd-19614f6f602f","Type":"ContainerStarted","Data":"6b2c40318896ed41f46c21b8b57f308866773934badaa5844f8f1fa10e0adcf6"} Mar 09 13:26:11 crc kubenswrapper[4723]: I0309 13:26:11.871596 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551046-t2lf4" event={"ID":"f1680395-afcc-4923-a2e8-dcdc08604cda","Type":"ContainerStarted","Data":"efccf5749aadb52cbc6bcfcf7ebba24a2aca927f5ea5da8a38aa95fafb98d8b8"} Mar 09 13:26:11 crc kubenswrapper[4723]: I0309 13:26:11.903164 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551046-t2lf4" podStartSLOduration=2.47054792 podStartE2EDuration="11.90314103s" podCreationTimestamp="2026-03-09 13:26:00 +0000 UTC" firstStartedPulling="2026-03-09 13:26:01.073436293 +0000 UTC m=+1635.087903833" lastFinishedPulling="2026-03-09 13:26:10.506029363 +0000 UTC m=+1644.520496943" observedRunningTime="2026-03-09 13:26:11.892485269 +0000 UTC m=+1645.906952829" watchObservedRunningTime="2026-03-09 13:26:11.90314103 +0000 UTC m=+1645.917608570" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.041187 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-5td4v"] Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.052245 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-5td4v"] Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.151968 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-js2tm"] Mar 09 13:26:12 crc kubenswrapper[4723]: E0309 13:26:12.152514 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b314084-941d-4d00-bae6-6fdce2dc24db" containerName="heat-cfnapi" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.152532 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b314084-941d-4d00-bae6-6fdce2dc24db" containerName="heat-cfnapi" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.152736 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b314084-941d-4d00-bae6-6fdce2dc24db" containerName="heat-cfnapi" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.153558 4723 util.go:30] "No sandbox for pod can be found. 
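
[annotation] Unlike the rabbitmq tracker entries earlier, the repo-setup entry above has real pull timestamps, and the relationship holds: podStartSLOduration is the end-to-end duration minus the image-pull window. End to end is watchObservedRunningTime minus creation, about 17.85 s; the pull ran from 13:25:55.044 to 13:26:10.509, about 15.46 s; and 17.85 - 15.46 gives the 2.388944672 s reported. Checked mechanically with the same hand-copied clock times:

    from datetime import datetime

    def t(hms):  # clock times hand-copied from the tracker entry, all on 2026-03-09
        return datetime.strptime("2026-03-09 " + hms, "%Y-%m-%d %H:%M:%S.%f")

    e2e  = t("13:26:10.853647") - t("13:25:53.000000")  # watchObservedRunningTime - creation
    pull = t("13:26:10.508997") - t("13:25:55.044295")  # lastFinishedPulling - firstStartedPulling
    print((e2e - pull).total_seconds())  # ~2.388945, matching podStartSLOduration
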
Need to start a new one" pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.185702 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.220620 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-js2tm"] Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.256062 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-config-data\") pod \"aodh-db-sync-js2tm\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.256157 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw545\" (UniqueName: \"kubernetes.io/projected/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-kube-api-access-jw545\") pod \"aodh-db-sync-js2tm\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.256219 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-combined-ca-bundle\") pod \"aodh-db-sync-js2tm\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.256269 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-scripts\") pod \"aodh-db-sync-js2tm\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.358219 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw545\" (UniqueName: \"kubernetes.io/projected/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-kube-api-access-jw545\") pod \"aodh-db-sync-js2tm\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.358321 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-combined-ca-bundle\") pod \"aodh-db-sync-js2tm\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.358389 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-scripts\") pod \"aodh-db-sync-js2tm\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.358495 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-config-data\") pod \"aodh-db-sync-js2tm\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.365604 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-scripts\") pod \"aodh-db-sync-js2tm\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.366832 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-combined-ca-bundle\") pod \"aodh-db-sync-js2tm\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.367293 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-config-data\") pod \"aodh-db-sync-js2tm\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.380088 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw545\" (UniqueName: \"kubernetes.io/projected/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-kube-api-access-jw545\") pod \"aodh-db-sync-js2tm\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.511830 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.897393 4723 generic.go:334] "Generic (PLEG): container finished" podID="f1680395-afcc-4923-a2e8-dcdc08604cda" containerID="efccf5749aadb52cbc6bcfcf7ebba24a2aca927f5ea5da8a38aa95fafb98d8b8" exitCode=0 Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.898255 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f0b3e5-b717-4ea6-ae5b-400876201699" path="/var/lib/kubelet/pods/f6f0b3e5-b717-4ea6-ae5b-400876201699/volumes" Mar 09 13:26:12 crc kubenswrapper[4723]: I0309 13:26:12.902111 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551046-t2lf4" event={"ID":"f1680395-afcc-4923-a2e8-dcdc08604cda","Type":"ContainerDied","Data":"efccf5749aadb52cbc6bcfcf7ebba24a2aca927f5ea5da8a38aa95fafb98d8b8"} Mar 09 13:26:13 crc kubenswrapper[4723]: I0309 13:26:13.042327 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-js2tm"] Mar 09 13:26:13 crc kubenswrapper[4723]: W0309 13:26:13.048654 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebd5cf0f_83bf_448a_9fe6_1ddfb4195a1d.slice/crio-710a78887dfce896cfbd129ebedc0ef58d43f236af22d3629bc38febe5c38a2f WatchSource:0}: Error finding container 710a78887dfce896cfbd129ebedc0ef58d43f236af22d3629bc38febe5c38a2f: Status 404 returned error can't find the container with id 710a78887dfce896cfbd129ebedc0ef58d43f236af22d3629bc38febe5c38a2f Mar 09 13:26:13 crc kubenswrapper[4723]: I0309 13:26:13.914414 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-js2tm" event={"ID":"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d","Type":"ContainerStarted","Data":"710a78887dfce896cfbd129ebedc0ef58d43f236af22d3629bc38febe5c38a2f"} Mar 09 13:26:14 crc kubenswrapper[4723]: I0309 13:26:14.418076 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-t2lf4" Mar 09 13:26:14 crc kubenswrapper[4723]: I0309 13:26:14.508146 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfq85\" (UniqueName: \"kubernetes.io/projected/f1680395-afcc-4923-a2e8-dcdc08604cda-kube-api-access-dfq85\") pod \"f1680395-afcc-4923-a2e8-dcdc08604cda\" (UID: \"f1680395-afcc-4923-a2e8-dcdc08604cda\") " Mar 09 13:26:14 crc kubenswrapper[4723]: I0309 13:26:14.514983 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1680395-afcc-4923-a2e8-dcdc08604cda-kube-api-access-dfq85" (OuterVolumeSpecName: "kube-api-access-dfq85") pod "f1680395-afcc-4923-a2e8-dcdc08604cda" (UID: "f1680395-afcc-4923-a2e8-dcdc08604cda"). InnerVolumeSpecName "kube-api-access-dfq85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:14 crc kubenswrapper[4723]: I0309 13:26:14.611502 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfq85\" (UniqueName: \"kubernetes.io/projected/f1680395-afcc-4923-a2e8-dcdc08604cda-kube-api-access-dfq85\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:14 crc kubenswrapper[4723]: I0309 13:26:14.935371 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551046-t2lf4" event={"ID":"f1680395-afcc-4923-a2e8-dcdc08604cda","Type":"ContainerDied","Data":"c5e997c460e3d267ef6f8c75faeced188739a48fbbf0132d7d77a119b88c9dc6"} Mar 09 13:26:14 crc kubenswrapper[4723]: I0309 13:26:14.935632 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5e997c460e3d267ef6f8c75faeced188739a48fbbf0132d7d77a119b88c9dc6" Mar 09 13:26:14 crc kubenswrapper[4723]: I0309 13:26:14.935695 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-t2lf4" Mar 09 13:26:14 crc kubenswrapper[4723]: I0309 13:26:14.961738 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551040-pqgfx"] Mar 09 13:26:14 crc kubenswrapper[4723]: I0309 13:26:14.974473 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551040-pqgfx"] Mar 09 13:26:16 crc kubenswrapper[4723]: I0309 13:26:16.906410 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8cfb1f4-2f08-4850-a398-679a25dacc26" path="/var/lib/kubelet/pods/f8cfb1f4-2f08-4850-a398-679a25dacc26/volumes" Mar 09 13:26:17 crc kubenswrapper[4723]: E0309 13:26:17.636949 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="233f93a7e129cda557de9345227cb5c5f963726f0bf1ef8d1bfb537e3258e03d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 13:26:17 crc kubenswrapper[4723]: E0309 13:26:17.638603 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="233f93a7e129cda557de9345227cb5c5f963726f0bf1ef8d1bfb537e3258e03d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 13:26:17 crc kubenswrapper[4723]: E0309 13:26:17.639960 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="233f93a7e129cda557de9345227cb5c5f963726f0bf1ef8d1bfb537e3258e03d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 13:26:17 crc kubenswrapper[4723]: E0309 13:26:17.640039 4723 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5d4f94b9d4-2l2jj" podUID="227cded8-49e9-4484-94a3-5ffebb8e4e47" containerName="heat-engine" Mar 09 13:26:17 crc kubenswrapper[4723]: I0309 13:26:17.733078 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 09 13:26:17 crc kubenswrapper[4723]: I0309 13:26:17.791295 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 13:26:18 crc kubenswrapper[4723]: I0309 13:26:18.599064 4723 scope.go:117] "RemoveContainer" containerID="8bdf8e2c56fdbc2f38f4e61046b7273b9601691fd104c700f981345420a2b2a8" Mar 09 13:26:18 crc kubenswrapper[4723]: I0309 13:26:18.871877 4723 scope.go:117] "RemoveContainer" containerID="8e11255da285ce3f57b0b4ccf342daccc1e3fb68582f73279f9605a2391b9e1d" Mar 09 13:26:19 crc kubenswrapper[4723]: I0309 13:26:19.030335 4723 generic.go:334] "Generic (PLEG): container finished" podID="227cded8-49e9-4484-94a3-5ffebb8e4e47" containerID="233f93a7e129cda557de9345227cb5c5f963726f0bf1ef8d1bfb537e3258e03d" exitCode=0 Mar 09 13:26:19 crc kubenswrapper[4723]: I0309 13:26:19.030376 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5d4f94b9d4-2l2jj" event={"ID":"227cded8-49e9-4484-94a3-5ffebb8e4e47","Type":"ContainerDied","Data":"233f93a7e129cda557de9345227cb5c5f963726f0bf1ef8d1bfb537e3258e03d"} Mar 09 13:26:19 crc kubenswrapper[4723]: I0309 13:26:19.967245 4723 
scope.go:117] "RemoveContainer" containerID="949cfd774a5ac85ffdc5516fd1299f2c3fc1e7abdb1a2335f187c11475bef008" Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.075269 4723 scope.go:117] "RemoveContainer" containerID="c2075174a9cd19a29e40fa0c0885414e2511911e74a51b19dd10ec38d93573c6" Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.306152 4723 scope.go:117] "RemoveContainer" containerID="2a8b8e6a52c0f4d91f7a644a265b41484dbc581fd88186f79df9170db828f221" Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.349275 4723 scope.go:117] "RemoveContainer" containerID="bc25c8c5930c5c5f007ae41650a217c3741dcf849edb5962c108af4745085d35" Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.470881 4723 scope.go:117] "RemoveContainer" containerID="da6b2137bd9781c109f3e16b89ef5f82274b5c5c675a63a1ce457a03997ee6e0" Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.497217 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.570060 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-combined-ca-bundle\") pod \"227cded8-49e9-4484-94a3-5ffebb8e4e47\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.570362 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-config-data-custom\") pod \"227cded8-49e9-4484-94a3-5ffebb8e4e47\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.570527 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-config-data\") pod \"227cded8-49e9-4484-94a3-5ffebb8e4e47\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.570734 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nwqw\" (UniqueName: \"kubernetes.io/projected/227cded8-49e9-4484-94a3-5ffebb8e4e47-kube-api-access-4nwqw\") pod \"227cded8-49e9-4484-94a3-5ffebb8e4e47\" (UID: \"227cded8-49e9-4484-94a3-5ffebb8e4e47\") " Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.577077 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "227cded8-49e9-4484-94a3-5ffebb8e4e47" (UID: "227cded8-49e9-4484-94a3-5ffebb8e4e47"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.593004 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/227cded8-49e9-4484-94a3-5ffebb8e4e47-kube-api-access-4nwqw" (OuterVolumeSpecName: "kube-api-access-4nwqw") pod "227cded8-49e9-4484-94a3-5ffebb8e4e47" (UID: "227cded8-49e9-4484-94a3-5ffebb8e4e47"). InnerVolumeSpecName "kube-api-access-4nwqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.653625 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "227cded8-49e9-4484-94a3-5ffebb8e4e47" (UID: "227cded8-49e9-4484-94a3-5ffebb8e4e47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.673445 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.673479 4723 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.673489 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nwqw\" (UniqueName: \"kubernetes.io/projected/227cded8-49e9-4484-94a3-5ffebb8e4e47-kube-api-access-4nwqw\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.687014 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-config-data" (OuterVolumeSpecName: "config-data") pod "227cded8-49e9-4484-94a3-5ffebb8e4e47" (UID: "227cded8-49e9-4484-94a3-5ffebb8e4e47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:20 crc kubenswrapper[4723]: I0309 13:26:20.774592 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/227cded8-49e9-4484-94a3-5ffebb8e4e47-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:21 crc kubenswrapper[4723]: I0309 13:26:21.094317 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-js2tm" event={"ID":"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d","Type":"ContainerStarted","Data":"57130dd725d7cc21c2a89d5bda87ab79dfee89089bd9e3287144c0588f8dd44b"} Mar 09 13:26:21 crc kubenswrapper[4723]: I0309 13:26:21.096145 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5d4f94b9d4-2l2jj" event={"ID":"227cded8-49e9-4484-94a3-5ffebb8e4e47","Type":"ContainerDied","Data":"3867f6ec87f105547d9166c7c33cb5290d7ae9336cdb4843437eb5a35264341a"} Mar 09 13:26:21 crc kubenswrapper[4723]: I0309 13:26:21.096200 4723 scope.go:117] "RemoveContainer" containerID="233f93a7e129cda557de9345227cb5c5f963726f0bf1ef8d1bfb537e3258e03d" Mar 09 13:26:21 crc kubenswrapper[4723]: I0309 13:26:21.096335 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5d4f94b9d4-2l2jj" Mar 09 13:26:21 crc kubenswrapper[4723]: I0309 13:26:21.118641 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-js2tm" podStartSLOduration=2.091087772 podStartE2EDuration="9.118619312s" podCreationTimestamp="2026-03-09 13:26:12 +0000 UTC" firstStartedPulling="2026-03-09 13:26:13.051706149 +0000 UTC m=+1647.066173689" lastFinishedPulling="2026-03-09 13:26:20.079237689 +0000 UTC m=+1654.093705229" observedRunningTime="2026-03-09 13:26:21.112532621 +0000 UTC m=+1655.127000171" watchObservedRunningTime="2026-03-09 13:26:21.118619312 +0000 UTC m=+1655.133086862" Mar 09 13:26:21 crc kubenswrapper[4723]: I0309 13:26:21.136939 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5d4f94b9d4-2l2jj"] Mar 09 13:26:21 crc kubenswrapper[4723]: I0309 13:26:21.147957 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-5d4f94b9d4-2l2jj"] Mar 09 13:26:22 crc kubenswrapper[4723]: I0309 13:26:22.113572 4723 generic.go:334] "Generic (PLEG): container finished" podID="6a2886a0-218f-4284-aacd-19614f6f602f" containerID="6b2c40318896ed41f46c21b8b57f308866773934badaa5844f8f1fa10e0adcf6" exitCode=0 Mar 09 13:26:22 crc kubenswrapper[4723]: I0309 13:26:22.113845 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" event={"ID":"6a2886a0-218f-4284-aacd-19614f6f602f","Type":"ContainerDied","Data":"6b2c40318896ed41f46c21b8b57f308866773934badaa5844f8f1fa10e0adcf6"} Mar 09 13:26:22 crc kubenswrapper[4723]: I0309 13:26:22.908570 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="227cded8-49e9-4484-94a3-5ffebb8e4e47" path="/var/lib/kubelet/pods/227cded8-49e9-4484-94a3-5ffebb8e4e47/volumes" Mar 09 13:26:23 crc kubenswrapper[4723]: I0309 13:26:23.215424 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-1" podUID="daa528e2-bcd7-43a8-bfea-a0911b3020c5" containerName="rabbitmq" containerID="cri-o://f12e964565b682de4c859455f62e9203db365e9420b552bf6ee54ba492d5bdee" gracePeriod=604795 Mar 09 13:26:23 crc kubenswrapper[4723]: I0309 13:26:23.722051 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:26:23 crc kubenswrapper[4723]: I0309 13:26:23.847632 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-ssh-key-openstack-edpm-ipam\") pod \"6a2886a0-218f-4284-aacd-19614f6f602f\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " Mar 09 13:26:23 crc kubenswrapper[4723]: I0309 13:26:23.847695 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-repo-setup-combined-ca-bundle\") pod \"6a2886a0-218f-4284-aacd-19614f6f602f\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " Mar 09 13:26:23 crc kubenswrapper[4723]: I0309 13:26:23.847977 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2kds\" (UniqueName: \"kubernetes.io/projected/6a2886a0-218f-4284-aacd-19614f6f602f-kube-api-access-n2kds\") pod \"6a2886a0-218f-4284-aacd-19614f6f602f\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " Mar 09 13:26:23 crc kubenswrapper[4723]: I0309 13:26:23.848115 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-inventory\") pod \"6a2886a0-218f-4284-aacd-19614f6f602f\" (UID: \"6a2886a0-218f-4284-aacd-19614f6f602f\") " Mar 09 13:26:23 crc kubenswrapper[4723]: I0309 13:26:23.853350 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6a2886a0-218f-4284-aacd-19614f6f602f" (UID: "6a2886a0-218f-4284-aacd-19614f6f602f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4723]: I0309 13:26:23.854051 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a2886a0-218f-4284-aacd-19614f6f602f-kube-api-access-n2kds" (OuterVolumeSpecName: "kube-api-access-n2kds") pod "6a2886a0-218f-4284-aacd-19614f6f602f" (UID: "6a2886a0-218f-4284-aacd-19614f6f602f"). InnerVolumeSpecName "kube-api-access-n2kds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4723]: I0309 13:26:23.888381 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-inventory" (OuterVolumeSpecName: "inventory") pod "6a2886a0-218f-4284-aacd-19614f6f602f" (UID: "6a2886a0-218f-4284-aacd-19614f6f602f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4723]: I0309 13:26:23.894683 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6a2886a0-218f-4284-aacd-19614f6f602f" (UID: "6a2886a0-218f-4284-aacd-19614f6f602f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4723]: I0309 13:26:23.951317 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4723]: I0309 13:26:23.951353 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4723]: I0309 13:26:23.951363 4723 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2886a0-218f-4284-aacd-19614f6f602f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:23 crc kubenswrapper[4723]: I0309 13:26:23.951372 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2kds\" (UniqueName: \"kubernetes.io/projected/6a2886a0-218f-4284-aacd-19614f6f602f-kube-api-access-n2kds\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.140952 4723 generic.go:334] "Generic (PLEG): container finished" podID="ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d" containerID="57130dd725d7cc21c2a89d5bda87ab79dfee89089bd9e3287144c0588f8dd44b" exitCode=0 Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.141691 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-js2tm" event={"ID":"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d","Type":"ContainerDied","Data":"57130dd725d7cc21c2a89d5bda87ab79dfee89089bd9e3287144c0588f8dd44b"} Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.143549 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" event={"ID":"6a2886a0-218f-4284-aacd-19614f6f602f","Type":"ContainerDied","Data":"68c2b6eeea7cd07d1618e16eef1808b774b90c7b0408be0e86193885c05defa1"} Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.143588 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68c2b6eeea7cd07d1618e16eef1808b774b90c7b0408be0e86193885c05defa1" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.143609 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.237928 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6"] Mar 09 13:26:24 crc kubenswrapper[4723]: E0309 13:26:24.238548 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2886a0-218f-4284-aacd-19614f6f602f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.238567 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2886a0-218f-4284-aacd-19614f6f602f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:26:24 crc kubenswrapper[4723]: E0309 13:26:24.238602 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="227cded8-49e9-4484-94a3-5ffebb8e4e47" containerName="heat-engine" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.238609 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="227cded8-49e9-4484-94a3-5ffebb8e4e47" containerName="heat-engine" Mar 09 13:26:24 crc kubenswrapper[4723]: E0309 13:26:24.238635 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1680395-afcc-4923-a2e8-dcdc08604cda" containerName="oc" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.238643 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1680395-afcc-4923-a2e8-dcdc08604cda" containerName="oc" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.238906 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="227cded8-49e9-4484-94a3-5ffebb8e4e47" containerName="heat-engine" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.238927 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1680395-afcc-4923-a2e8-dcdc08604cda" containerName="oc" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.238963 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a2886a0-218f-4284-aacd-19614f6f602f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.239900 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.244544 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.245273 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.245451 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.246912 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.253644 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6"] Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.359971 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c97cec8e-5cb2-455b-8b57-8179ced146c3-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fcdn6\" (UID: \"c97cec8e-5cb2-455b-8b57-8179ced146c3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.360231 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhv4t\" (UniqueName: \"kubernetes.io/projected/c97cec8e-5cb2-455b-8b57-8179ced146c3-kube-api-access-mhv4t\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fcdn6\" (UID: \"c97cec8e-5cb2-455b-8b57-8179ced146c3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.360289 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c97cec8e-5cb2-455b-8b57-8179ced146c3-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fcdn6\" (UID: \"c97cec8e-5cb2-455b-8b57-8179ced146c3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.462248 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c97cec8e-5cb2-455b-8b57-8179ced146c3-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fcdn6\" (UID: \"c97cec8e-5cb2-455b-8b57-8179ced146c3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.462434 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhv4t\" (UniqueName: \"kubernetes.io/projected/c97cec8e-5cb2-455b-8b57-8179ced146c3-kube-api-access-mhv4t\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fcdn6\" (UID: \"c97cec8e-5cb2-455b-8b57-8179ced146c3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.462493 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c97cec8e-5cb2-455b-8b57-8179ced146c3-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-fcdn6\" (UID: \"c97cec8e-5cb2-455b-8b57-8179ced146c3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.466483 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c97cec8e-5cb2-455b-8b57-8179ced146c3-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fcdn6\" (UID: \"c97cec8e-5cb2-455b-8b57-8179ced146c3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.466735 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c97cec8e-5cb2-455b-8b57-8179ced146c3-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fcdn6\" (UID: \"c97cec8e-5cb2-455b-8b57-8179ced146c3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.479303 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhv4t\" (UniqueName: \"kubernetes.io/projected/c97cec8e-5cb2-455b-8b57-8179ced146c3-kube-api-access-mhv4t\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fcdn6\" (UID: \"c97cec8e-5cb2-455b-8b57-8179ced146c3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" Mar 09 13:26:24 crc kubenswrapper[4723]: I0309 13:26:24.557267 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.112248 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6"] Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.155437 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" event={"ID":"c97cec8e-5cb2-455b-8b57-8179ced146c3","Type":"ContainerStarted","Data":"b3283be0294e864fcee647380b74d0ee0d9b66d25e08ea7060832a6f36dbf3a4"} Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.746356 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.799249 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-config-data\") pod \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.799499 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-scripts\") pod \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.799534 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw545\" (UniqueName: \"kubernetes.io/projected/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-kube-api-access-jw545\") pod \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.799589 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-combined-ca-bundle\") pod \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\" (UID: \"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d\") " Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.807610 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-scripts" (OuterVolumeSpecName: "scripts") pod "ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d" (UID: "ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.807759 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-kube-api-access-jw545" (OuterVolumeSpecName: "kube-api-access-jw545") pod "ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d" (UID: "ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d"). InnerVolumeSpecName "kube-api-access-jw545". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.846387 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d" (UID: "ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.848087 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-config-data" (OuterVolumeSpecName: "config-data") pod "ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d" (UID: "ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.905123 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.905385 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.905555 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:25 crc kubenswrapper[4723]: I0309 13:26:25.905577 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw545\" (UniqueName: \"kubernetes.io/projected/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d-kube-api-access-jw545\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:26 crc kubenswrapper[4723]: I0309 13:26:26.168590 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" event={"ID":"c97cec8e-5cb2-455b-8b57-8179ced146c3","Type":"ContainerStarted","Data":"312079012e8257ade1d0f49ec8ce9610ee7d52c3793feccc389eeea9a72e122e"} Mar 09 13:26:26 crc kubenswrapper[4723]: I0309 13:26:26.170437 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-js2tm" event={"ID":"ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d","Type":"ContainerDied","Data":"710a78887dfce896cfbd129ebedc0ef58d43f236af22d3629bc38febe5c38a2f"} Mar 09 13:26:26 crc kubenswrapper[4723]: I0309 13:26:26.170567 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="710a78887dfce896cfbd129ebedc0ef58d43f236af22d3629bc38febe5c38a2f" Mar 09 13:26:26 crc kubenswrapper[4723]: I0309 13:26:26.170477 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-js2tm" Mar 09 13:26:26 crc kubenswrapper[4723]: I0309 13:26:26.204607 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" podStartSLOduration=1.73314113 podStartE2EDuration="2.204583451s" podCreationTimestamp="2026-03-09 13:26:24 +0000 UTC" firstStartedPulling="2026-03-09 13:26:25.112353822 +0000 UTC m=+1659.126821382" lastFinishedPulling="2026-03-09 13:26:25.583796163 +0000 UTC m=+1659.598263703" observedRunningTime="2026-03-09 13:26:26.192201623 +0000 UTC m=+1660.206669183" watchObservedRunningTime="2026-03-09 13:26:26.204583451 +0000 UTC m=+1660.219050991" Mar 09 13:26:27 crc kubenswrapper[4723]: I0309 13:26:27.128617 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 09 13:26:27 crc kubenswrapper[4723]: I0309 13:26:27.129358 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-api" containerID="cri-o://2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0" gracePeriod=30 Mar 09 13:26:27 crc kubenswrapper[4723]: I0309 13:26:27.129454 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-listener" containerID="cri-o://614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c" gracePeriod=30 Mar 09 13:26:27 crc kubenswrapper[4723]: I0309 13:26:27.129565 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-evaluator" containerID="cri-o://78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d" gracePeriod=30 Mar 09 13:26:27 crc kubenswrapper[4723]: I0309 13:26:27.129623 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-notifier" containerID="cri-o://1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee" gracePeriod=30 Mar 09 13:26:28 crc kubenswrapper[4723]: I0309 13:26:28.194285 4723 generic.go:334] "Generic (PLEG): container finished" podID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerID="78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d" exitCode=0 Mar 09 13:26:28 crc kubenswrapper[4723]: I0309 13:26:28.194570 4723 generic.go:334] "Generic (PLEG): container finished" podID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerID="2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0" exitCode=0 Mar 09 13:26:28 crc kubenswrapper[4723]: I0309 13:26:28.194382 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"449e6144-ad49-44a8-ad79-809de89fa5c6","Type":"ContainerDied","Data":"78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d"} Mar 09 13:26:28 crc kubenswrapper[4723]: I0309 13:26:28.194613 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"449e6144-ad49-44a8-ad79-809de89fa5c6","Type":"ContainerDied","Data":"2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0"} Mar 09 13:26:29 crc kubenswrapper[4723]: I0309 13:26:29.208846 4723 generic.go:334] "Generic (PLEG): container finished" podID="c97cec8e-5cb2-455b-8b57-8179ced146c3" containerID="312079012e8257ade1d0f49ec8ce9610ee7d52c3793feccc389eeea9a72e122e" exitCode=0 Mar 09 
13:26:29 crc kubenswrapper[4723]: I0309 13:26:29.209032 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" event={"ID":"c97cec8e-5cb2-455b-8b57-8179ced146c3","Type":"ContainerDied","Data":"312079012e8257ade1d0f49ec8ce9610ee7d52c3793feccc389eeea9a72e122e"} Mar 09 13:26:29 crc kubenswrapper[4723]: I0309 13:26:29.307577 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="daa528e2-bcd7-43a8-bfea-a0911b3020c5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 09 13:26:29 crc kubenswrapper[4723]: I0309 13:26:29.886224 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.016851 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-config-data\") pod \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.016944 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-plugins-conf\") pod \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.017016 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tjrf\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-kube-api-access-6tjrf\") pod \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.017109 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-server-conf\") pod \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.017158 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-tls\") pod \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.017223 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-erlang-cookie\") pod \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.017288 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-plugins\") pod \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.018707 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-plugins-conf" (OuterVolumeSpecName: 
"plugins-conf") pod "daa528e2-bcd7-43a8-bfea-a0911b3020c5" (UID: "daa528e2-bcd7-43a8-bfea-a0911b3020c5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.019390 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\") pod \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.019468 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/daa528e2-bcd7-43a8-bfea-a0911b3020c5-erlang-cookie-secret\") pod \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.019511 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-confd\") pod \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.019551 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/daa528e2-bcd7-43a8-bfea-a0911b3020c5-pod-info\") pod \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\" (UID: \"daa528e2-bcd7-43a8-bfea-a0911b3020c5\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.019695 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "daa528e2-bcd7-43a8-bfea-a0911b3020c5" (UID: "daa528e2-bcd7-43a8-bfea-a0911b3020c5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.020331 4723 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.020353 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.020994 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "daa528e2-bcd7-43a8-bfea-a0911b3020c5" (UID: "daa528e2-bcd7-43a8-bfea-a0911b3020c5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.026402 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "daa528e2-bcd7-43a8-bfea-a0911b3020c5" (UID: "daa528e2-bcd7-43a8-bfea-a0911b3020c5"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.043528 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/daa528e2-bcd7-43a8-bfea-a0911b3020c5-pod-info" (OuterVolumeSpecName: "pod-info") pod "daa528e2-bcd7-43a8-bfea-a0911b3020c5" (UID: "daa528e2-bcd7-43a8-bfea-a0911b3020c5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.044085 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-kube-api-access-6tjrf" (OuterVolumeSpecName: "kube-api-access-6tjrf") pod "daa528e2-bcd7-43a8-bfea-a0911b3020c5" (UID: "daa528e2-bcd7-43a8-bfea-a0911b3020c5"). InnerVolumeSpecName "kube-api-access-6tjrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.053212 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.057168 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daa528e2-bcd7-43a8-bfea-a0911b3020c5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "daa528e2-bcd7-43a8-bfea-a0911b3020c5" (UID: "daa528e2-bcd7-43a8-bfea-a0911b3020c5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.060694 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95" (OuterVolumeSpecName: "persistence") pod "daa528e2-bcd7-43a8-bfea-a0911b3020c5" (UID: "daa528e2-bcd7-43a8-bfea-a0911b3020c5"). InnerVolumeSpecName "pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.086470 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-config-data" (OuterVolumeSpecName: "config-data") pod "daa528e2-bcd7-43a8-bfea-a0911b3020c5" (UID: "daa528e2-bcd7-43a8-bfea-a0911b3020c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.137507 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lp4j\" (UniqueName: \"kubernetes.io/projected/449e6144-ad49-44a8-ad79-809de89fa5c6-kube-api-access-7lp4j\") pod \"449e6144-ad49-44a8-ad79-809de89fa5c6\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.137595 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-public-tls-certs\") pod \"449e6144-ad49-44a8-ad79-809de89fa5c6\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.137619 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-scripts\") pod \"449e6144-ad49-44a8-ad79-809de89fa5c6\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.137640 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-combined-ca-bundle\") pod \"449e6144-ad49-44a8-ad79-809de89fa5c6\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.137655 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-config-data\") pod \"449e6144-ad49-44a8-ad79-809de89fa5c6\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.138048 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.138077 4723 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\") on node \"crc\" " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.138090 4723 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/daa528e2-bcd7-43a8-bfea-a0911b3020c5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.138103 4723 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/daa528e2-bcd7-43a8-bfea-a0911b3020c5-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.138114 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.138123 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tjrf\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-kube-api-access-6tjrf\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 
13:26:30.138134 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.138660 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-server-conf" (OuterVolumeSpecName: "server-conf") pod "daa528e2-bcd7-43a8-bfea-a0911b3020c5" (UID: "daa528e2-bcd7-43a8-bfea-a0911b3020c5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.146195 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-scripts" (OuterVolumeSpecName: "scripts") pod "449e6144-ad49-44a8-ad79-809de89fa5c6" (UID: "449e6144-ad49-44a8-ad79-809de89fa5c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.146280 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/449e6144-ad49-44a8-ad79-809de89fa5c6-kube-api-access-7lp4j" (OuterVolumeSpecName: "kube-api-access-7lp4j") pod "449e6144-ad49-44a8-ad79-809de89fa5c6" (UID: "449e6144-ad49-44a8-ad79-809de89fa5c6"). InnerVolumeSpecName "kube-api-access-7lp4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.191501 4723 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.191820 4723 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95") on node "crc" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.239972 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "449e6144-ad49-44a8-ad79-809de89fa5c6" (UID: "449e6144-ad49-44a8-ad79-809de89fa5c6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.243251 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-internal-tls-certs\") pod \"449e6144-ad49-44a8-ad79-809de89fa5c6\" (UID: \"449e6144-ad49-44a8-ad79-809de89fa5c6\") " Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.243480 4723 generic.go:334] "Generic (PLEG): container finished" podID="daa528e2-bcd7-43a8-bfea-a0911b3020c5" containerID="f12e964565b682de4c859455f62e9203db365e9420b552bf6ee54ba492d5bdee" exitCode=0 Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.243588 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"daa528e2-bcd7-43a8-bfea-a0911b3020c5","Type":"ContainerDied","Data":"f12e964565b682de4c859455f62e9203db365e9420b552bf6ee54ba492d5bdee"} Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.243624 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"daa528e2-bcd7-43a8-bfea-a0911b3020c5","Type":"ContainerDied","Data":"8abe0dab6a7f482a198575018a6b66071882a893761c80b9a06949d7ef573b27"} Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.243692 4723 scope.go:117] "RemoveContainer" containerID="f12e964565b682de4c859455f62e9203db365e9420b552bf6ee54ba492d5bdee" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.243968 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.257020 4723 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/daa528e2-bcd7-43a8-bfea-a0911b3020c5-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.259234 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lp4j\" (UniqueName: \"kubernetes.io/projected/449e6144-ad49-44a8-ad79-809de89fa5c6-kube-api-access-7lp4j\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.261141 4723 reconciler_common.go:293] "Volume detached for volume \"pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.261279 4723 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.261449 4723 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.260603 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.260465 4723 generic.go:334] "Generic (PLEG): container finished" podID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerID="614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c" exitCode=0 Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.262694 4723 generic.go:334] "Generic (PLEG): container finished" podID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerID="1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee" exitCode=0 Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.260491 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"449e6144-ad49-44a8-ad79-809de89fa5c6","Type":"ContainerDied","Data":"614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c"} Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.263722 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"449e6144-ad49-44a8-ad79-809de89fa5c6","Type":"ContainerDied","Data":"1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee"} Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.263762 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"449e6144-ad49-44a8-ad79-809de89fa5c6","Type":"ContainerDied","Data":"cdd137e6d9c30f707f9511233d98b502ae4971ba7f018e76318e524c8bf7e3f0"} Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.328547 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "449e6144-ad49-44a8-ad79-809de89fa5c6" (UID: "449e6144-ad49-44a8-ad79-809de89fa5c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.356928 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "449e6144-ad49-44a8-ad79-809de89fa5c6" (UID: "449e6144-ad49-44a8-ad79-809de89fa5c6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.363737 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:30 crc kubenswrapper[4723]: I0309 13:26:30.363786 4723 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.051104 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "daa528e2-bcd7-43a8-bfea-a0911b3020c5" (UID: "daa528e2-bcd7-43a8-bfea-a0911b3020c5"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.096374 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/daa528e2-bcd7-43a8-bfea-a0911b3020c5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.109376 4723 scope.go:117] "RemoveContainer" containerID="8ef261746c675013eb62f34eb677fb0207d09a06464e1471d9a77690a6583b55" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.193176 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-config-data" (OuterVolumeSpecName: "config-data") pod "449e6144-ad49-44a8-ad79-809de89fa5c6" (UID: "449e6144-ad49-44a8-ad79-809de89fa5c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.199249 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449e6144-ad49-44a8-ad79-809de89fa5c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.324316 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.334059 4723 scope.go:117] "RemoveContainer" containerID="f12e964565b682de4c859455f62e9203db365e9420b552bf6ee54ba492d5bdee" Mar 09 13:26:31 crc kubenswrapper[4723]: E0309 13:26:31.337299 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f12e964565b682de4c859455f62e9203db365e9420b552bf6ee54ba492d5bdee\": container with ID starting with f12e964565b682de4c859455f62e9203db365e9420b552bf6ee54ba492d5bdee not found: ID does not exist" containerID="f12e964565b682de4c859455f62e9203db365e9420b552bf6ee54ba492d5bdee" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.337339 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f12e964565b682de4c859455f62e9203db365e9420b552bf6ee54ba492d5bdee"} err="failed to get container status \"f12e964565b682de4c859455f62e9203db365e9420b552bf6ee54ba492d5bdee\": rpc error: code = NotFound desc = could not find container \"f12e964565b682de4c859455f62e9203db365e9420b552bf6ee54ba492d5bdee\": container with ID starting with f12e964565b682de4c859455f62e9203db365e9420b552bf6ee54ba492d5bdee not found: ID does not exist" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.337364 4723 scope.go:117] "RemoveContainer" containerID="8ef261746c675013eb62f34eb677fb0207d09a06464e1471d9a77690a6583b55" Mar 09 13:26:31 crc kubenswrapper[4723]: E0309 13:26:31.337435 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef261746c675013eb62f34eb677fb0207d09a06464e1471d9a77690a6583b55\": container with ID starting with 8ef261746c675013eb62f34eb677fb0207d09a06464e1471d9a77690a6583b55 not found: ID does not exist" containerID="8ef261746c675013eb62f34eb677fb0207d09a06464e1471d9a77690a6583b55" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.371031 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef261746c675013eb62f34eb677fb0207d09a06464e1471d9a77690a6583b55"} err="failed to get container status 
\"8ef261746c675013eb62f34eb677fb0207d09a06464e1471d9a77690a6583b55\": rpc error: code = NotFound desc = could not find container \"8ef261746c675013eb62f34eb677fb0207d09a06464e1471d9a77690a6583b55\": container with ID starting with 8ef261746c675013eb62f34eb677fb0207d09a06464e1471d9a77690a6583b55 not found: ID does not exist" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.371063 4723 scope.go:117] "RemoveContainer" containerID="614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.395121 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.421778 4723 scope.go:117] "RemoveContainer" containerID="1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.423262 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.442928 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 13:26:31 crc kubenswrapper[4723]: E0309 13:26:31.443453 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa528e2-bcd7-43a8-bfea-a0911b3020c5" containerName="rabbitmq" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443473 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa528e2-bcd7-43a8-bfea-a0911b3020c5" containerName="rabbitmq" Mar 09 13:26:31 crc kubenswrapper[4723]: E0309 13:26:31.443491 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa528e2-bcd7-43a8-bfea-a0911b3020c5" containerName="setup-container" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443501 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa528e2-bcd7-43a8-bfea-a0911b3020c5" containerName="setup-container" Mar 09 13:26:31 crc kubenswrapper[4723]: E0309 13:26:31.443516 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-listener" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443524 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-listener" Mar 09 13:26:31 crc kubenswrapper[4723]: E0309 13:26:31.443542 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-evaluator" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443549 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-evaluator" Mar 09 13:26:31 crc kubenswrapper[4723]: E0309 13:26:31.443562 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d" containerName="aodh-db-sync" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443569 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d" containerName="aodh-db-sync" Mar 09 13:26:31 crc kubenswrapper[4723]: E0309 13:26:31.443588 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-notifier" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443595 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-notifier" Mar 
09 13:26:31 crc kubenswrapper[4723]: E0309 13:26:31.443617 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-api" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443624 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-api" Mar 09 13:26:31 crc kubenswrapper[4723]: E0309 13:26:31.443644 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c97cec8e-5cb2-455b-8b57-8179ced146c3" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443653 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97cec8e-5cb2-455b-8b57-8179ced146c3" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443882 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="c97cec8e-5cb2-455b-8b57-8179ced146c3" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443904 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-listener" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443918 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-notifier" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443933 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d" containerName="aodh-db-sync" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443942 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="daa528e2-bcd7-43a8-bfea-a0911b3020c5" containerName="rabbitmq" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443947 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-api" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.443955 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" containerName="aodh-evaluator" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.445138 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.489980 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.504677 4723 scope.go:117] "RemoveContainer" containerID="78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.613298 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhv4t\" (UniqueName: \"kubernetes.io/projected/c97cec8e-5cb2-455b-8b57-8179ced146c3-kube-api-access-mhv4t\") pod \"c97cec8e-5cb2-455b-8b57-8179ced146c3\" (UID: \"c97cec8e-5cb2-455b-8b57-8179ced146c3\") " Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.613785 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c97cec8e-5cb2-455b-8b57-8179ced146c3-inventory\") pod \"c97cec8e-5cb2-455b-8b57-8179ced146c3\" (UID: \"c97cec8e-5cb2-455b-8b57-8179ced146c3\") " Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.613905 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c97cec8e-5cb2-455b-8b57-8179ced146c3-ssh-key-openstack-edpm-ipam\") pod \"c97cec8e-5cb2-455b-8b57-8179ced146c3\" (UID: \"c97cec8e-5cb2-455b-8b57-8179ced146c3\") " Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.615394 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-config-data\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.615457 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-server-conf\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.615555 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.615626 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.615649 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.615735 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.615932 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.616024 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jdw6\" (UniqueName: \"kubernetes.io/projected/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-kube-api-access-2jdw6\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.616096 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.616120 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-pod-info\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.616216 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.619091 4723 scope.go:117] "RemoveContainer" containerID="2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.641373 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.642134 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c97cec8e-5cb2-455b-8b57-8179ced146c3-kube-api-access-mhv4t" (OuterVolumeSpecName: "kube-api-access-mhv4t") pod "c97cec8e-5cb2-455b-8b57-8179ced146c3" (UID: "c97cec8e-5cb2-455b-8b57-8179ced146c3"). InnerVolumeSpecName "kube-api-access-mhv4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.675889 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.710880 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c97cec8e-5cb2-455b-8b57-8179ced146c3-inventory" (OuterVolumeSpecName: "inventory") pod "c97cec8e-5cb2-455b-8b57-8179ced146c3" (UID: "c97cec8e-5cb2-455b-8b57-8179ced146c3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.718434 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-server-conf\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.718504 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.718540 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.718563 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.718598 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.718661 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.718704 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jdw6\" (UniqueName: \"kubernetes.io/projected/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-kube-api-access-2jdw6\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.718738 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.718755 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-pod-info\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.718798 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.718832 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-config-data\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.718916 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhv4t\" (UniqueName: \"kubernetes.io/projected/c97cec8e-5cb2-455b-8b57-8179ced146c3-kube-api-access-mhv4t\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.718929 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c97cec8e-5cb2-455b-8b57-8179ced146c3-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.719697 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-config-data\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.720622 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.720879 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-server-conf\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.720996 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.721238 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.723288 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.724384 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.725128 4723 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.730067 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.732584 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.732775 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.732910 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.733990 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.734301 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-pod-info\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.734730 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-97c45" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.735387 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.736025 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.737019 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.737058 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b95d2f7d3a9f884abc2cc747032dbde014948f5683a3dc24108da2959777a268/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.738780 4723 scope.go:117] "RemoveContainer" containerID="614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.750675 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jdw6\" (UniqueName: \"kubernetes.io/projected/447c3a5d-33c3-40ad-a5e4-dce2a0533b8f-kube-api-access-2jdw6\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: E0309 13:26:31.750782 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c\": container with ID starting with 614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c not found: ID does not exist" containerID="614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.750820 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c"} err="failed to get container status \"614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c\": rpc error: code = NotFound desc = could not find container \"614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c\": container with ID starting with 614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c not found: ID does not exist" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.750884 4723 scope.go:117] "RemoveContainer" containerID="1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee" Mar 09 13:26:31 crc kubenswrapper[4723]: E0309 13:26:31.751409 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee\": container with ID starting with 1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee not found: ID does not exist" containerID="1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.751455 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee"} err="failed to get container status \"1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee\": rpc error: code = NotFound desc = could not find container \"1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee\": container with ID starting with 1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee not found: ID does not exist" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.751481 4723 scope.go:117] "RemoveContainer" 
containerID="78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d" Mar 09 13:26:31 crc kubenswrapper[4723]: E0309 13:26:31.752170 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d\": container with ID starting with 78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d not found: ID does not exist" containerID="78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.752198 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d"} err="failed to get container status \"78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d\": rpc error: code = NotFound desc = could not find container \"78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d\": container with ID starting with 78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d not found: ID does not exist" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.752213 4723 scope.go:117] "RemoveContainer" containerID="2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0" Mar 09 13:26:31 crc kubenswrapper[4723]: E0309 13:26:31.752444 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0\": container with ID starting with 2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0 not found: ID does not exist" containerID="2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.752468 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0"} err="failed to get container status \"2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0\": rpc error: code = NotFound desc = could not find container \"2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0\": container with ID starting with 2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0 not found: ID does not exist" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.752483 4723 scope.go:117] "RemoveContainer" containerID="614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.753637 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c"} err="failed to get container status \"614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c\": rpc error: code = NotFound desc = could not find container \"614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c\": container with ID starting with 614f6f3f3d2a3090f8d68732d63bcf45362a34004d6e5414bf4083a904fe172c not found: ID does not exist" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.753658 4723 scope.go:117] "RemoveContainer" containerID="1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.753871 4723 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee"} err="failed to get container status \"1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee\": rpc error: code = NotFound desc = could not find container \"1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee\": container with ID starting with 1e523dfd4062df573b2177b8e2288353fdf30bdfcdddeb4ec818d336c2c949ee not found: ID does not exist" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.753892 4723 scope.go:117] "RemoveContainer" containerID="78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.754102 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d"} err="failed to get container status \"78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d\": rpc error: code = NotFound desc = could not find container \"78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d\": container with ID starting with 78281b4a2b76e933fde883105a715006fbc28f0156e8152af1b38dc6f7ebdd6d not found: ID does not exist" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.754123 4723 scope.go:117] "RemoveContainer" containerID="2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.754288 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0"} err="failed to get container status \"2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0\": rpc error: code = NotFound desc = could not find container \"2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0\": container with ID starting with 2a43ed24c53c90ff73e9513c57760e48ad17450200a520872d6c47cb4cc380e0 not found: ID does not exist" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.759663 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c97cec8e-5cb2-455b-8b57-8179ced146c3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c97cec8e-5cb2-455b-8b57-8179ced146c3" (UID: "c97cec8e-5cb2-455b-8b57-8179ced146c3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.820459 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-combined-ca-bundle\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.820528 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-public-tls-certs\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.820662 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz7ws\" (UniqueName: \"kubernetes.io/projected/abe6e5e7-570d-40e4-995c-2921d3bd93ca-kube-api-access-nz7ws\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.820822 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-internal-tls-certs\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.820893 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-config-data\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.820959 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-scripts\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.821061 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c97cec8e-5cb2-455b-8b57-8179ced146c3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.859199 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1277af1d-fdb9-417d-b33b-f6b7b0543c95\") pod \"rabbitmq-server-1\" (UID: \"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f\") " pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.907510 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.922538 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-combined-ca-bundle\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.922574 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-public-tls-certs\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.923200 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz7ws\" (UniqueName: \"kubernetes.io/projected/abe6e5e7-570d-40e4-995c-2921d3bd93ca-kube-api-access-nz7ws\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.923333 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-internal-tls-certs\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.923365 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-config-data\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.923422 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-scripts\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.926174 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-combined-ca-bundle\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.927127 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-scripts\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.929657 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-public-tls-certs\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.930009 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-internal-tls-certs\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.931145 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abe6e5e7-570d-40e4-995c-2921d3bd93ca-config-data\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:31 crc kubenswrapper[4723]: I0309 13:26:31.939489 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz7ws\" (UniqueName: \"kubernetes.io/projected/abe6e5e7-570d-40e4-995c-2921d3bd93ca-kube-api-access-nz7ws\") pod \"aodh-0\" (UID: \"abe6e5e7-570d-40e4-995c-2921d3bd93ca\") " pod="openstack/aodh-0" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.193484 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.369268 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.371762 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fcdn6" event={"ID":"c97cec8e-5cb2-455b-8b57-8179ced146c3","Type":"ContainerDied","Data":"b3283be0294e864fcee647380b74d0ee0d9b66d25e08ea7060832a6f36dbf3a4"} Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.373080 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3283be0294e864fcee647380b74d0ee0d9b66d25e08ea7060832a6f36dbf3a4" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.445438 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.541700 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr"] Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.543339 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.551758 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.552051 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.552101 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.552333 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.559903 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr"] Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.647153 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.647244 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.647288 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.647329 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sdfd\" (UniqueName: \"kubernetes.io/projected/421e85c4-9862-4601-ab03-d2b602ba68f4-kube-api-access-4sdfd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" Mar 09 13:26:32 crc kubenswrapper[4723]: W0309 13:26:32.736643 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabe6e5e7_570d_40e4_995c_2921d3bd93ca.slice/crio-134d17e0b1ae8214eb148a0192b7c3289316c8064d57a36a46d59828c3f804dc WatchSource:0}: Error finding container 134d17e0b1ae8214eb148a0192b7c3289316c8064d57a36a46d59828c3f804dc: Status 404 returned error can't find the container with id 134d17e0b1ae8214eb148a0192b7c3289316c8064d57a36a46d59828c3f804dc Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.739297 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 09 
13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.749694 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sdfd\" (UniqueName: \"kubernetes.io/projected/421e85c4-9862-4601-ab03-d2b602ba68f4-kube-api-access-4sdfd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.749902 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.749966 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.750007 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.755987 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.756236 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.756307 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.767009 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sdfd\" (UniqueName: \"kubernetes.io/projected/421e85c4-9862-4601-ab03-d2b602ba68f4-kube-api-access-4sdfd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" Mar 09 13:26:32 
crc kubenswrapper[4723]: I0309 13:26:32.900718 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="449e6144-ad49-44a8-ad79-809de89fa5c6" path="/var/lib/kubelet/pods/449e6144-ad49-44a8-ad79-809de89fa5c6/volumes" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.902391 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daa528e2-bcd7-43a8-bfea-a0911b3020c5" path="/var/lib/kubelet/pods/daa528e2-bcd7-43a8-bfea-a0911b3020c5/volumes" Mar 09 13:26:32 crc kubenswrapper[4723]: I0309 13:26:32.947397 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" Mar 09 13:26:33 crc kubenswrapper[4723]: I0309 13:26:33.395608 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f","Type":"ContainerStarted","Data":"f747b814d63c52ddb403c464987a9fba63792a7110aed76fa4de63d78bdca03c"} Mar 09 13:26:33 crc kubenswrapper[4723]: I0309 13:26:33.397491 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"abe6e5e7-570d-40e4-995c-2921d3bd93ca","Type":"ContainerStarted","Data":"1dc6e9d1758685cab8bbc5057acd7710ad348c41176e81c24c27feec4cad3a5a"} Mar 09 13:26:33 crc kubenswrapper[4723]: I0309 13:26:33.397524 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"abe6e5e7-570d-40e4-995c-2921d3bd93ca","Type":"ContainerStarted","Data":"134d17e0b1ae8214eb148a0192b7c3289316c8064d57a36a46d59828c3f804dc"} Mar 09 13:26:33 crc kubenswrapper[4723]: I0309 13:26:33.564415 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr"] Mar 09 13:26:33 crc kubenswrapper[4723]: W0309 13:26:33.574107 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421e85c4_9862_4601_ab03_d2b602ba68f4.slice/crio-fa96277f688f30aff297866f72a6757acd791fa17f260bfa7840f1e21cefecd0 WatchSource:0}: Error finding container fa96277f688f30aff297866f72a6757acd791fa17f260bfa7840f1e21cefecd0: Status 404 returned error can't find the container with id fa96277f688f30aff297866f72a6757acd791fa17f260bfa7840f1e21cefecd0 Mar 09 13:26:34 crc kubenswrapper[4723]: I0309 13:26:34.410568 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" event={"ID":"421e85c4-9862-4601-ab03-d2b602ba68f4","Type":"ContainerStarted","Data":"fa96277f688f30aff297866f72a6757acd791fa17f260bfa7840f1e21cefecd0"} Mar 09 13:26:34 crc kubenswrapper[4723]: I0309 13:26:34.413157 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f","Type":"ContainerStarted","Data":"1a380dcb06c45f7ee18635292468ac4581b6d8066cc874900b178baa413bf172"} Mar 09 13:26:35 crc kubenswrapper[4723]: I0309 13:26:35.434519 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"abe6e5e7-570d-40e4-995c-2921d3bd93ca","Type":"ContainerStarted","Data":"9d8c8e72411fbc520bbc3bb03a2d6a00bc5fdb33c8d2843aa17cb425ddb1a403"} Mar 09 13:26:35 crc kubenswrapper[4723]: I0309 13:26:35.437882 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" 
event={"ID":"421e85c4-9862-4601-ab03-d2b602ba68f4","Type":"ContainerStarted","Data":"043323c0bddce9f57a0e1bd21bf0367cded08eb0734be1e8a3d843a780273089"} Mar 09 13:26:35 crc kubenswrapper[4723]: I0309 13:26:35.466448 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" podStartSLOduration=2.889058887 podStartE2EDuration="3.466427268s" podCreationTimestamp="2026-03-09 13:26:32 +0000 UTC" firstStartedPulling="2026-03-09 13:26:33.577066218 +0000 UTC m=+1667.591533758" lastFinishedPulling="2026-03-09 13:26:34.154434599 +0000 UTC m=+1668.168902139" observedRunningTime="2026-03-09 13:26:35.45968789 +0000 UTC m=+1669.474155430" watchObservedRunningTime="2026-03-09 13:26:35.466427268 +0000 UTC m=+1669.480894808" Mar 09 13:26:36 crc kubenswrapper[4723]: I0309 13:26:36.478026 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"abe6e5e7-570d-40e4-995c-2921d3bd93ca","Type":"ContainerStarted","Data":"f278401e3e9e55631d06ae264994d1070f66a53426361427e65624d813bdf15c"} Mar 09 13:26:37 crc kubenswrapper[4723]: I0309 13:26:37.491849 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"abe6e5e7-570d-40e4-995c-2921d3bd93ca","Type":"ContainerStarted","Data":"346942b89e2d47a834c2038322c5a1047e7d585a585b76238de979e05155ff67"} Mar 09 13:26:37 crc kubenswrapper[4723]: I0309 13:26:37.520407 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.675671886 podStartE2EDuration="6.520382337s" podCreationTimestamp="2026-03-09 13:26:31 +0000 UTC" firstStartedPulling="2026-03-09 13:26:32.740692012 +0000 UTC m=+1666.755159552" lastFinishedPulling="2026-03-09 13:26:36.585402463 +0000 UTC m=+1670.599870003" observedRunningTime="2026-03-09 13:26:37.514917602 +0000 UTC m=+1671.529385142" watchObservedRunningTime="2026-03-09 13:26:37.520382337 +0000 UTC m=+1671.534849877" Mar 09 13:27:06 crc kubenswrapper[4723]: I0309 13:27:06.896646 4723 generic.go:334] "Generic (PLEG): container finished" podID="447c3a5d-33c3-40ad-a5e4-dce2a0533b8f" containerID="1a380dcb06c45f7ee18635292468ac4581b6d8066cc874900b178baa413bf172" exitCode=0 Mar 09 13:27:06 crc kubenswrapper[4723]: I0309 13:27:06.904151 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f","Type":"ContainerDied","Data":"1a380dcb06c45f7ee18635292468ac4581b6d8066cc874900b178baa413bf172"} Mar 09 13:27:07 crc kubenswrapper[4723]: I0309 13:27:07.910730 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"447c3a5d-33c3-40ad-a5e4-dce2a0533b8f","Type":"ContainerStarted","Data":"e45201d8aa1373be50328ca4a9d38369d6a07a109227e95747b07d7317784aea"} Mar 09 13:27:07 crc kubenswrapper[4723]: I0309 13:27:07.915703 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 09 13:27:07 crc kubenswrapper[4723]: I0309 13:27:07.942313 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=36.942288889 podStartE2EDuration="36.942288889s" podCreationTimestamp="2026-03-09 13:26:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:27:07.936490086 +0000 UTC m=+1701.950957656" watchObservedRunningTime="2026-03-09 13:27:07.942288889 +0000 
UTC m=+1701.956756429" Mar 09 13:27:20 crc kubenswrapper[4723]: I0309 13:27:20.967384 4723 scope.go:117] "RemoveContainer" containerID="85c1f4f649f83a640a90cf1b1ded147aa010a539481fa93ee6bbe2b20b1b9914" Mar 09 13:27:20 crc kubenswrapper[4723]: I0309 13:27:20.998729 4723 scope.go:117] "RemoveContainer" containerID="0bc52fc1eaa1ade792d3433bea77da9ac5efc6b56be9f74e7195589f92e642fa" Mar 09 13:27:21 crc kubenswrapper[4723]: I0309 13:27:21.912032 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Mar 09 13:27:21 crc kubenswrapper[4723]: I0309 13:27:21.977613 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:27:26 crc kubenswrapper[4723]: I0309 13:27:26.157161 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="19f91c12-b482-46ab-a6e1-20164abe2ee4" containerName="rabbitmq" containerID="cri-o://74f1b39e23aa539fa8c875c3e083fe7c0fa27ba57b3d4335d1fcab549b59bfe5" gracePeriod=604796 Mar 09 13:27:29 crc kubenswrapper[4723]: I0309 13:27:29.437040 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="19f91c12-b482-46ab-a6e1-20164abe2ee4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused" Mar 09 13:27:32 crc kubenswrapper[4723]: I0309 13:27:32.893744 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.022235 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19f91c12-b482-46ab-a6e1-20164abe2ee4-erlang-cookie-secret\") pod \"19f91c12-b482-46ab-a6e1-20164abe2ee4\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.022314 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-erlang-cookie\") pod \"19f91c12-b482-46ab-a6e1-20164abe2ee4\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.022377 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19f91c12-b482-46ab-a6e1-20164abe2ee4-pod-info\") pod \"19f91c12-b482-46ab-a6e1-20164abe2ee4\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.022592 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-confd\") pod \"19f91c12-b482-46ab-a6e1-20164abe2ee4\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.022646 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-config-data\") pod \"19f91c12-b482-46ab-a6e1-20164abe2ee4\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.024713 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-erlang-cookie" 
(OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "19f91c12-b482-46ab-a6e1-20164abe2ee4" (UID: "19f91c12-b482-46ab-a6e1-20164abe2ee4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.030539 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f91c12-b482-46ab-a6e1-20164abe2ee4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "19f91c12-b482-46ab-a6e1-20164abe2ee4" (UID: "19f91c12-b482-46ab-a6e1-20164abe2ee4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.031067 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/19f91c12-b482-46ab-a6e1-20164abe2ee4-pod-info" (OuterVolumeSpecName: "pod-info") pod "19f91c12-b482-46ab-a6e1-20164abe2ee4" (UID: "19f91c12-b482-46ab-a6e1-20164abe2ee4"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.031247 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\") pod \"19f91c12-b482-46ab-a6e1-20164abe2ee4\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.031342 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-server-conf\") pod \"19f91c12-b482-46ab-a6e1-20164abe2ee4\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.032059 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-tls\") pod \"19f91c12-b482-46ab-a6e1-20164abe2ee4\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.032123 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-plugins-conf\") pod \"19f91c12-b482-46ab-a6e1-20164abe2ee4\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.032226 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4n48\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-kube-api-access-r4n48\") pod \"19f91c12-b482-46ab-a6e1-20164abe2ee4\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.032306 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-plugins\") pod \"19f91c12-b482-46ab-a6e1-20164abe2ee4\" (UID: \"19f91c12-b482-46ab-a6e1-20164abe2ee4\") " Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.033979 4723 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19f91c12-b482-46ab-a6e1-20164abe2ee4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 
13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.034003 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.034015 4723 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19f91c12-b482-46ab-a6e1-20164abe2ee4-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.035029 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "19f91c12-b482-46ab-a6e1-20164abe2ee4" (UID: "19f91c12-b482-46ab-a6e1-20164abe2ee4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.035580 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "19f91c12-b482-46ab-a6e1-20164abe2ee4" (UID: "19f91c12-b482-46ab-a6e1-20164abe2ee4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.044206 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-kube-api-access-r4n48" (OuterVolumeSpecName: "kube-api-access-r4n48") pod "19f91c12-b482-46ab-a6e1-20164abe2ee4" (UID: "19f91c12-b482-46ab-a6e1-20164abe2ee4"). InnerVolumeSpecName "kube-api-access-r4n48". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.057909 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "19f91c12-b482-46ab-a6e1-20164abe2ee4" (UID: "19f91c12-b482-46ab-a6e1-20164abe2ee4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.069007 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33" (OuterVolumeSpecName: "persistence") pod "19f91c12-b482-46ab-a6e1-20164abe2ee4" (UID: "19f91c12-b482-46ab-a6e1-20164abe2ee4"). InnerVolumeSpecName "pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.136302 4723 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\") on node \"crc\" " Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.136339 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.136349 4723 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.136359 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4n48\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-kube-api-access-r4n48\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.136371 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.143524 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-config-data" (OuterVolumeSpecName: "config-data") pod "19f91c12-b482-46ab-a6e1-20164abe2ee4" (UID: "19f91c12-b482-46ab-a6e1-20164abe2ee4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.164089 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-server-conf" (OuterVolumeSpecName: "server-conf") pod "19f91c12-b482-46ab-a6e1-20164abe2ee4" (UID: "19f91c12-b482-46ab-a6e1-20164abe2ee4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.184548 4723 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.184692 4723 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33") on node "crc" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.204754 4723 generic.go:334] "Generic (PLEG): container finished" podID="19f91c12-b482-46ab-a6e1-20164abe2ee4" containerID="74f1b39e23aa539fa8c875c3e083fe7c0fa27ba57b3d4335d1fcab549b59bfe5" exitCode=0 Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.204810 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19f91c12-b482-46ab-a6e1-20164abe2ee4","Type":"ContainerDied","Data":"74f1b39e23aa539fa8c875c3e083fe7c0fa27ba57b3d4335d1fcab549b59bfe5"} Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.204846 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19f91c12-b482-46ab-a6e1-20164abe2ee4","Type":"ContainerDied","Data":"a582220915ebe3312e61478c345cd6f7b9adf7abe770e979886329554ca913c7"} Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.204881 4723 scope.go:117] "RemoveContainer" containerID="74f1b39e23aa539fa8c875c3e083fe7c0fa27ba57b3d4335d1fcab549b59bfe5" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.205082 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.211815 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "19f91c12-b482-46ab-a6e1-20164abe2ee4" (UID: "19f91c12-b482-46ab-a6e1-20164abe2ee4"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.236259 4723 scope.go:117] "RemoveContainer" containerID="1619049fc971a5f761213226bb2e8e6badaa3b8e5bc14d02cccc57f4fa2faf21" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.239164 4723 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19f91c12-b482-46ab-a6e1-20164abe2ee4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.239810 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.239830 4723 reconciler_common.go:293] "Volume detached for volume \"pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.239876 4723 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19f91c12-b482-46ab-a6e1-20164abe2ee4-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.268540 4723 scope.go:117] "RemoveContainer" containerID="74f1b39e23aa539fa8c875c3e083fe7c0fa27ba57b3d4335d1fcab549b59bfe5" Mar 09 13:27:33 crc kubenswrapper[4723]: E0309 13:27:33.269379 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74f1b39e23aa539fa8c875c3e083fe7c0fa27ba57b3d4335d1fcab549b59bfe5\": container with ID starting with 74f1b39e23aa539fa8c875c3e083fe7c0fa27ba57b3d4335d1fcab549b59bfe5 not found: ID does not exist" containerID="74f1b39e23aa539fa8c875c3e083fe7c0fa27ba57b3d4335d1fcab549b59bfe5" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.269440 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f1b39e23aa539fa8c875c3e083fe7c0fa27ba57b3d4335d1fcab549b59bfe5"} err="failed to get container status \"74f1b39e23aa539fa8c875c3e083fe7c0fa27ba57b3d4335d1fcab549b59bfe5\": rpc error: code = NotFound desc = could not find container \"74f1b39e23aa539fa8c875c3e083fe7c0fa27ba57b3d4335d1fcab549b59bfe5\": container with ID starting with 74f1b39e23aa539fa8c875c3e083fe7c0fa27ba57b3d4335d1fcab549b59bfe5 not found: ID does not exist" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.269463 4723 scope.go:117] "RemoveContainer" containerID="1619049fc971a5f761213226bb2e8e6badaa3b8e5bc14d02cccc57f4fa2faf21" Mar 09 13:27:33 crc kubenswrapper[4723]: E0309 13:27:33.270000 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1619049fc971a5f761213226bb2e8e6badaa3b8e5bc14d02cccc57f4fa2faf21\": container with ID starting with 1619049fc971a5f761213226bb2e8e6badaa3b8e5bc14d02cccc57f4fa2faf21 not found: ID does not exist" containerID="1619049fc971a5f761213226bb2e8e6badaa3b8e5bc14d02cccc57f4fa2faf21" Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.270044 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1619049fc971a5f761213226bb2e8e6badaa3b8e5bc14d02cccc57f4fa2faf21"} err="failed to get container status \"1619049fc971a5f761213226bb2e8e6badaa3b8e5bc14d02cccc57f4fa2faf21\": 
rpc error: code = NotFound desc = could not find container \"1619049fc971a5f761213226bb2e8e6badaa3b8e5bc14d02cccc57f4fa2faf21\": container with ID starting with 1619049fc971a5f761213226bb2e8e6badaa3b8e5bc14d02cccc57f4fa2faf21 not found: ID does not exist"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.560803 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.594850 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.609673 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 13:27:33 crc kubenswrapper[4723]: E0309 13:27:33.611161 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f91c12-b482-46ab-a6e1-20164abe2ee4" containerName="setup-container"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.611304 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f91c12-b482-46ab-a6e1-20164abe2ee4" containerName="setup-container"
Mar 09 13:27:33 crc kubenswrapper[4723]: E0309 13:27:33.611366 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f91c12-b482-46ab-a6e1-20164abe2ee4" containerName="rabbitmq"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.611426 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f91c12-b482-46ab-a6e1-20164abe2ee4" containerName="rabbitmq"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.611806 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f91c12-b482-46ab-a6e1-20164abe2ee4" containerName="rabbitmq"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.613711 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.648476 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swcqk\" (UniqueName: \"kubernetes.io/projected/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-kube-api-access-swcqk\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.649075 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.649272 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.649450 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.649627 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.649720 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.649824 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.649935 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-config-data\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.650030 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.650141 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.650221 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.651730 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.754791 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.754853 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.754927 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.755920 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.755962 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.755998 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-config-data\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.756032 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.756360 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.756410 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.756430 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.756830 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-config-data\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.756891 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swcqk\" (UniqueName: \"kubernetes.io/projected/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-kube-api-access-swcqk\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.756985 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.757060 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.757152 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.757529 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.759884 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.760694 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.761666 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.761994 4723 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.762021 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1c8821f4c993ac83904f8093fe48c4871280d085aeb9e957927750fc4aaf55ee/globalmount\"" pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.762212 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.783320 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swcqk\" (UniqueName: \"kubernetes.io/projected/b3c6ecc2-5b7f-43c3-adfc-d741cb3be077-kube-api-access-swcqk\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.878594 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400c2cb3-6a4d-4fb5-bae7-35d1dc27ea33\") pod \"rabbitmq-server-0\" (UID: \"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077\") " pod="openstack/rabbitmq-server-0"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.946787 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.947814 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:27:33 crc kubenswrapper[4723]: I0309 13:27:33.955123 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 09 13:27:34 crc kubenswrapper[4723]: I0309 13:27:34.440114 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 13:27:34 crc kubenswrapper[4723]: I0309 13:27:34.896596 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f91c12-b482-46ab-a6e1-20164abe2ee4" path="/var/lib/kubelet/pods/19f91c12-b482-46ab-a6e1-20164abe2ee4/volumes"
Mar 09 13:27:35 crc kubenswrapper[4723]: I0309 13:27:35.231390 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077","Type":"ContainerStarted","Data":"be70d8a870df7aaf981066b429b9bb9cc0cee4bf783311ebaebfe11abb4a5234"}
Mar 09 13:27:36 crc kubenswrapper[4723]: I0309 13:27:36.247833 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077","Type":"ContainerStarted","Data":"0690a434abcab711a223bd38410890141599f1f141e6f2cbe658dc0cfc9f82a7"}
Mar 09 13:28:00 crc kubenswrapper[4723]: I0309 13:28:00.147472 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551048-g64gj"]
Mar 09 13:28:00 crc kubenswrapper[4723]: I0309 13:28:00.149901 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-g64gj"
Mar 09 13:28:00 crc kubenswrapper[4723]: I0309 13:28:00.152596 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 13:28:00 crc kubenswrapper[4723]: I0309 13:28:00.154499 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:28:00 crc kubenswrapper[4723]: I0309 13:28:00.154992 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:28:00 crc kubenswrapper[4723]: I0309 13:28:00.163940 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-g64gj"]
Mar 09 13:28:00 crc kubenswrapper[4723]: I0309 13:28:00.251810 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9tsx\" (UniqueName: \"kubernetes.io/projected/9702a066-3717-4a9a-8777-372960604154-kube-api-access-l9tsx\") pod \"auto-csr-approver-29551048-g64gj\" (UID: \"9702a066-3717-4a9a-8777-372960604154\") " pod="openshift-infra/auto-csr-approver-29551048-g64gj"
Mar 09 13:28:00 crc kubenswrapper[4723]: I0309 13:28:00.354103 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9tsx\" (UniqueName: \"kubernetes.io/projected/9702a066-3717-4a9a-8777-372960604154-kube-api-access-l9tsx\") pod \"auto-csr-approver-29551048-g64gj\" (UID: \"9702a066-3717-4a9a-8777-372960604154\") " pod="openshift-infra/auto-csr-approver-29551048-g64gj"
Mar 09 13:28:00 crc kubenswrapper[4723]: I0309 13:28:00.383674 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9tsx\" (UniqueName: \"kubernetes.io/projected/9702a066-3717-4a9a-8777-372960604154-kube-api-access-l9tsx\") pod \"auto-csr-approver-29551048-g64gj\" (UID: \"9702a066-3717-4a9a-8777-372960604154\") " pod="openshift-infra/auto-csr-approver-29551048-g64gj"
Mar 09 13:28:00 crc kubenswrapper[4723]: I0309 13:28:00.476550 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-g64gj"
Mar 09 13:28:00 crc kubenswrapper[4723]: I0309 13:28:00.993877 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-g64gj"]
Mar 09 13:28:01 crc kubenswrapper[4723]: I0309 13:28:01.567639 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-g64gj" event={"ID":"9702a066-3717-4a9a-8777-372960604154","Type":"ContainerStarted","Data":"8ff187e54d4b6a3b756af1f37399429f9c0ed205e72d3be5d5413f841acc2a3d"}
Mar 09 13:28:02 crc kubenswrapper[4723]: I0309 13:28:02.594444 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-g64gj" event={"ID":"9702a066-3717-4a9a-8777-372960604154","Type":"ContainerStarted","Data":"929cf1f651d0585d8304f624021f967091871f4fe09a0b52b653fdded40e9086"}
Mar 09 13:28:02 crc kubenswrapper[4723]: I0309 13:28:02.619998 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551048-g64gj" podStartSLOduration=1.564160022 podStartE2EDuration="2.619965609s" podCreationTimestamp="2026-03-09 13:28:00 +0000 UTC" firstStartedPulling="2026-03-09 13:28:00.99540586 +0000 UTC m=+1755.009873410" lastFinishedPulling="2026-03-09 13:28:02.051211457 +0000 UTC m=+1756.065678997" observedRunningTime="2026-03-09 13:28:02.611813174 +0000 UTC m=+1756.626280754" watchObservedRunningTime="2026-03-09 13:28:02.619965609 +0000 UTC m=+1756.634433180"
Mar 09 13:28:03 crc kubenswrapper[4723]: I0309 13:28:03.607588 4723 generic.go:334] "Generic (PLEG): container finished" podID="9702a066-3717-4a9a-8777-372960604154" containerID="929cf1f651d0585d8304f624021f967091871f4fe09a0b52b653fdded40e9086" exitCode=0
Mar 09 13:28:03 crc kubenswrapper[4723]: I0309 13:28:03.607645 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-g64gj" event={"ID":"9702a066-3717-4a9a-8777-372960604154","Type":"ContainerDied","Data":"929cf1f651d0585d8304f624021f967091871f4fe09a0b52b653fdded40e9086"}
Mar 09 13:28:03 crc kubenswrapper[4723]: I0309 13:28:03.946744 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:28:03 crc kubenswrapper[4723]: I0309 13:28:03.946812 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:28:05 crc kubenswrapper[4723]: I0309 13:28:05.070196 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-g64gj"
Mar 09 13:28:05 crc kubenswrapper[4723]: I0309 13:28:05.149745 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9tsx\" (UniqueName: \"kubernetes.io/projected/9702a066-3717-4a9a-8777-372960604154-kube-api-access-l9tsx\") pod \"9702a066-3717-4a9a-8777-372960604154\" (UID: \"9702a066-3717-4a9a-8777-372960604154\") "
Mar 09 13:28:05 crc kubenswrapper[4723]: I0309 13:28:05.155534 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9702a066-3717-4a9a-8777-372960604154-kube-api-access-l9tsx" (OuterVolumeSpecName: "kube-api-access-l9tsx") pod "9702a066-3717-4a9a-8777-372960604154" (UID: "9702a066-3717-4a9a-8777-372960604154"). InnerVolumeSpecName "kube-api-access-l9tsx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:28:05 crc kubenswrapper[4723]: I0309 13:28:05.252477 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9tsx\" (UniqueName: \"kubernetes.io/projected/9702a066-3717-4a9a-8777-372960604154-kube-api-access-l9tsx\") on node \"crc\" DevicePath \"\""
Mar 09 13:28:05 crc kubenswrapper[4723]: I0309 13:28:05.636165 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-g64gj" event={"ID":"9702a066-3717-4a9a-8777-372960604154","Type":"ContainerDied","Data":"8ff187e54d4b6a3b756af1f37399429f9c0ed205e72d3be5d5413f841acc2a3d"}
Mar 09 13:28:05 crc kubenswrapper[4723]: I0309 13:28:05.636212 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ff187e54d4b6a3b756af1f37399429f9c0ed205e72d3be5d5413f841acc2a3d"
Mar 09 13:28:05 crc kubenswrapper[4723]: I0309 13:28:05.636272 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-g64gj"
Mar 09 13:28:05 crc kubenswrapper[4723]: I0309 13:28:05.699458 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551042-bkmb7"]
Mar 09 13:28:05 crc kubenswrapper[4723]: I0309 13:28:05.710581 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551042-bkmb7"]
Mar 09 13:28:06 crc kubenswrapper[4723]: I0309 13:28:06.898695 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54bb3205-95cb-4772-977a-3f33fcfe1ab3" path="/var/lib/kubelet/pods/54bb3205-95cb-4772-977a-3f33fcfe1ab3/volumes"
Mar 09 13:28:08 crc kubenswrapper[4723]: I0309 13:28:08.672720 4723 generic.go:334] "Generic (PLEG): container finished" podID="b3c6ecc2-5b7f-43c3-adfc-d741cb3be077" containerID="0690a434abcab711a223bd38410890141599f1f141e6f2cbe658dc0cfc9f82a7" exitCode=0
Mar 09 13:28:08 crc kubenswrapper[4723]: I0309 13:28:08.672799 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077","Type":"ContainerDied","Data":"0690a434abcab711a223bd38410890141599f1f141e6f2cbe658dc0cfc9f82a7"}
Mar 09 13:28:09 crc kubenswrapper[4723]: I0309 13:28:09.686276 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b3c6ecc2-5b7f-43c3-adfc-d741cb3be077","Type":"ContainerStarted","Data":"fa13d9f479a2eb57210206c037ea1e844c6bc440b428b999bda763ce2fa7e740"}
Mar 09 13:28:09 crc kubenswrapper[4723]: I0309 13:28:09.686764 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 09 13:28:09 crc kubenswrapper[4723]: I0309 13:28:09.720718 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.720696284 podStartE2EDuration="36.720696284s" podCreationTimestamp="2026-03-09 13:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:28:09.706208781 +0000 UTC m=+1763.720676341" watchObservedRunningTime="2026-03-09 13:28:09.720696284 +0000 UTC m=+1763.735163824"
Mar 09 13:28:21 crc kubenswrapper[4723]: I0309 13:28:21.232251 4723 scope.go:117] "RemoveContainer" containerID="0809588e36ace37a3be426468c6babbc9559b368029ab389c1fa467da8beb254"
Mar 09 13:28:21 crc kubenswrapper[4723]: I0309 13:28:21.293204 4723 scope.go:117] "RemoveContainer" containerID="6fa51296fe7f6684547eae4f35c35bbf74bb1c2b6a716b61798033760fc08532"
Mar 09 13:28:21 crc kubenswrapper[4723]: I0309 13:28:21.347782 4723 scope.go:117] "RemoveContainer" containerID="f2ca1676e90d1c871dcdaff4231d3ec0d4c9c081386001474bb6c5a2c8ba0916"
Mar 09 13:28:21 crc kubenswrapper[4723]: I0309 13:28:21.386107 4723 scope.go:117] "RemoveContainer" containerID="536b540a1c2fa2c8933a710b773d68d8f5d423f462382be9cc4aba588555274d"
Mar 09 13:28:23 crc kubenswrapper[4723]: I0309 13:28:23.958029 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 09 13:28:33 crc kubenswrapper[4723]: I0309 13:28:33.946591 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:28:33 crc kubenswrapper[4723]: I0309 13:28:33.947128 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:28:33 crc kubenswrapper[4723]: I0309 13:28:33.947168 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2"
Mar 09 13:28:33 crc kubenswrapper[4723]: I0309 13:28:33.947623 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 13:28:33 crc kubenswrapper[4723]: I0309 13:28:33.947671 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" gracePeriod=600
Mar 09 13:28:34 crc kubenswrapper[4723]: E0309 13:28:34.088668 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:28:35 crc kubenswrapper[4723]: I0309 13:28:35.001889 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" exitCode=0
Mar 09 13:28:35 crc kubenswrapper[4723]: I0309 13:28:35.001910 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b"}
Mar 09 13:28:35 crc kubenswrapper[4723]: I0309 13:28:35.003541 4723 scope.go:117] "RemoveContainer" containerID="6ac6d2c984403d03e4d4370dd6ca12328beaf68b063a60d758d836e9ab8d0176"
Mar 09 13:28:35 crc kubenswrapper[4723]: I0309 13:28:35.004447 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b"
Mar 09 13:28:35 crc kubenswrapper[4723]: E0309 13:28:35.004834 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:28:47 crc kubenswrapper[4723]: I0309 13:28:47.881379 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b"
Mar 09 13:28:47 crc kubenswrapper[4723]: E0309 13:28:47.882632 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:28:58 crc kubenswrapper[4723]: I0309 13:28:58.881924 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b"
Mar 09 13:28:58 crc kubenswrapper[4723]: E0309 13:28:58.882635 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:29:09 crc kubenswrapper[4723]: I0309 13:29:09.881615 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b"
Mar 09 13:29:09 crc kubenswrapper[4723]: E0309 13:29:09.882903 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:29:21 crc kubenswrapper[4723]: I0309 13:29:21.500738 4723 scope.go:117] "RemoveContainer" containerID="3d76385ca8602290fdacc0d7567b698ad2e3dc17ea6a31eb0361bc3868c00851"
Mar 09 13:29:21 crc kubenswrapper[4723]: I0309 13:29:21.534943 4723 scope.go:117] "RemoveContainer" containerID="ddf7255f8e0a77da4cfc8f37c1cb56df0784c289a8deb6e73189ad719a7cd4b7"
Mar 09 13:29:21 crc kubenswrapper[4723]: I0309 13:29:21.596234 4723 scope.go:117] "RemoveContainer" containerID="7942d56e2ba234419f069386784848fa3fa7a8bb97e4e04d5dbb94e9322d0277"
Mar 09 13:29:22 crc kubenswrapper[4723]: I0309 13:29:22.882286 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b"
Mar 09 13:29:22 crc kubenswrapper[4723]: E0309 13:29:22.883251 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:29:37 crc kubenswrapper[4723]: I0309 13:29:37.882178 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b"
Mar 09 13:29:37 crc kubenswrapper[4723]: E0309 13:29:37.882992 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:29:47 crc kubenswrapper[4723]: I0309 13:29:47.843727 4723 generic.go:334] "Generic (PLEG): container finished" podID="421e85c4-9862-4601-ab03-d2b602ba68f4" containerID="043323c0bddce9f57a0e1bd21bf0367cded08eb0734be1e8a3d843a780273089" exitCode=0
Mar 09 13:29:47 crc kubenswrapper[4723]: I0309 13:29:47.843809 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" event={"ID":"421e85c4-9862-4601-ab03-d2b602ba68f4","Type":"ContainerDied","Data":"043323c0bddce9f57a0e1bd21bf0367cded08eb0734be1e8a3d843a780273089"}
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.072562 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"]
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.090639 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d2a6-account-create-update-gnb49"]
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.101168 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-g7xtr"]
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.111490 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-fa4f-account-create-update-5mxnk"]
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.121319 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-g7xtr"]
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.139346 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d2a6-account-create-update-gnb49"]
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.429471 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr"
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.617439 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sdfd\" (UniqueName: \"kubernetes.io/projected/421e85c4-9862-4601-ab03-d2b602ba68f4-kube-api-access-4sdfd\") pod \"421e85c4-9862-4601-ab03-d2b602ba68f4\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") "
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.617740 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-bootstrap-combined-ca-bundle\") pod \"421e85c4-9862-4601-ab03-d2b602ba68f4\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") "
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.618015 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-ssh-key-openstack-edpm-ipam\") pod \"421e85c4-9862-4601-ab03-d2b602ba68f4\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") "
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.618166 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-inventory\") pod \"421e85c4-9862-4601-ab03-d2b602ba68f4\" (UID: \"421e85c4-9862-4601-ab03-d2b602ba68f4\") "
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.626483 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421e85c4-9862-4601-ab03-d2b602ba68f4-kube-api-access-4sdfd" (OuterVolumeSpecName: "kube-api-access-4sdfd") pod "421e85c4-9862-4601-ab03-d2b602ba68f4" (UID: "421e85c4-9862-4601-ab03-d2b602ba68f4"). InnerVolumeSpecName "kube-api-access-4sdfd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.644011 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "421e85c4-9862-4601-ab03-d2b602ba68f4" (UID: "421e85c4-9862-4601-ab03-d2b602ba68f4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.701312 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "421e85c4-9862-4601-ab03-d2b602ba68f4" (UID: "421e85c4-9862-4601-ab03-d2b602ba68f4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.701452 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-inventory" (OuterVolumeSpecName: "inventory") pod "421e85c4-9862-4601-ab03-d2b602ba68f4" (UID: "421e85c4-9862-4601-ab03-d2b602ba68f4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.728704 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sdfd\" (UniqueName: \"kubernetes.io/projected/421e85c4-9862-4601-ab03-d2b602ba68f4-kube-api-access-4sdfd\") on node \"crc\" DevicePath \"\""
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.728745 4723 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.728757 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.728767 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/421e85c4-9862-4601-ab03-d2b602ba68f4-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.869510 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr" event={"ID":"421e85c4-9862-4601-ab03-d2b602ba68f4","Type":"ContainerDied","Data":"fa96277f688f30aff297866f72a6757acd791fa17f260bfa7840f1e21cefecd0"}
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.869552 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa96277f688f30aff297866f72a6757acd791fa17f260bfa7840f1e21cefecd0"
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.869596 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr"
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.958029 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm"]
Mar 09 13:29:49 crc kubenswrapper[4723]: E0309 13:29:49.958662 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9702a066-3717-4a9a-8777-372960604154" containerName="oc"
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.958687 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9702a066-3717-4a9a-8777-372960604154" containerName="oc"
Mar 09 13:29:49 crc kubenswrapper[4723]: E0309 13:29:49.958705 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421e85c4-9862-4601-ab03-d2b602ba68f4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.958714 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="421e85c4-9862-4601-ab03-d2b602ba68f4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.959042 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="9702a066-3717-4a9a-8777-372960604154" containerName="oc"
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.959090 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="421e85c4-9862-4601-ab03-d2b602ba68f4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.960041 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm"
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.962336 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt"
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.962716 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.962722 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.962769 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 13:29:49 crc kubenswrapper[4723]: I0309 13:29:49.969604 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm"]
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.136453 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csbmr\" (UniqueName: \"kubernetes.io/projected/e9c9b511-2ead-4e25-a076-846d1723510b-kube-api-access-csbmr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vrckm\" (UID: \"e9c9b511-2ead-4e25-a076-846d1723510b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm"
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.136530 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9c9b511-2ead-4e25-a076-846d1723510b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vrckm\" (UID: \"e9c9b511-2ead-4e25-a076-846d1723510b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm"
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.136586 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9c9b511-2ead-4e25-a076-846d1723510b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vrckm\" (UID: \"e9c9b511-2ead-4e25-a076-846d1723510b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm"
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.238538 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csbmr\" (UniqueName: \"kubernetes.io/projected/e9c9b511-2ead-4e25-a076-846d1723510b-kube-api-access-csbmr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vrckm\" (UID: \"e9c9b511-2ead-4e25-a076-846d1723510b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm"
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.238623 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9c9b511-2ead-4e25-a076-846d1723510b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vrckm\" (UID: \"e9c9b511-2ead-4e25-a076-846d1723510b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm"
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.238658 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9c9b511-2ead-4e25-a076-846d1723510b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vrckm\" (UID: \"e9c9b511-2ead-4e25-a076-846d1723510b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm"
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.242116 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9c9b511-2ead-4e25-a076-846d1723510b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vrckm\" (UID: \"e9c9b511-2ead-4e25-a076-846d1723510b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm"
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.245501 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9c9b511-2ead-4e25-a076-846d1723510b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vrckm\" (UID: \"e9c9b511-2ead-4e25-a076-846d1723510b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm"
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.254051 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csbmr\" (UniqueName: \"kubernetes.io/projected/e9c9b511-2ead-4e25-a076-846d1723510b-kube-api-access-csbmr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vrckm\" (UID: \"e9c9b511-2ead-4e25-a076-846d1723510b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm"
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.284464 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm"
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.861785 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm"]
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.861915 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.893712 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08304d40-0d1d-40e2-8188-8e6fd44434c2" path="/var/lib/kubelet/pods/08304d40-0d1d-40e2-8188-8e6fd44434c2/volumes"
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.895166 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0958c374-1159-4924-bff6-d627956944c9" path="/var/lib/kubelet/pods/0958c374-1159-4924-bff6-d627956944c9/volumes"
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.896298 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd854222-9c09-403c-95c5-37763f40aac3" path="/var/lib/kubelet/pods/cd854222-9c09-403c-95c5-37763f40aac3/volumes"
Mar 09 13:29:50 crc kubenswrapper[4723]: I0309 13:29:50.897016 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm" event={"ID":"e9c9b511-2ead-4e25-a076-846d1723510b","Type":"ContainerStarted","Data":"bfabf7f6c383e4438f119c952cf5d7fc5bad4c4f64f4803f48c22a4e65b250d2"}
Mar 09 13:29:51 crc kubenswrapper[4723]: I0309 13:29:51.903461 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm" event={"ID":"e9c9b511-2ead-4e25-a076-846d1723510b","Type":"ContainerStarted","Data":"d4dd4a7817e2891da63a363280ac5e315650a965ca4c7bbf1306013aafae9658"}
Mar 09 13:29:51 crc kubenswrapper[4723]: I0309 13:29:51.930880 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm" podStartSLOduration=2.502954221 podStartE2EDuration="2.930828592s" podCreationTimestamp="2026-03-09 13:29:49 +0000 UTC" firstStartedPulling="2026-03-09 13:29:50.861581915 +0000 UTC m=+1864.876049455" lastFinishedPulling="2026-03-09 13:29:51.289456286 +0000 UTC m=+1865.303923826" observedRunningTime="2026-03-09 13:29:51.924534594 +0000 UTC m=+1865.939002134" watchObservedRunningTime="2026-03-09 13:29:51.930828592 +0000 UTC m=+1865.945296132"
Mar 09 13:29:52 crc kubenswrapper[4723]: I0309 13:29:52.882599 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b"
Mar 09 13:29:52 crc kubenswrapper[4723]: E0309 13:29:52.883192 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:29:57 crc kubenswrapper[4723]: I0309 13:29:57.039946 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-vzp24"]
Mar 09 13:29:57 crc kubenswrapper[4723]: I0309 13:29:57.051726 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-e639-account-create-update-n5ktm"]
Mar 09 13:29:57 crc kubenswrapper[4723]: I0309 13:29:57.065184 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-e639-account-create-update-n5ktm"]
Mar 09 13:29:57 crc kubenswrapper[4723]: I0309 13:29:57.083153 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-vzp24"]
Mar 09 13:29:58 crc kubenswrapper[4723]: I0309 13:29:58.892901 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dcf2e69-d684-4721-a6ad-bedf186a51a4" path="/var/lib/kubelet/pods/0dcf2e69-d684-4721-a6ad-bedf186a51a4/volumes"
Mar 09 13:29:58 crc kubenswrapper[4723]: I0309 13:29:58.893754 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92" path="/var/lib/kubelet/pods/842ad0c6-1dfa-4ea1-9d26-84c84b8d3f92/volumes"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.140037 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"]
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.142302 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.144371 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.144567 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.152938 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551050-dct54"]
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.156688 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-dct54"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.175490 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.179267 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.207348 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.208512 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-dct54"]
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.236265 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"]
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.295137 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv4kv\" (UniqueName: \"kubernetes.io/projected/058d5dd9-a30c-4de2-a61a-4fd2a359799e-kube-api-access-lv4kv\") pod \"collect-profiles-29551050-mcz8l\" (UID: \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.295180 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/058d5dd9-a30c-4de2-a61a-4fd2a359799e-config-volume\") pod \"collect-profiles-29551050-mcz8l\" (UID: \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.295218 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/058d5dd9-a30c-4de2-a61a-4fd2a359799e-secret-volume\") pod \"collect-profiles-29551050-mcz8l\" (UID: \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.295297 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmvk4\" (UniqueName: \"kubernetes.io/projected/aed8e659-d57f-41bb-a6a5-ce888991208a-kube-api-access-rmvk4\") pod \"auto-csr-approver-29551050-dct54\" (UID: \"aed8e659-d57f-41bb-a6a5-ce888991208a\") " pod="openshift-infra/auto-csr-approver-29551050-dct54"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.397306 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv4kv\" (UniqueName: \"kubernetes.io/projected/058d5dd9-a30c-4de2-a61a-4fd2a359799e-kube-api-access-lv4kv\") pod \"collect-profiles-29551050-mcz8l\" (UID: \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.397358 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/058d5dd9-a30c-4de2-a61a-4fd2a359799e-config-volume\") pod \"collect-profiles-29551050-mcz8l\" (UID: \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.397405 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/058d5dd9-a30c-4de2-a61a-4fd2a359799e-secret-volume\") pod \"collect-profiles-29551050-mcz8l\" (UID: \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.397564 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmvk4\" (UniqueName: \"kubernetes.io/projected/aed8e659-d57f-41bb-a6a5-ce888991208a-kube-api-access-rmvk4\") pod \"auto-csr-approver-29551050-dct54\" (UID: \"aed8e659-d57f-41bb-a6a5-ce888991208a\") " pod="openshift-infra/auto-csr-approver-29551050-dct54"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.398190 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/058d5dd9-a30c-4de2-a61a-4fd2a359799e-config-volume\") pod \"collect-profiles-29551050-mcz8l\" (UID: \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.415311 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/058d5dd9-a30c-4de2-a61a-4fd2a359799e-secret-volume\") pod \"collect-profiles-29551050-mcz8l\" (UID: \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.419758 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmvk4\" (UniqueName: \"kubernetes.io/projected/aed8e659-d57f-41bb-a6a5-ce888991208a-kube-api-access-rmvk4\") pod \"auto-csr-approver-29551050-dct54\" (UID: \"aed8e659-d57f-41bb-a6a5-ce888991208a\") " pod="openshift-infra/auto-csr-approver-29551050-dct54"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.420260 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv4kv\" (UniqueName: \"kubernetes.io/projected/058d5dd9-a30c-4de2-a61a-4fd2a359799e-kube-api-access-lv4kv\") pod \"collect-profiles-29551050-mcz8l\" (UID: \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.481288 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"
Mar 09 13:30:00 crc kubenswrapper[4723]: I0309 13:30:00.496174 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-dct54"
Mar 09 13:30:01 crc kubenswrapper[4723]: I0309 13:30:01.038915 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"]
Mar 09 13:30:01 crc kubenswrapper[4723]: I0309 13:30:01.053521 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-ghfww"]
Mar 09 13:30:01 crc kubenswrapper[4723]: I0309 13:30:01.078312 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1379-account-create-update-tw2zb"]
Mar 09 13:30:01 crc kubenswrapper[4723]: I0309 13:30:01.095591 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-ghfww"]
Mar 09 13:30:01 crc kubenswrapper[4723]: I0309 13:30:01.117486 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1379-account-create-update-tw2zb"]
Mar 09 13:30:01 crc kubenswrapper[4723]: W0309 13:30:01.140330 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed8e659_d57f_41bb_a6a5_ce888991208a.slice/crio-9db5442985a93e215cf23ca389554186f06df38482da795b823e069c39bef963 WatchSource:0}: Error finding container 9db5442985a93e215cf23ca389554186f06df38482da795b823e069c39bef963: Status 404 returned error can't find the container with id 9db5442985a93e215cf23ca389554186f06df38482da795b823e069c39bef963
Mar 09 13:30:01 crc kubenswrapper[4723]: I0309 13:30:01.142040 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-dct54"]
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.013819 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l" event={"ID":"058d5dd9-a30c-4de2-a61a-4fd2a359799e","Type":"ContainerStarted","Data":"e11451d0ce7dca7b8212f8043bb36bdb36f297425e5d815f7cea4b03e679d4a9"}
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.014079 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l" event={"ID":"058d5dd9-a30c-4de2-a61a-4fd2a359799e","Type":"ContainerStarted","Data":"453f8289f4fd0c040d2e056ed0b86abdfd9bb626223c19cbbdf84b3ffcd1c802"}
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.014957 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551050-dct54" event={"ID":"aed8e659-d57f-41bb-a6a5-ce888991208a","Type":"ContainerStarted","Data":"9db5442985a93e215cf23ca389554186f06df38482da795b823e069c39bef963"}
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.031824 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l" podStartSLOduration=2.03180739 podStartE2EDuration="2.03180739s" podCreationTimestamp="2026-03-09 13:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:30:02.030334871 +0000 UTC m=+1876.044802411" watchObservedRunningTime="2026-03-09 13:30:02.03180739 +0000 UTC m=+1876.046274930"
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.056410 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-s8dhp"]
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.073836 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-gnr4x"]
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.087270 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-0679-account-create-update-xwhf9"]
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.101869 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-s8dhp"]
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.117192 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-0679-account-create-update-xwhf9"]
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.128650 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-gnr4x"]
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.914703 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ca1cc4-a262-42c5-ac51-eee86f7c9793" path="/var/lib/kubelet/pods/12ca1cc4-a262-42c5-ac51-eee86f7c9793/volumes"
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.918968 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36dc8fce-575c-4fe1-b4df-2ae47014bce7" path="/var/lib/kubelet/pods/36dc8fce-575c-4fe1-b4df-2ae47014bce7/volumes"
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.919919 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="556c51ac-2052-4f8c-9c5b-830aacc68de0" path="/var/lib/kubelet/pods/556c51ac-2052-4f8c-9c5b-830aacc68de0/volumes"
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.921114 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b743aece-4b06-44e2-9afa-fd075c0730d3" path="/var/lib/kubelet/pods/b743aece-4b06-44e2-9afa-fd075c0730d3/volumes"
Mar 09 13:30:02 crc kubenswrapper[4723]: I0309 13:30:02.922597 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed454b22-e190-4bf0-8581-e71f2ce51324" path="/var/lib/kubelet/pods/ed454b22-e190-4bf0-8581-e71f2ce51324/volumes"
Mar 09 13:30:03 crc kubenswrapper[4723]: I0309 13:30:03.032709 4723 generic.go:334] "Generic (PLEG): container finished" podID="058d5dd9-a30c-4de2-a61a-4fd2a359799e" containerID="e11451d0ce7dca7b8212f8043bb36bdb36f297425e5d815f7cea4b03e679d4a9" exitCode=0
Mar 09 13:30:03 crc kubenswrapper[4723]: I0309 13:30:03.032750 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l" event={"ID":"058d5dd9-a30c-4de2-a61a-4fd2a359799e","Type":"ContainerDied","Data":"e11451d0ce7dca7b8212f8043bb36bdb36f297425e5d815f7cea4b03e679d4a9"}
Mar 09 13:30:04 crc kubenswrapper[4723]: I0309 13:30:04.048480 4723 generic.go:334] "Generic (PLEG): container finished" podID="aed8e659-d57f-41bb-a6a5-ce888991208a" containerID="507b0498e46ad902f17716e4c68a407d92e63614717839d6b40e8fc0a1df26a9" exitCode=0
Mar 09 13:30:04 crc kubenswrapper[4723]: I0309 13:30:04.048978 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551050-dct54" event={"ID":"aed8e659-d57f-41bb-a6a5-ce888991208a","Type":"ContainerDied","Data":"507b0498e46ad902f17716e4c68a407d92e63614717839d6b40e8fc0a1df26a9"}
Mar 09 13:30:04 crc kubenswrapper[4723]: I0309 13:30:04.485904 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"
Mar 09 13:30:04 crc kubenswrapper[4723]: I0309 13:30:04.606022 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/058d5dd9-a30c-4de2-a61a-4fd2a359799e-config-volume\") pod \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\" (UID: \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\") "
Mar 09 13:30:04 crc kubenswrapper[4723]: I0309 13:30:04.606283 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/058d5dd9-a30c-4de2-a61a-4fd2a359799e-secret-volume\") pod \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\" (UID: \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\") "
Mar 09 13:30:04 crc kubenswrapper[4723]: I0309 13:30:04.606339 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv4kv\" (UniqueName: \"kubernetes.io/projected/058d5dd9-a30c-4de2-a61a-4fd2a359799e-kube-api-access-lv4kv\") pod \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\" (UID: \"058d5dd9-a30c-4de2-a61a-4fd2a359799e\") "
Mar 09 13:30:04 crc kubenswrapper[4723]: I0309 13:30:04.606794 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058d5dd9-a30c-4de2-a61a-4fd2a359799e-config-volume" (OuterVolumeSpecName: "config-volume") pod "058d5dd9-a30c-4de2-a61a-4fd2a359799e" (UID: "058d5dd9-a30c-4de2-a61a-4fd2a359799e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:30:04 crc kubenswrapper[4723]: I0309 13:30:04.607814 4723 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/058d5dd9-a30c-4de2-a61a-4fd2a359799e-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 13:30:04 crc kubenswrapper[4723]: I0309 13:30:04.612691 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058d5dd9-a30c-4de2-a61a-4fd2a359799e-kube-api-access-lv4kv" (OuterVolumeSpecName: "kube-api-access-lv4kv") pod "058d5dd9-a30c-4de2-a61a-4fd2a359799e" (UID: "058d5dd9-a30c-4de2-a61a-4fd2a359799e"). InnerVolumeSpecName "kube-api-access-lv4kv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:30:04 crc kubenswrapper[4723]: I0309 13:30:04.613550 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058d5dd9-a30c-4de2-a61a-4fd2a359799e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "058d5dd9-a30c-4de2-a61a-4fd2a359799e" (UID: "058d5dd9-a30c-4de2-a61a-4fd2a359799e"). InnerVolumeSpecName "secret-volume".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:30:04 crc kubenswrapper[4723]: I0309 13:30:04.712822 4723 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/058d5dd9-a30c-4de2-a61a-4fd2a359799e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:04 crc kubenswrapper[4723]: I0309 13:30:04.712922 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv4kv\" (UniqueName: \"kubernetes.io/projected/058d5dd9-a30c-4de2-a61a-4fd2a359799e-kube-api-access-lv4kv\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:05 crc kubenswrapper[4723]: I0309 13:30:05.061117 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l" event={"ID":"058d5dd9-a30c-4de2-a61a-4fd2a359799e","Type":"ContainerDied","Data":"453f8289f4fd0c040d2e056ed0b86abdfd9bb626223c19cbbdf84b3ffcd1c802"} Mar 09 13:30:05 crc kubenswrapper[4723]: I0309 13:30:05.061137 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l" Mar 09 13:30:05 crc kubenswrapper[4723]: I0309 13:30:05.061161 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="453f8289f4fd0c040d2e056ed0b86abdfd9bb626223c19cbbdf84b3ffcd1c802" Mar 09 13:30:05 crc kubenswrapper[4723]: I0309 13:30:05.536426 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-dct54" Mar 09 13:30:05 crc kubenswrapper[4723]: I0309 13:30:05.554507 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmvk4\" (UniqueName: \"kubernetes.io/projected/aed8e659-d57f-41bb-a6a5-ce888991208a-kube-api-access-rmvk4\") pod \"aed8e659-d57f-41bb-a6a5-ce888991208a\" (UID: \"aed8e659-d57f-41bb-a6a5-ce888991208a\") " Mar 09 13:30:05 crc kubenswrapper[4723]: I0309 13:30:05.566448 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aed8e659-d57f-41bb-a6a5-ce888991208a-kube-api-access-rmvk4" (OuterVolumeSpecName: "kube-api-access-rmvk4") pod "aed8e659-d57f-41bb-a6a5-ce888991208a" (UID: "aed8e659-d57f-41bb-a6a5-ce888991208a"). InnerVolumeSpecName "kube-api-access-rmvk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:30:05 crc kubenswrapper[4723]: I0309 13:30:05.658080 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmvk4\" (UniqueName: \"kubernetes.io/projected/aed8e659-d57f-41bb-a6a5-ce888991208a-kube-api-access-rmvk4\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:06 crc kubenswrapper[4723]: I0309 13:30:06.074033 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551050-dct54" event={"ID":"aed8e659-d57f-41bb-a6a5-ce888991208a","Type":"ContainerDied","Data":"9db5442985a93e215cf23ca389554186f06df38482da795b823e069c39bef963"} Mar 09 13:30:06 crc kubenswrapper[4723]: I0309 13:30:06.074083 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9db5442985a93e215cf23ca389554186f06df38482da795b823e069c39bef963" Mar 09 13:30:06 crc kubenswrapper[4723]: I0309 13:30:06.074146 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-dct54" Mar 09 13:30:06 crc kubenswrapper[4723]: I0309 13:30:06.626887 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-kr92d"] Mar 09 13:30:06 crc kubenswrapper[4723]: I0309 13:30:06.651572 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-kr92d"] Mar 09 13:30:06 crc kubenswrapper[4723]: I0309 13:30:06.890117 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:30:06 crc kubenswrapper[4723]: E0309 13:30:06.892389 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:30:06 crc kubenswrapper[4723]: I0309 13:30:06.894989 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66dc3748-8aa4-4a0d-8162-42a120d6233d" path="/var/lib/kubelet/pods/66dc3748-8aa4-4a0d-8162-42a120d6233d/volumes" Mar 09 13:30:09 crc kubenswrapper[4723]: I0309 13:30:09.029044 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-fgmft"] Mar 09 13:30:09 crc kubenswrapper[4723]: I0309 13:30:09.042387 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-fgmft"] Mar 09 13:30:10 crc kubenswrapper[4723]: I0309 13:30:10.893115 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f6aa2e5-8424-44e1-a9e0-247a8ff42676" path="/var/lib/kubelet/pods/7f6aa2e5-8424-44e1-a9e0-247a8ff42676/volumes" Mar 09 13:30:19 crc kubenswrapper[4723]: I0309 13:30:19.881416 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:30:19 crc kubenswrapper[4723]: E0309 13:30:19.882327 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:30:21 crc kubenswrapper[4723]: I0309 13:30:21.745574 4723 scope.go:117] "RemoveContainer" containerID="30bc40218aec01158c7e6a84024929bc585de0e02283d0f997f94e7b0c46879b" Mar 09 13:30:21 crc kubenswrapper[4723]: I0309 13:30:21.793443 4723 scope.go:117] "RemoveContainer" containerID="a1d7da4a08cf9b2a26a73630ce1508694ccbe3a4f003f21658b56c18137ab4ef" Mar 09 13:30:21 crc kubenswrapper[4723]: I0309 13:30:21.850446 4723 scope.go:117] "RemoveContainer" containerID="9eee0a564c22de11770ba58a1d205cc2a1923922a221b3600eb9363e0c573ab4" Mar 09 13:30:21 crc kubenswrapper[4723]: I0309 13:30:21.901365 4723 scope.go:117] "RemoveContainer" containerID="5afa4fed2a1908b51a8f6e6c3656194ebad26dc4b01dc4b4321f66c1be88249f" Mar 09 13:30:21 crc kubenswrapper[4723]: I0309 13:30:21.951074 4723 scope.go:117] "RemoveContainer" containerID="25cbf1b28d497c5a5e2adaae63fafcf501646f6ae727f3ba0a578919f989820a" Mar 09 13:30:22 crc kubenswrapper[4723]: I0309 
13:30:22.026632 4723 scope.go:117] "RemoveContainer" containerID="e0f5a6fb678696ddc6aee2ea2830a0afafb5c962f33dfbdf8b179e7e41d0277a" Mar 09 13:30:22 crc kubenswrapper[4723]: I0309 13:30:22.066225 4723 scope.go:117] "RemoveContainer" containerID="7ef8c316d63f49cc25ce811f7a5e1c1a41fbd8c2bccf44124a352a41c6e50dd5" Mar 09 13:30:22 crc kubenswrapper[4723]: I0309 13:30:22.087399 4723 scope.go:117] "RemoveContainer" containerID="110391c5dd41b539dcca3e4c33253b3a26882afeeb29854053cbd10c6ca6da5f" Mar 09 13:30:22 crc kubenswrapper[4723]: I0309 13:30:22.114249 4723 scope.go:117] "RemoveContainer" containerID="ba88cce2dc24987461693475db35a4db20d16ace8abea540da2a702f0d29436a" Mar 09 13:30:22 crc kubenswrapper[4723]: I0309 13:30:22.135500 4723 scope.go:117] "RemoveContainer" containerID="ad0138a0acdda25ff6c624c810b9988f27e44004cadc70a91a20c7ca3d8ecdf2" Mar 09 13:30:22 crc kubenswrapper[4723]: I0309 13:30:22.166679 4723 scope.go:117] "RemoveContainer" containerID="f7876a0eb57218f6edf385df427799637d84618943937fa84e8d085606c95f9f" Mar 09 13:30:22 crc kubenswrapper[4723]: I0309 13:30:22.189605 4723 scope.go:117] "RemoveContainer" containerID="7a5d6970703d77b0e5e36ab6fd8890a569908407f0a4b312edfdca0e38b0ad32" Mar 09 13:30:22 crc kubenswrapper[4723]: I0309 13:30:22.213889 4723 scope.go:117] "RemoveContainer" containerID="954911e9e6881836f5fbfee400272cdadd39d73843c3fd7f68b4e696a43db9a5" Mar 09 13:30:29 crc kubenswrapper[4723]: I0309 13:30:29.041971 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d2b1-account-create-update-6966g"] Mar 09 13:30:29 crc kubenswrapper[4723]: I0309 13:30:29.056253 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d2b1-account-create-update-6966g"] Mar 09 13:30:30 crc kubenswrapper[4723]: I0309 13:30:30.881313 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:30:30 crc kubenswrapper[4723]: E0309 13:30:30.882699 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:30:30 crc kubenswrapper[4723]: I0309 13:30:30.904255 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72c36c7-750a-4bc7-ac34-c9d42896cdd6" path="/var/lib/kubelet/pods/c72c36c7-750a-4bc7-ac34-c9d42896cdd6/volumes" Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.064151 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-88pwc"] Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.081256 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-6f11-account-create-update-gprhw"] Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.097145 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-b554g"] Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.112719 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-6f11-account-create-update-gprhw"] Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.121689 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-b554g"] Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.132173 4723 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-88pwc"] Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.143419 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-sbt7t"] Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.154163 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-cwx46"] Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.166348 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-sbt7t"] Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.176693 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-cwx46"] Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.186975 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-af49-account-create-update-bwsv7"] Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.197208 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-af49-account-create-update-bwsv7"] Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.207699 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-32ad-account-create-update-8p5t9"] Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.215708 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-32ad-account-create-update-8p5t9"] Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.907411 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c9c338c-01c8-428b-89cb-4c4a59505595" path="/var/lib/kubelet/pods/0c9c338c-01c8-428b-89cb-4c4a59505595/volumes" Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.916377 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b441fb5-f89b-4ec1-8399-b3f56fdf139c" path="/var/lib/kubelet/pods/1b441fb5-f89b-4ec1-8399-b3f56fdf139c/volumes" Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.923943 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83082054-f3da-4455-b4af-5232e439042c" path="/var/lib/kubelet/pods/83082054-f3da-4455-b4af-5232e439042c/volumes" Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.931351 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f9ae762-5d7d-4d41-9477-b4cc72689803" path="/var/lib/kubelet/pods/8f9ae762-5d7d-4d41-9477-b4cc72689803/volumes" Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.937492 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9feddb28-c165-4784-94a8-4d63209fda46" path="/var/lib/kubelet/pods/9feddb28-c165-4784-94a8-4d63209fda46/volumes" Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.943284 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1df2ad2-feac-487c-ab26-e885457d7979" path="/var/lib/kubelet/pods/c1df2ad2-feac-487c-ab26-e885457d7979/volumes" Mar 09 13:30:34 crc kubenswrapper[4723]: I0309 13:30:34.948120 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edb56749-bb8b-4620-a1a9-8e1f2a70f1b2" path="/var/lib/kubelet/pods/edb56749-bb8b-4620-a1a9-8e1f2a70f1b2/volumes" Mar 09 13:30:38 crc kubenswrapper[4723]: I0309 13:30:38.046467 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-x7vz5"] Mar 09 13:30:38 crc kubenswrapper[4723]: I0309 13:30:38.067717 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-x7vz5"] Mar 09 13:30:38 crc kubenswrapper[4723]: I0309 
13:30:38.895484 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15541e12-c0a2-4c26-b912-d33be48eea77" path="/var/lib/kubelet/pods/15541e12-c0a2-4c26-b912-d33be48eea77/volumes" Mar 09 13:30:44 crc kubenswrapper[4723]: I0309 13:30:44.881351 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:30:44 crc kubenswrapper[4723]: E0309 13:30:44.882037 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:30:59 crc kubenswrapper[4723]: I0309 13:30:59.882037 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:30:59 crc kubenswrapper[4723]: E0309 13:30:59.883367 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:31:00 crc kubenswrapper[4723]: I0309 13:31:00.052983 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-b5gc4"] Mar 09 13:31:00 crc kubenswrapper[4723]: I0309 13:31:00.070946 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-b5gc4"] Mar 09 13:31:00 crc kubenswrapper[4723]: I0309 13:31:00.899268 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72dc18cb-be01-4378-b62c-609a2c237731" path="/var/lib/kubelet/pods/72dc18cb-be01-4378-b62c-609a2c237731/volumes" Mar 09 13:31:10 crc kubenswrapper[4723]: I0309 13:31:10.048107 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-pf8wv"] Mar 09 13:31:10 crc kubenswrapper[4723]: I0309 13:31:10.060156 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-pf8wv"] Mar 09 13:31:10 crc kubenswrapper[4723]: I0309 13:31:10.905674 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8846c8f3-62f3-4053-8b48-177d011dd0c9" path="/var/lib/kubelet/pods/8846c8f3-62f3-4053-8b48-177d011dd0c9/volumes" Mar 09 13:31:13 crc kubenswrapper[4723]: I0309 13:31:13.881598 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:31:13 crc kubenswrapper[4723]: E0309 13:31:13.882298 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:31:17 crc kubenswrapper[4723]: I0309 13:31:17.039124 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wvbmm"] Mar 09 13:31:17 crc kubenswrapper[4723]: I0309 
13:31:17.053442 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wvbmm"] Mar 09 13:31:18 crc kubenswrapper[4723]: I0309 13:31:18.895398 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="052977d5-adda-4cc2-a8bc-7b4ea4e32df7" path="/var/lib/kubelet/pods/052977d5-adda-4cc2-a8bc-7b4ea4e32df7/volumes" Mar 09 13:31:19 crc kubenswrapper[4723]: I0309 13:31:19.042567 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-cp5rk"] Mar 09 13:31:19 crc kubenswrapper[4723]: I0309 13:31:19.054492 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-cp5rk"] Mar 09 13:31:20 crc kubenswrapper[4723]: I0309 13:31:20.904726 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6832d621-3d7d-4e4a-824b-f219746aaa89" path="/var/lib/kubelet/pods/6832d621-3d7d-4e4a-824b-f219746aaa89/volumes" Mar 09 13:31:22 crc kubenswrapper[4723]: I0309 13:31:22.488362 4723 scope.go:117] "RemoveContainer" containerID="01625adfdb5a75e4cff0e083107fb62501001c9a5214e951c45cff2815bc8cd4" Mar 09 13:31:22 crc kubenswrapper[4723]: I0309 13:31:22.516961 4723 scope.go:117] "RemoveContainer" containerID="99ed3f717a597eff2118d8c9217456fe28f4259766a3a003f976b0720a42d3e3" Mar 09 13:31:22 crc kubenswrapper[4723]: I0309 13:31:22.586823 4723 scope.go:117] "RemoveContainer" containerID="be50d43caf62d2a431c11901c4e5ec9dccfaaaf18ef468a0f27c9c5a42afd31d" Mar 09 13:31:22 crc kubenswrapper[4723]: I0309 13:31:22.628808 4723 scope.go:117] "RemoveContainer" containerID="05df4ab3aeef5227dd01ef0959d21e5fc3b094a1cce9906accd8d7502f74db94" Mar 09 13:31:22 crc kubenswrapper[4723]: I0309 13:31:22.705988 4723 scope.go:117] "RemoveContainer" containerID="3139003d5f6dffdd22dc5e6128e2bb67c3122b2c35b79ec1cea78fac44c55ddf" Mar 09 13:31:22 crc kubenswrapper[4723]: I0309 13:31:22.765218 4723 scope.go:117] "RemoveContainer" containerID="29b115c1f8743153a2d56a56172aa3cc7d2029bb69439e6f911ef022e708ed89" Mar 09 13:31:22 crc kubenswrapper[4723]: I0309 13:31:22.814169 4723 scope.go:117] "RemoveContainer" containerID="ee4ad67d87342969a5456a44427d04f8893dceeb20b494e50679710f34a6e58d" Mar 09 13:31:22 crc kubenswrapper[4723]: I0309 13:31:22.844319 4723 scope.go:117] "RemoveContainer" containerID="a0c8702b95ad8121cc345974734a1faa478de4334e93db13ffef34c0783aefa8" Mar 09 13:31:22 crc kubenswrapper[4723]: I0309 13:31:22.871905 4723 scope.go:117] "RemoveContainer" containerID="fa271cd98fb9d718deefeb8613c719476e73d21728be167d26cb92a97b551a67" Mar 09 13:31:22 crc kubenswrapper[4723]: I0309 13:31:22.903504 4723 scope.go:117] "RemoveContainer" containerID="a12f01e18860b06c7461525078132a6cb03eb0b25e0f6d6ecc433d5701fb5e7c" Mar 09 13:31:22 crc kubenswrapper[4723]: I0309 13:31:22.930609 4723 scope.go:117] "RemoveContainer" containerID="ef3be5a1a9d1774e23441c783b55cf716c31d9fd5560f1064109f14e419f2256" Mar 09 13:31:22 crc kubenswrapper[4723]: I0309 13:31:22.961612 4723 scope.go:117] "RemoveContainer" containerID="e4545be82e46e0052920e5b344bacbaf2bbe64b271359f6b8e82bb95b36fad61" Mar 09 13:31:22 crc kubenswrapper[4723]: I0309 13:31:22.990459 4723 scope.go:117] "RemoveContainer" containerID="b03fbb274ecab3bd38f7ab5714bcc4aa8bff5b700df1fa192cc40bf66926d6df" Mar 09 13:31:24 crc kubenswrapper[4723]: I0309 13:31:24.040421 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ch4lf"] Mar 09 13:31:24 crc kubenswrapper[4723]: I0309 13:31:24.052785 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-bootstrap-ch4lf"] Mar 09 13:31:24 crc kubenswrapper[4723]: I0309 13:31:24.881638 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:31:24 crc kubenswrapper[4723]: E0309 13:31:24.882460 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:31:24 crc kubenswrapper[4723]: I0309 13:31:24.900612 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d74c75e-9665-4723-8dca-9019bd324ccb" path="/var/lib/kubelet/pods/3d74c75e-9665-4723-8dca-9019bd324ccb/volumes" Mar 09 13:31:38 crc kubenswrapper[4723]: I0309 13:31:38.882144 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:31:38 crc kubenswrapper[4723]: E0309 13:31:38.882851 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:31:39 crc kubenswrapper[4723]: I0309 13:31:39.054347 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-bm79v"] Mar 09 13:31:39 crc kubenswrapper[4723]: I0309 13:31:39.065080 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-bm79v"] Mar 09 13:31:40 crc kubenswrapper[4723]: I0309 13:31:40.905909 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="191baa15-4ac5-4e55-9f87-751eddffb83e" path="/var/lib/kubelet/pods/191baa15-4ac5-4e55-9f87-751eddffb83e/volumes" Mar 09 13:31:47 crc kubenswrapper[4723]: I0309 13:31:47.331050 4723 generic.go:334] "Generic (PLEG): container finished" podID="e9c9b511-2ead-4e25-a076-846d1723510b" containerID="d4dd4a7817e2891da63a363280ac5e315650a965ca4c7bbf1306013aafae9658" exitCode=0 Mar 09 13:31:47 crc kubenswrapper[4723]: I0309 13:31:47.331137 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm" event={"ID":"e9c9b511-2ead-4e25-a076-846d1723510b","Type":"ContainerDied","Data":"d4dd4a7817e2891da63a363280ac5e315650a965ca4c7bbf1306013aafae9658"} Mar 09 13:31:48 crc kubenswrapper[4723]: I0309 13:31:48.776962 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm" Mar 09 13:31:48 crc kubenswrapper[4723]: I0309 13:31:48.864212 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9c9b511-2ead-4e25-a076-846d1723510b-inventory\") pod \"e9c9b511-2ead-4e25-a076-846d1723510b\" (UID: \"e9c9b511-2ead-4e25-a076-846d1723510b\") " Mar 09 13:31:48 crc kubenswrapper[4723]: I0309 13:31:48.864403 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9c9b511-2ead-4e25-a076-846d1723510b-ssh-key-openstack-edpm-ipam\") pod \"e9c9b511-2ead-4e25-a076-846d1723510b\" (UID: \"e9c9b511-2ead-4e25-a076-846d1723510b\") " Mar 09 13:31:48 crc kubenswrapper[4723]: I0309 13:31:48.864475 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csbmr\" (UniqueName: \"kubernetes.io/projected/e9c9b511-2ead-4e25-a076-846d1723510b-kube-api-access-csbmr\") pod \"e9c9b511-2ead-4e25-a076-846d1723510b\" (UID: \"e9c9b511-2ead-4e25-a076-846d1723510b\") " Mar 09 13:31:48 crc kubenswrapper[4723]: I0309 13:31:48.869966 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9c9b511-2ead-4e25-a076-846d1723510b-kube-api-access-csbmr" (OuterVolumeSpecName: "kube-api-access-csbmr") pod "e9c9b511-2ead-4e25-a076-846d1723510b" (UID: "e9c9b511-2ead-4e25-a076-846d1723510b"). InnerVolumeSpecName "kube-api-access-csbmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:31:48 crc kubenswrapper[4723]: I0309 13:31:48.903163 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c9b511-2ead-4e25-a076-846d1723510b-inventory" (OuterVolumeSpecName: "inventory") pod "e9c9b511-2ead-4e25-a076-846d1723510b" (UID: "e9c9b511-2ead-4e25-a076-846d1723510b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:31:48 crc kubenswrapper[4723]: I0309 13:31:48.906321 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c9b511-2ead-4e25-a076-846d1723510b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e9c9b511-2ead-4e25-a076-846d1723510b" (UID: "e9c9b511-2ead-4e25-a076-846d1723510b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:31:48 crc kubenswrapper[4723]: I0309 13:31:48.967500 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9c9b511-2ead-4e25-a076-846d1723510b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:31:48 crc kubenswrapper[4723]: I0309 13:31:48.967541 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csbmr\" (UniqueName: \"kubernetes.io/projected/e9c9b511-2ead-4e25-a076-846d1723510b-kube-api-access-csbmr\") on node \"crc\" DevicePath \"\"" Mar 09 13:31:48 crc kubenswrapper[4723]: I0309 13:31:48.967551 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9c9b511-2ead-4e25-a076-846d1723510b-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.356392 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm" event={"ID":"e9c9b511-2ead-4e25-a076-846d1723510b","Type":"ContainerDied","Data":"bfabf7f6c383e4438f119c952cf5d7fc5bad4c4f64f4803f48c22a4e65b250d2"} Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.356765 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfabf7f6c383e4438f119c952cf5d7fc5bad4c4f64f4803f48c22a4e65b250d2" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.356424 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vrckm" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.470143 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk"] Mar 09 13:31:49 crc kubenswrapper[4723]: E0309 13:31:49.470693 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c9b511-2ead-4e25-a076-846d1723510b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.470719 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c9b511-2ead-4e25-a076-846d1723510b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 09 13:31:49 crc kubenswrapper[4723]: E0309 13:31:49.470750 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed8e659-d57f-41bb-a6a5-ce888991208a" containerName="oc" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.470759 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed8e659-d57f-41bb-a6a5-ce888991208a" containerName="oc" Mar 09 13:31:49 crc kubenswrapper[4723]: E0309 13:31:49.470804 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058d5dd9-a30c-4de2-a61a-4fd2a359799e" containerName="collect-profiles" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.470813 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="058d5dd9-a30c-4de2-a61a-4fd2a359799e" containerName="collect-profiles" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.471083 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="058d5dd9-a30c-4de2-a61a-4fd2a359799e" containerName="collect-profiles" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.471126 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9c9b511-2ead-4e25-a076-846d1723510b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 09 13:31:49 crc 
kubenswrapper[4723]: I0309 13:31:49.471144 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed8e659-d57f-41bb-a6a5-ce888991208a" containerName="oc" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.472088 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.475684 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.475711 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.475804 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.475843 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.480374 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk"] Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.581895 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c63771a-b480-44bf-98f4-68ac62c7189a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p96wk\" (UID: \"5c63771a-b480-44bf-98f4-68ac62c7189a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.582023 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c63771a-b480-44bf-98f4-68ac62c7189a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p96wk\" (UID: \"5c63771a-b480-44bf-98f4-68ac62c7189a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.582080 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bj6s\" (UniqueName: \"kubernetes.io/projected/5c63771a-b480-44bf-98f4-68ac62c7189a-kube-api-access-5bj6s\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p96wk\" (UID: \"5c63771a-b480-44bf-98f4-68ac62c7189a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.684449 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c63771a-b480-44bf-98f4-68ac62c7189a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p96wk\" (UID: \"5c63771a-b480-44bf-98f4-68ac62c7189a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.684985 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bj6s\" (UniqueName: \"kubernetes.io/projected/5c63771a-b480-44bf-98f4-68ac62c7189a-kube-api-access-5bj6s\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p96wk\" (UID: 
\"5c63771a-b480-44bf-98f4-68ac62c7189a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.685222 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c63771a-b480-44bf-98f4-68ac62c7189a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p96wk\" (UID: \"5c63771a-b480-44bf-98f4-68ac62c7189a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.692074 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c63771a-b480-44bf-98f4-68ac62c7189a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p96wk\" (UID: \"5c63771a-b480-44bf-98f4-68ac62c7189a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.694387 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c63771a-b480-44bf-98f4-68ac62c7189a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p96wk\" (UID: \"5c63771a-b480-44bf-98f4-68ac62c7189a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.706986 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bj6s\" (UniqueName: \"kubernetes.io/projected/5c63771a-b480-44bf-98f4-68ac62c7189a-kube-api-access-5bj6s\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p96wk\" (UID: \"5c63771a-b480-44bf-98f4-68ac62c7189a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" Mar 09 13:31:49 crc kubenswrapper[4723]: I0309 13:31:49.845030 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" Mar 09 13:31:50 crc kubenswrapper[4723]: I0309 13:31:50.408339 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk"] Mar 09 13:31:51 crc kubenswrapper[4723]: I0309 13:31:51.378432 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" event={"ID":"5c63771a-b480-44bf-98f4-68ac62c7189a","Type":"ContainerStarted","Data":"921a4129e0d0fcf157f913e103b3946b2751da08fd5e6777d5262fae62f0c4b3"} Mar 09 13:31:52 crc kubenswrapper[4723]: I0309 13:31:52.395719 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" event={"ID":"5c63771a-b480-44bf-98f4-68ac62c7189a","Type":"ContainerStarted","Data":"1349603b426a0e43bc77b3f0bc65326dff7d06f92712cc252ec7e979d3de4a4b"} Mar 09 13:31:52 crc kubenswrapper[4723]: I0309 13:31:52.424271 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" podStartSLOduration=1.8506970059999999 podStartE2EDuration="3.424247978s" podCreationTimestamp="2026-03-09 13:31:49 +0000 UTC" firstStartedPulling="2026-03-09 13:31:50.411131744 +0000 UTC m=+1984.425599284" lastFinishedPulling="2026-03-09 13:31:51.984682716 +0000 UTC m=+1985.999150256" observedRunningTime="2026-03-09 13:31:52.416511723 +0000 UTC m=+1986.430979283" watchObservedRunningTime="2026-03-09 13:31:52.424247978 +0000 UTC m=+1986.438715528" Mar 09 13:31:53 crc kubenswrapper[4723]: I0309 13:31:53.881562 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:31:53 crc kubenswrapper[4723]: E0309 13:31:53.884210 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:32:00 crc kubenswrapper[4723]: I0309 13:32:00.143453 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551052-h547x"] Mar 09 13:32:00 crc kubenswrapper[4723]: I0309 13:32:00.145440 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-h547x" Mar 09 13:32:00 crc kubenswrapper[4723]: I0309 13:32:00.148890 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:32:00 crc kubenswrapper[4723]: I0309 13:32:00.149015 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:32:00 crc kubenswrapper[4723]: I0309 13:32:00.149060 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:32:00 crc kubenswrapper[4723]: I0309 13:32:00.163529 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-h547x"] Mar 09 13:32:00 crc kubenswrapper[4723]: I0309 13:32:00.249078 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl88v\" (UniqueName: \"kubernetes.io/projected/8d1805eb-e0eb-46e7-936f-5d6cc76753ae-kube-api-access-cl88v\") pod \"auto-csr-approver-29551052-h547x\" (UID: \"8d1805eb-e0eb-46e7-936f-5d6cc76753ae\") " pod="openshift-infra/auto-csr-approver-29551052-h547x" Mar 09 13:32:00 crc kubenswrapper[4723]: I0309 13:32:00.352079 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl88v\" (UniqueName: \"kubernetes.io/projected/8d1805eb-e0eb-46e7-936f-5d6cc76753ae-kube-api-access-cl88v\") pod \"auto-csr-approver-29551052-h547x\" (UID: \"8d1805eb-e0eb-46e7-936f-5d6cc76753ae\") " pod="openshift-infra/auto-csr-approver-29551052-h547x" Mar 09 13:32:00 crc kubenswrapper[4723]: I0309 13:32:00.371646 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl88v\" (UniqueName: \"kubernetes.io/projected/8d1805eb-e0eb-46e7-936f-5d6cc76753ae-kube-api-access-cl88v\") pod \"auto-csr-approver-29551052-h547x\" (UID: \"8d1805eb-e0eb-46e7-936f-5d6cc76753ae\") " pod="openshift-infra/auto-csr-approver-29551052-h547x" Mar 09 13:32:00 crc kubenswrapper[4723]: I0309 13:32:00.467059 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-h547x" Mar 09 13:32:00 crc kubenswrapper[4723]: I0309 13:32:00.941626 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-h547x"] Mar 09 13:32:01 crc kubenswrapper[4723]: I0309 13:32:01.487492 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551052-h547x" event={"ID":"8d1805eb-e0eb-46e7-936f-5d6cc76753ae","Type":"ContainerStarted","Data":"4c839bd0dfb853ec94efdbef5cf6b7199b6e76da1c718e0d43c05590f1ce9a4b"} Mar 09 13:32:03 crc kubenswrapper[4723]: I0309 13:32:03.512162 4723 generic.go:334] "Generic (PLEG): container finished" podID="8d1805eb-e0eb-46e7-936f-5d6cc76753ae" containerID="5dc3959b3c7ef0edd50a086a00ddd8e4fbd763b4bf13dcdd107bce49c2a4f70d" exitCode=0 Mar 09 13:32:03 crc kubenswrapper[4723]: I0309 13:32:03.512247 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551052-h547x" event={"ID":"8d1805eb-e0eb-46e7-936f-5d6cc76753ae","Type":"ContainerDied","Data":"5dc3959b3c7ef0edd50a086a00ddd8e4fbd763b4bf13dcdd107bce49c2a4f70d"} Mar 09 13:32:04 crc kubenswrapper[4723]: I0309 13:32:04.882449 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:32:04 crc kubenswrapper[4723]: E0309 13:32:04.883485 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:32:04 crc kubenswrapper[4723]: I0309 13:32:04.932167 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-h547x" Mar 09 13:32:05 crc kubenswrapper[4723]: I0309 13:32:05.109487 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl88v\" (UniqueName: \"kubernetes.io/projected/8d1805eb-e0eb-46e7-936f-5d6cc76753ae-kube-api-access-cl88v\") pod \"8d1805eb-e0eb-46e7-936f-5d6cc76753ae\" (UID: \"8d1805eb-e0eb-46e7-936f-5d6cc76753ae\") " Mar 09 13:32:05 crc kubenswrapper[4723]: I0309 13:32:05.121641 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1805eb-e0eb-46e7-936f-5d6cc76753ae-kube-api-access-cl88v" (OuterVolumeSpecName: "kube-api-access-cl88v") pod "8d1805eb-e0eb-46e7-936f-5d6cc76753ae" (UID: "8d1805eb-e0eb-46e7-936f-5d6cc76753ae"). InnerVolumeSpecName "kube-api-access-cl88v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:32:05 crc kubenswrapper[4723]: I0309 13:32:05.213565 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl88v\" (UniqueName: \"kubernetes.io/projected/8d1805eb-e0eb-46e7-936f-5d6cc76753ae-kube-api-access-cl88v\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:05 crc kubenswrapper[4723]: I0309 13:32:05.545852 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551052-h547x" event={"ID":"8d1805eb-e0eb-46e7-936f-5d6cc76753ae","Type":"ContainerDied","Data":"4c839bd0dfb853ec94efdbef5cf6b7199b6e76da1c718e0d43c05590f1ce9a4b"} Mar 09 13:32:05 crc kubenswrapper[4723]: I0309 13:32:05.546339 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c839bd0dfb853ec94efdbef5cf6b7199b6e76da1c718e0d43c05590f1ce9a4b" Mar 09 13:32:05 crc kubenswrapper[4723]: I0309 13:32:05.545988 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-h547x" Mar 09 13:32:06 crc kubenswrapper[4723]: I0309 13:32:06.000238 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-t2lf4"] Mar 09 13:32:06 crc kubenswrapper[4723]: I0309 13:32:06.010017 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-t2lf4"] Mar 09 13:32:06 crc kubenswrapper[4723]: I0309 13:32:06.909982 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1680395-afcc-4923-a2e8-dcdc08604cda" path="/var/lib/kubelet/pods/f1680395-afcc-4923-a2e8-dcdc08604cda/volumes" Mar 09 13:32:14 crc kubenswrapper[4723]: I0309 13:32:14.044689 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-88895"] Mar 09 13:32:14 crc kubenswrapper[4723]: I0309 13:32:14.062395 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-64d0-account-create-update-92s6n"] Mar 09 13:32:14 crc kubenswrapper[4723]: I0309 13:32:14.076291 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-64d0-account-create-update-92s6n"] Mar 09 13:32:14 crc kubenswrapper[4723]: I0309 13:32:14.085911 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-88895"] Mar 09 13:32:14 crc kubenswrapper[4723]: I0309 13:32:14.896710 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="573e22bf-ce84-4ce1-bd2d-45f52b8cd30a" path="/var/lib/kubelet/pods/573e22bf-ce84-4ce1-bd2d-45f52b8cd30a/volumes" Mar 09 13:32:14 crc kubenswrapper[4723]: I0309 13:32:14.897692 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c035e2e5-1d3a-4254-8950-b6893fc60ff3" path="/var/lib/kubelet/pods/c035e2e5-1d3a-4254-8950-b6893fc60ff3/volumes" Mar 09 13:32:15 crc kubenswrapper[4723]: I0309 13:32:15.882088 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:32:15 crc kubenswrapper[4723]: E0309 13:32:15.882715 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:32:16 crc 
kubenswrapper[4723]: I0309 13:32:16.034403 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-948f-account-create-update-2t9dn"] Mar 09 13:32:16 crc kubenswrapper[4723]: I0309 13:32:16.049074 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-4cgtr"] Mar 09 13:32:16 crc kubenswrapper[4723]: I0309 13:32:16.061677 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-948f-account-create-update-2t9dn"] Mar 09 13:32:16 crc kubenswrapper[4723]: I0309 13:32:16.070853 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-4cgtr"] Mar 09 13:32:16 crc kubenswrapper[4723]: I0309 13:32:16.897047 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36f5570d-569f-4871-9be2-bc1650c32fb8" path="/var/lib/kubelet/pods/36f5570d-569f-4871-9be2-bc1650c32fb8/volumes" Mar 09 13:32:16 crc kubenswrapper[4723]: I0309 13:32:16.898788 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f" path="/var/lib/kubelet/pods/f1248e57-4fe2-41f6-9fd9-d3980e6b3c7f/volumes" Mar 09 13:32:17 crc kubenswrapper[4723]: I0309 13:32:17.040166 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8b2e-account-create-update-jp7ld"] Mar 09 13:32:17 crc kubenswrapper[4723]: I0309 13:32:17.052461 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nlv65"] Mar 09 13:32:17 crc kubenswrapper[4723]: I0309 13:32:17.062724 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nlv65"] Mar 09 13:32:17 crc kubenswrapper[4723]: I0309 13:32:17.069700 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8b2e-account-create-update-jp7ld"] Mar 09 13:32:18 crc kubenswrapper[4723]: I0309 13:32:18.914894 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a08da534-32c5-4d52-ba3f-2bc7a8f491c4" path="/var/lib/kubelet/pods/a08da534-32c5-4d52-ba3f-2bc7a8f491c4/volumes" Mar 09 13:32:18 crc kubenswrapper[4723]: I0309 13:32:18.918264 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f82b3a10-19c5-4071-9ab5-5356f38bf35e" path="/var/lib/kubelet/pods/f82b3a10-19c5-4071-9ab5-5356f38bf35e/volumes" Mar 09 13:32:23 crc kubenswrapper[4723]: I0309 13:32:23.280525 4723 scope.go:117] "RemoveContainer" containerID="ac9edbf387a821529550a9f1e53cb67936be28fcc2d665ab4009384ed4c0bb14" Mar 09 13:32:23 crc kubenswrapper[4723]: I0309 13:32:23.320313 4723 scope.go:117] "RemoveContainer" containerID="efccf5749aadb52cbc6bcfcf7ebba24a2aca927f5ea5da8a38aa95fafb98d8b8" Mar 09 13:32:23 crc kubenswrapper[4723]: I0309 13:32:23.379795 4723 scope.go:117] "RemoveContainer" containerID="fe3139d90fc2debdc13fe9cfc9bf20e0412a7bff134036ac39658fac91cbee73" Mar 09 13:32:23 crc kubenswrapper[4723]: I0309 13:32:23.428572 4723 scope.go:117] "RemoveContainer" containerID="dad61ce22585c2f465388c8d8e66e33317bad4e22ce62cb653d79435cc799155" Mar 09 13:32:23 crc kubenswrapper[4723]: I0309 13:32:23.477222 4723 scope.go:117] "RemoveContainer" containerID="e727cb285e0827dde49c8e97f0f4423d19579378350dc70ca73915fd30d39bdc" Mar 09 13:32:23 crc kubenswrapper[4723]: I0309 13:32:23.536349 4723 scope.go:117] "RemoveContainer" containerID="5e62c40bdb2d3eed33c692b0cbc3d05d43ae4c06d524032082bf044dd17a6229" Mar 09 13:32:23 crc kubenswrapper[4723]: I0309 13:32:23.588156 4723 scope.go:117] "RemoveContainer" 
containerID="de82fa43626083c3dc28fdea40eaa87275fab33290dd422d7772b03e696b0e08" Mar 09 13:32:23 crc kubenswrapper[4723]: I0309 13:32:23.608348 4723 scope.go:117] "RemoveContainer" containerID="3089133ad3ac67c8f2b108d7f5e469a3c6333fb68492b8c0cee405c625b8fe8b" Mar 09 13:32:23 crc kubenswrapper[4723]: I0309 13:32:23.631430 4723 scope.go:117] "RemoveContainer" containerID="de912b3b49280290a9e08e997c841df747a2b308cc9462fab569c29911e39df9" Mar 09 13:32:28 crc kubenswrapper[4723]: I0309 13:32:28.882015 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:32:28 crc kubenswrapper[4723]: E0309 13:32:28.883068 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:32:42 crc kubenswrapper[4723]: I0309 13:32:42.881433 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:32:42 crc kubenswrapper[4723]: E0309 13:32:42.882552 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:32:57 crc kubenswrapper[4723]: I0309 13:32:57.047934 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pppx5"] Mar 09 13:32:57 crc kubenswrapper[4723]: I0309 13:32:57.063666 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pppx5"] Mar 09 13:32:57 crc kubenswrapper[4723]: I0309 13:32:57.881374 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:32:57 crc kubenswrapper[4723]: E0309 13:32:57.881942 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:32:58 crc kubenswrapper[4723]: I0309 13:32:58.912077 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e8ed559-11fc-4511-9258-1681da84b5cd" path="/var/lib/kubelet/pods/7e8ed559-11fc-4511-9258-1681da84b5cd/volumes" Mar 09 13:33:03 crc kubenswrapper[4723]: I0309 13:33:03.199143 4723 generic.go:334] "Generic (PLEG): container finished" podID="5c63771a-b480-44bf-98f4-68ac62c7189a" containerID="1349603b426a0e43bc77b3f0bc65326dff7d06f92712cc252ec7e979d3de4a4b" exitCode=0 Mar 09 13:33:03 crc kubenswrapper[4723]: I0309 13:33:03.199204 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" 
event={"ID":"5c63771a-b480-44bf-98f4-68ac62c7189a","Type":"ContainerDied","Data":"1349603b426a0e43bc77b3f0bc65326dff7d06f92712cc252ec7e979d3de4a4b"} Mar 09 13:33:04 crc kubenswrapper[4723]: I0309 13:33:04.628267 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" Mar 09 13:33:04 crc kubenswrapper[4723]: I0309 13:33:04.733749 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bj6s\" (UniqueName: \"kubernetes.io/projected/5c63771a-b480-44bf-98f4-68ac62c7189a-kube-api-access-5bj6s\") pod \"5c63771a-b480-44bf-98f4-68ac62c7189a\" (UID: \"5c63771a-b480-44bf-98f4-68ac62c7189a\") " Mar 09 13:33:04 crc kubenswrapper[4723]: I0309 13:33:04.734370 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c63771a-b480-44bf-98f4-68ac62c7189a-ssh-key-openstack-edpm-ipam\") pod \"5c63771a-b480-44bf-98f4-68ac62c7189a\" (UID: \"5c63771a-b480-44bf-98f4-68ac62c7189a\") " Mar 09 13:33:04 crc kubenswrapper[4723]: I0309 13:33:04.734553 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c63771a-b480-44bf-98f4-68ac62c7189a-inventory\") pod \"5c63771a-b480-44bf-98f4-68ac62c7189a\" (UID: \"5c63771a-b480-44bf-98f4-68ac62c7189a\") " Mar 09 13:33:04 crc kubenswrapper[4723]: I0309 13:33:04.739463 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c63771a-b480-44bf-98f4-68ac62c7189a-kube-api-access-5bj6s" (OuterVolumeSpecName: "kube-api-access-5bj6s") pod "5c63771a-b480-44bf-98f4-68ac62c7189a" (UID: "5c63771a-b480-44bf-98f4-68ac62c7189a"). InnerVolumeSpecName "kube-api-access-5bj6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:33:04 crc kubenswrapper[4723]: I0309 13:33:04.773211 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c63771a-b480-44bf-98f4-68ac62c7189a-inventory" (OuterVolumeSpecName: "inventory") pod "5c63771a-b480-44bf-98f4-68ac62c7189a" (UID: "5c63771a-b480-44bf-98f4-68ac62c7189a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:33:04 crc kubenswrapper[4723]: I0309 13:33:04.782103 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c63771a-b480-44bf-98f4-68ac62c7189a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5c63771a-b480-44bf-98f4-68ac62c7189a" (UID: "5c63771a-b480-44bf-98f4-68ac62c7189a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:33:04 crc kubenswrapper[4723]: I0309 13:33:04.837581 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c63771a-b480-44bf-98f4-68ac62c7189a-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:04 crc kubenswrapper[4723]: I0309 13:33:04.837616 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bj6s\" (UniqueName: \"kubernetes.io/projected/5c63771a-b480-44bf-98f4-68ac62c7189a-kube-api-access-5bj6s\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:04 crc kubenswrapper[4723]: I0309 13:33:04.837630 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c63771a-b480-44bf-98f4-68ac62c7189a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.222502 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" event={"ID":"5c63771a-b480-44bf-98f4-68ac62c7189a","Type":"ContainerDied","Data":"921a4129e0d0fcf157f913e103b3946b2751da08fd5e6777d5262fae62f0c4b3"} Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.222554 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="921a4129e0d0fcf157f913e103b3946b2751da08fd5e6777d5262fae62f0c4b3" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.222619 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p96wk" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.306348 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48"] Mar 09 13:33:05 crc kubenswrapper[4723]: E0309 13:33:05.307011 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1805eb-e0eb-46e7-936f-5d6cc76753ae" containerName="oc" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.307037 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1805eb-e0eb-46e7-936f-5d6cc76753ae" containerName="oc" Mar 09 13:33:05 crc kubenswrapper[4723]: E0309 13:33:05.307099 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c63771a-b480-44bf-98f4-68ac62c7189a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.307113 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c63771a-b480-44bf-98f4-68ac62c7189a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.307445 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1805eb-e0eb-46e7-936f-5d6cc76753ae" containerName="oc" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.307490 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c63771a-b480-44bf-98f4-68ac62c7189a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.308733 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.311723 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.312388 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.312892 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.318944 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48"] Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.325062 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.476475 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/107e9055-2e58-49af-98bd-478922559642-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7mq48\" (UID: \"107e9055-2e58-49af-98bd-478922559642\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.476906 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5ch2\" (UniqueName: \"kubernetes.io/projected/107e9055-2e58-49af-98bd-478922559642-kube-api-access-b5ch2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7mq48\" (UID: \"107e9055-2e58-49af-98bd-478922559642\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.477012 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/107e9055-2e58-49af-98bd-478922559642-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7mq48\" (UID: \"107e9055-2e58-49af-98bd-478922559642\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.579107 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5ch2\" (UniqueName: \"kubernetes.io/projected/107e9055-2e58-49af-98bd-478922559642-kube-api-access-b5ch2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7mq48\" (UID: \"107e9055-2e58-49af-98bd-478922559642\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.579170 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/107e9055-2e58-49af-98bd-478922559642-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7mq48\" (UID: \"107e9055-2e58-49af-98bd-478922559642\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.579308 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/107e9055-2e58-49af-98bd-478922559642-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7mq48\" (UID: \"107e9055-2e58-49af-98bd-478922559642\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.585568 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/107e9055-2e58-49af-98bd-478922559642-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7mq48\" (UID: \"107e9055-2e58-49af-98bd-478922559642\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.585732 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/107e9055-2e58-49af-98bd-478922559642-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7mq48\" (UID: \"107e9055-2e58-49af-98bd-478922559642\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.597933 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5ch2\" (UniqueName: \"kubernetes.io/projected/107e9055-2e58-49af-98bd-478922559642-kube-api-access-b5ch2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7mq48\" (UID: \"107e9055-2e58-49af-98bd-478922559642\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" Mar 09 13:33:05 crc kubenswrapper[4723]: I0309 13:33:05.629279 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" Mar 09 13:33:06 crc kubenswrapper[4723]: I0309 13:33:06.260436 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48"] Mar 09 13:33:07 crc kubenswrapper[4723]: I0309 13:33:07.242394 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" event={"ID":"107e9055-2e58-49af-98bd-478922559642","Type":"ContainerStarted","Data":"e333e8b40458fba7971d7f5c988b5b4632b90c40d19c82cc609102ce8dc77e37"} Mar 09 13:33:07 crc kubenswrapper[4723]: I0309 13:33:07.242945 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" event={"ID":"107e9055-2e58-49af-98bd-478922559642","Type":"ContainerStarted","Data":"9abdc05fbb80778f0229c62972fbf8c9ac9c53a160dd82956a619568ec7b4273"} Mar 09 13:33:07 crc kubenswrapper[4723]: I0309 13:33:07.262510 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" podStartSLOduration=1.8453758279999999 podStartE2EDuration="2.262483542s" podCreationTimestamp="2026-03-09 13:33:05 +0000 UTC" firstStartedPulling="2026-03-09 13:33:06.274417677 +0000 UTC m=+2060.288885217" lastFinishedPulling="2026-03-09 13:33:06.691525391 +0000 UTC m=+2060.705992931" observedRunningTime="2026-03-09 13:33:07.256698928 +0000 UTC m=+2061.271166478" watchObservedRunningTime="2026-03-09 13:33:07.262483542 +0000 UTC m=+2061.276951092" Mar 09 13:33:12 crc kubenswrapper[4723]: I0309 13:33:12.317896 4723 generic.go:334] "Generic (PLEG): container finished" podID="107e9055-2e58-49af-98bd-478922559642" 
containerID="e333e8b40458fba7971d7f5c988b5b4632b90c40d19c82cc609102ce8dc77e37" exitCode=0 Mar 09 13:33:12 crc kubenswrapper[4723]: I0309 13:33:12.317968 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" event={"ID":"107e9055-2e58-49af-98bd-478922559642","Type":"ContainerDied","Data":"e333e8b40458fba7971d7f5c988b5b4632b90c40d19c82cc609102ce8dc77e37"} Mar 09 13:33:12 crc kubenswrapper[4723]: I0309 13:33:12.880997 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:33:12 crc kubenswrapper[4723]: E0309 13:33:12.881712 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:33:13 crc kubenswrapper[4723]: I0309 13:33:13.794595 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" Mar 09 13:33:13 crc kubenswrapper[4723]: I0309 13:33:13.896668 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/107e9055-2e58-49af-98bd-478922559642-ssh-key-openstack-edpm-ipam\") pod \"107e9055-2e58-49af-98bd-478922559642\" (UID: \"107e9055-2e58-49af-98bd-478922559642\") " Mar 09 13:33:13 crc kubenswrapper[4723]: I0309 13:33:13.897261 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/107e9055-2e58-49af-98bd-478922559642-inventory\") pod \"107e9055-2e58-49af-98bd-478922559642\" (UID: \"107e9055-2e58-49af-98bd-478922559642\") " Mar 09 13:33:13 crc kubenswrapper[4723]: I0309 13:33:13.897338 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5ch2\" (UniqueName: \"kubernetes.io/projected/107e9055-2e58-49af-98bd-478922559642-kube-api-access-b5ch2\") pod \"107e9055-2e58-49af-98bd-478922559642\" (UID: \"107e9055-2e58-49af-98bd-478922559642\") " Mar 09 13:33:13 crc kubenswrapper[4723]: I0309 13:33:13.916174 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107e9055-2e58-49af-98bd-478922559642-kube-api-access-b5ch2" (OuterVolumeSpecName: "kube-api-access-b5ch2") pod "107e9055-2e58-49af-98bd-478922559642" (UID: "107e9055-2e58-49af-98bd-478922559642"). InnerVolumeSpecName "kube-api-access-b5ch2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:33:13 crc kubenswrapper[4723]: I0309 13:33:13.990085 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107e9055-2e58-49af-98bd-478922559642-inventory" (OuterVolumeSpecName: "inventory") pod "107e9055-2e58-49af-98bd-478922559642" (UID: "107e9055-2e58-49af-98bd-478922559642"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.000021 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/107e9055-2e58-49af-98bd-478922559642-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.000052 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5ch2\" (UniqueName: \"kubernetes.io/projected/107e9055-2e58-49af-98bd-478922559642-kube-api-access-b5ch2\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.023932 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107e9055-2e58-49af-98bd-478922559642-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "107e9055-2e58-49af-98bd-478922559642" (UID: "107e9055-2e58-49af-98bd-478922559642"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.103302 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/107e9055-2e58-49af-98bd-478922559642-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.342828 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" event={"ID":"107e9055-2e58-49af-98bd-478922559642","Type":"ContainerDied","Data":"9abdc05fbb80778f0229c62972fbf8c9ac9c53a160dd82956a619568ec7b4273"} Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.342909 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7mq48" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.342931 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9abdc05fbb80778f0229c62972fbf8c9ac9c53a160dd82956a619568ec7b4273" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.448895 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd"] Mar 09 13:33:14 crc kubenswrapper[4723]: E0309 13:33:14.449643 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107e9055-2e58-49af-98bd-478922559642" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.449661 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="107e9055-2e58-49af-98bd-478922559642" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.449910 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="107e9055-2e58-49af-98bd-478922559642" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.450757 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.453010 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.453465 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.453786 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.458364 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.468792 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd"] Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.515544 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qptmf\" (UniqueName: \"kubernetes.io/projected/8459660b-c673-46c5-81de-8081c3545a15-kube-api-access-qptmf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pwndd\" (UID: \"8459660b-c673-46c5-81de-8081c3545a15\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.515810 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8459660b-c673-46c5-81de-8081c3545a15-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pwndd\" (UID: \"8459660b-c673-46c5-81de-8081c3545a15\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.516235 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8459660b-c673-46c5-81de-8081c3545a15-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pwndd\" (UID: \"8459660b-c673-46c5-81de-8081c3545a15\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.619017 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qptmf\" (UniqueName: \"kubernetes.io/projected/8459660b-c673-46c5-81de-8081c3545a15-kube-api-access-qptmf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pwndd\" (UID: \"8459660b-c673-46c5-81de-8081c3545a15\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.619233 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8459660b-c673-46c5-81de-8081c3545a15-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pwndd\" (UID: \"8459660b-c673-46c5-81de-8081c3545a15\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.619392 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8459660b-c673-46c5-81de-8081c3545a15-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-pwndd\" (UID: \"8459660b-c673-46c5-81de-8081c3545a15\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.623987 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8459660b-c673-46c5-81de-8081c3545a15-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pwndd\" (UID: \"8459660b-c673-46c5-81de-8081c3545a15\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.624007 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8459660b-c673-46c5-81de-8081c3545a15-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pwndd\" (UID: \"8459660b-c673-46c5-81de-8081c3545a15\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.639100 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qptmf\" (UniqueName: \"kubernetes.io/projected/8459660b-c673-46c5-81de-8081c3545a15-kube-api-access-qptmf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pwndd\" (UID: \"8459660b-c673-46c5-81de-8081c3545a15\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" Mar 09 13:33:14 crc kubenswrapper[4723]: I0309 13:33:14.776044 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" Mar 09 13:33:15 crc kubenswrapper[4723]: I0309 13:33:15.340124 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd"] Mar 09 13:33:15 crc kubenswrapper[4723]: I0309 13:33:15.356003 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" event={"ID":"8459660b-c673-46c5-81de-8081c3545a15","Type":"ContainerStarted","Data":"7e835bb256a5100a31b786bee3e354ecdba54e76ec1e9577b5477da02f871854"} Mar 09 13:33:16 crc kubenswrapper[4723]: I0309 13:33:16.369421 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" event={"ID":"8459660b-c673-46c5-81de-8081c3545a15","Type":"ContainerStarted","Data":"dbfca45b1381b0ff8267dce9f32c7a86f5ce3de5230f2383189a2d22d8aca3a8"} Mar 09 13:33:16 crc kubenswrapper[4723]: I0309 13:33:16.400807 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" podStartSLOduration=1.929957347 podStartE2EDuration="2.400788602s" podCreationTimestamp="2026-03-09 13:33:14 +0000 UTC" firstStartedPulling="2026-03-09 13:33:15.34224285 +0000 UTC m=+2069.356710390" lastFinishedPulling="2026-03-09 13:33:15.813074105 +0000 UTC m=+2069.827541645" observedRunningTime="2026-03-09 13:33:16.384739944 +0000 UTC m=+2070.399207504" watchObservedRunningTime="2026-03-09 13:33:16.400788602 +0000 UTC m=+2070.415256142" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.223970 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5tmhd"] Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.227109 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5tmhd" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.242593 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tmhd"] Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.288946 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6lnj\" (UniqueName: \"kubernetes.io/projected/03153304-2923-4629-90c0-0f7f4f7cdac2-kube-api-access-k6lnj\") pod \"redhat-operators-5tmhd\" (UID: \"03153304-2923-4629-90c0-0f7f4f7cdac2\") " pod="openshift-marketplace/redhat-operators-5tmhd" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.288990 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03153304-2923-4629-90c0-0f7f4f7cdac2-utilities\") pod \"redhat-operators-5tmhd\" (UID: \"03153304-2923-4629-90c0-0f7f4f7cdac2\") " pod="openshift-marketplace/redhat-operators-5tmhd" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.289357 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03153304-2923-4629-90c0-0f7f4f7cdac2-catalog-content\") pod \"redhat-operators-5tmhd\" (UID: \"03153304-2923-4629-90c0-0f7f4f7cdac2\") " pod="openshift-marketplace/redhat-operators-5tmhd" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.391774 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6lnj\" (UniqueName: \"kubernetes.io/projected/03153304-2923-4629-90c0-0f7f4f7cdac2-kube-api-access-k6lnj\") pod \"redhat-operators-5tmhd\" (UID: \"03153304-2923-4629-90c0-0f7f4f7cdac2\") " pod="openshift-marketplace/redhat-operators-5tmhd" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.391830 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03153304-2923-4629-90c0-0f7f4f7cdac2-utilities\") pod \"redhat-operators-5tmhd\" (UID: \"03153304-2923-4629-90c0-0f7f4f7cdac2\") " pod="openshift-marketplace/redhat-operators-5tmhd" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.392048 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03153304-2923-4629-90c0-0f7f4f7cdac2-catalog-content\") pod \"redhat-operators-5tmhd\" (UID: \"03153304-2923-4629-90c0-0f7f4f7cdac2\") " pod="openshift-marketplace/redhat-operators-5tmhd" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.392544 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03153304-2923-4629-90c0-0f7f4f7cdac2-utilities\") pod \"redhat-operators-5tmhd\" (UID: \"03153304-2923-4629-90c0-0f7f4f7cdac2\") " pod="openshift-marketplace/redhat-operators-5tmhd" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.392690 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03153304-2923-4629-90c0-0f7f4f7cdac2-catalog-content\") pod \"redhat-operators-5tmhd\" (UID: \"03153304-2923-4629-90c0-0f7f4f7cdac2\") " pod="openshift-marketplace/redhat-operators-5tmhd" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.414580 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k6lnj\" (UniqueName: \"kubernetes.io/projected/03153304-2923-4629-90c0-0f7f4f7cdac2-kube-api-access-k6lnj\") pod \"redhat-operators-5tmhd\" (UID: \"03153304-2923-4629-90c0-0f7f4f7cdac2\") " pod="openshift-marketplace/redhat-operators-5tmhd" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.436112 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dvt9z"] Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.439424 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.458421 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dvt9z"] Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.496733 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2hzp\" (UniqueName: \"kubernetes.io/projected/50764dcb-5f95-4085-a631-da55fc65e1f1-kube-api-access-c2hzp\") pod \"certified-operators-dvt9z\" (UID: \"50764dcb-5f95-4085-a631-da55fc65e1f1\") " pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.497433 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50764dcb-5f95-4085-a631-da55fc65e1f1-catalog-content\") pod \"certified-operators-dvt9z\" (UID: \"50764dcb-5f95-4085-a631-da55fc65e1f1\") " pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.497494 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50764dcb-5f95-4085-a631-da55fc65e1f1-utilities\") pod \"certified-operators-dvt9z\" (UID: \"50764dcb-5f95-4085-a631-da55fc65e1f1\") " pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.553721 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5tmhd" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.600088 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50764dcb-5f95-4085-a631-da55fc65e1f1-catalog-content\") pod \"certified-operators-dvt9z\" (UID: \"50764dcb-5f95-4085-a631-da55fc65e1f1\") " pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.600175 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50764dcb-5f95-4085-a631-da55fc65e1f1-utilities\") pod \"certified-operators-dvt9z\" (UID: \"50764dcb-5f95-4085-a631-da55fc65e1f1\") " pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.600448 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2hzp\" (UniqueName: \"kubernetes.io/projected/50764dcb-5f95-4085-a631-da55fc65e1f1-kube-api-access-c2hzp\") pod \"certified-operators-dvt9z\" (UID: \"50764dcb-5f95-4085-a631-da55fc65e1f1\") " pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.602673 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50764dcb-5f95-4085-a631-da55fc65e1f1-catalog-content\") pod \"certified-operators-dvt9z\" (UID: \"50764dcb-5f95-4085-a631-da55fc65e1f1\") " pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.605397 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50764dcb-5f95-4085-a631-da55fc65e1f1-utilities\") pod \"certified-operators-dvt9z\" (UID: \"50764dcb-5f95-4085-a631-da55fc65e1f1\") " pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.623153 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2hzp\" (UniqueName: \"kubernetes.io/projected/50764dcb-5f95-4085-a631-da55fc65e1f1-kube-api-access-c2hzp\") pod \"certified-operators-dvt9z\" (UID: \"50764dcb-5f95-4085-a631-da55fc65e1f1\") " pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.810878 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:17 crc kubenswrapper[4723]: I0309 13:33:17.996632 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tmhd"] Mar 09 13:33:18 crc kubenswrapper[4723]: W0309 13:33:18.309851 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50764dcb_5f95_4085_a631_da55fc65e1f1.slice/crio-8123617b3159bc42d9d37f1722216e11f23059269ee052a1310a40e5930e87c8 WatchSource:0}: Error finding container 8123617b3159bc42d9d37f1722216e11f23059269ee052a1310a40e5930e87c8: Status 404 returned error can't find the container with id 8123617b3159bc42d9d37f1722216e11f23059269ee052a1310a40e5930e87c8 Mar 09 13:33:18 crc kubenswrapper[4723]: I0309 13:33:18.313805 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dvt9z"] Mar 09 13:33:18 crc kubenswrapper[4723]: I0309 13:33:18.413012 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tmhd" event={"ID":"03153304-2923-4629-90c0-0f7f4f7cdac2","Type":"ContainerStarted","Data":"614fd4844aa30cb81181f0cc95c96afc93f627e8b24869e17090abc0e8b4c720"} Mar 09 13:33:18 crc kubenswrapper[4723]: I0309 13:33:18.423902 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvt9z" event={"ID":"50764dcb-5f95-4085-a631-da55fc65e1f1","Type":"ContainerStarted","Data":"8123617b3159bc42d9d37f1722216e11f23059269ee052a1310a40e5930e87c8"} Mar 09 13:33:19 crc kubenswrapper[4723]: I0309 13:33:19.436850 4723 generic.go:334] "Generic (PLEG): container finished" podID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerID="d6b508d6a9c3cd02cfe5742505aee3248e027ae8247450ca218f53412fc135ab" exitCode=0 Mar 09 13:33:19 crc kubenswrapper[4723]: I0309 13:33:19.437183 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tmhd" event={"ID":"03153304-2923-4629-90c0-0f7f4f7cdac2","Type":"ContainerDied","Data":"d6b508d6a9c3cd02cfe5742505aee3248e027ae8247450ca218f53412fc135ab"} Mar 09 13:33:19 crc kubenswrapper[4723]: I0309 13:33:19.442461 4723 generic.go:334] "Generic (PLEG): container finished" podID="50764dcb-5f95-4085-a631-da55fc65e1f1" containerID="41e7222d104925fd8402c191f894f93910ed1a203af6e171846ef9bae670c6fe" exitCode=0 Mar 09 13:33:19 crc kubenswrapper[4723]: I0309 13:33:19.442506 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvt9z" event={"ID":"50764dcb-5f95-4085-a631-da55fc65e1f1","Type":"ContainerDied","Data":"41e7222d104925fd8402c191f894f93910ed1a203af6e171846ef9bae670c6fe"} Mar 09 13:33:20 crc kubenswrapper[4723]: I0309 13:33:20.038253 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-8808-account-create-update-jg69c"] Mar 09 13:33:20 crc kubenswrapper[4723]: I0309 13:33:20.056069 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-5jgl8"] Mar 09 13:33:20 crc kubenswrapper[4723]: I0309 13:33:20.069043 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-5jgl8"] Mar 09 13:33:20 crc kubenswrapper[4723]: I0309 13:33:20.079140 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-8808-account-create-update-jg69c"] Mar 09 13:33:20 crc kubenswrapper[4723]: I0309 13:33:20.898162 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49c81a50-9366-425a-b957-330022266b2d" path="/var/lib/kubelet/pods/49c81a50-9366-425a-b957-330022266b2d/volumes" Mar 09 13:33:20 crc kubenswrapper[4723]: I0309 13:33:20.899155 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaf73308-8e25-4470-a1d4-a2f9d59b1cd6" path="/var/lib/kubelet/pods/eaf73308-8e25-4470-a1d4-a2f9d59b1cd6/volumes" Mar 09 13:33:21 crc kubenswrapper[4723]: I0309 13:33:21.464024 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvt9z" event={"ID":"50764dcb-5f95-4085-a631-da55fc65e1f1","Type":"ContainerStarted","Data":"125fcc896aede561bbba36abf58f82cdddc91e34bf7547357748f31432501d5f"} Mar 09 13:33:23 crc kubenswrapper[4723]: I0309 13:33:23.485643 4723 generic.go:334] "Generic (PLEG): container finished" podID="50764dcb-5f95-4085-a631-da55fc65e1f1" containerID="125fcc896aede561bbba36abf58f82cdddc91e34bf7547357748f31432501d5f" exitCode=0 Mar 09 13:33:23 crc kubenswrapper[4723]: I0309 13:33:23.485719 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvt9z" event={"ID":"50764dcb-5f95-4085-a631-da55fc65e1f1","Type":"ContainerDied","Data":"125fcc896aede561bbba36abf58f82cdddc91e34bf7547357748f31432501d5f"} Mar 09 13:33:23 crc kubenswrapper[4723]: I0309 13:33:23.808541 4723 scope.go:117] "RemoveContainer" containerID="b684cb0dc2c174e2348d5059f387bd692515f3164a959818841846b2a4b8a46e" Mar 09 13:33:23 crc kubenswrapper[4723]: I0309 13:33:23.837596 4723 scope.go:117] "RemoveContainer" containerID="1da2ab881be2e547108284e873355eae3dc6a7acd1feaa32a1e59f89b44f94c5" Mar 09 13:33:23 crc kubenswrapper[4723]: I0309 13:33:23.918138 4723 scope.go:117] "RemoveContainer" containerID="364ddbc084e24139ed96474b977fea3b6f71ac47d2a5348d7ea9b387eda7ee73" Mar 09 13:33:24 crc kubenswrapper[4723]: I0309 13:33:24.036093 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mfbfj"] Mar 09 13:33:24 crc kubenswrapper[4723]: I0309 13:33:24.047182 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mfbfj"] Mar 09 13:33:24 crc kubenswrapper[4723]: I0309 13:33:24.498288 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvt9z" event={"ID":"50764dcb-5f95-4085-a631-da55fc65e1f1","Type":"ContainerStarted","Data":"a3ba7d259b490247634a52db130d36a99ef101142cd9615bbfba31bf4f31bd4c"} Mar 09 13:33:24 crc kubenswrapper[4723]: I0309 13:33:24.535808 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dvt9z" podStartSLOduration=2.86756043 podStartE2EDuration="7.535790567s" podCreationTimestamp="2026-03-09 13:33:17 +0000 UTC" firstStartedPulling="2026-03-09 13:33:19.44359356 +0000 UTC m=+2073.458061090" lastFinishedPulling="2026-03-09 13:33:24.111823687 +0000 UTC m=+2078.126291227" observedRunningTime="2026-03-09 13:33:24.519073751 +0000 UTC m=+2078.533541291" watchObservedRunningTime="2026-03-09 13:33:24.535790567 +0000 UTC m=+2078.550258107" Mar 09 13:33:24 crc kubenswrapper[4723]: I0309 13:33:24.902133 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4e81d20-6788-48e4-a38c-2dda5e6cc206" path="/var/lib/kubelet/pods/c4e81d20-6788-48e4-a38c-2dda5e6cc206/volumes" Mar 09 13:33:25 crc kubenswrapper[4723]: I0309 13:33:25.053478 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hx7mw"] Mar 09 13:33:25 crc 
kubenswrapper[4723]: I0309 13:33:25.067742 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hx7mw"] Mar 09 13:33:25 crc kubenswrapper[4723]: I0309 13:33:25.509972 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tmhd" event={"ID":"03153304-2923-4629-90c0-0f7f4f7cdac2","Type":"ContainerStarted","Data":"f820d834d542904412ad0de9681cf647da7ed2a1311518de18a10a648ba5bb30"} Mar 09 13:33:26 crc kubenswrapper[4723]: I0309 13:33:26.881105 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:33:26 crc kubenswrapper[4723]: E0309 13:33:26.881711 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:33:26 crc kubenswrapper[4723]: I0309 13:33:26.897072 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f23bcc5-001d-4f5e-a28f-ed00ab283c01" path="/var/lib/kubelet/pods/1f23bcc5-001d-4f5e-a28f-ed00ab283c01/volumes" Mar 09 13:33:27 crc kubenswrapper[4723]: I0309 13:33:27.812074 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:27 crc kubenswrapper[4723]: I0309 13:33:27.812148 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:28 crc kubenswrapper[4723]: I0309 13:33:28.858981 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dvt9z" podUID="50764dcb-5f95-4085-a631-da55fc65e1f1" containerName="registry-server" probeResult="failure" output=< Mar 09 13:33:28 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 13:33:28 crc kubenswrapper[4723]: > Mar 09 13:33:30 crc kubenswrapper[4723]: I0309 13:33:30.604174 4723 generic.go:334] "Generic (PLEG): container finished" podID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerID="f820d834d542904412ad0de9681cf647da7ed2a1311518de18a10a648ba5bb30" exitCode=0 Mar 09 13:33:30 crc kubenswrapper[4723]: I0309 13:33:30.604224 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tmhd" event={"ID":"03153304-2923-4629-90c0-0f7f4f7cdac2","Type":"ContainerDied","Data":"f820d834d542904412ad0de9681cf647da7ed2a1311518de18a10a648ba5bb30"} Mar 09 13:33:31 crc kubenswrapper[4723]: I0309 13:33:31.617528 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tmhd" event={"ID":"03153304-2923-4629-90c0-0f7f4f7cdac2","Type":"ContainerStarted","Data":"5fde97eaf7367ebfdf2c36741d7ae58d79c8c634d070107533a2a7faca6851f5"} Mar 09 13:33:31 crc kubenswrapper[4723]: I0309 13:33:31.648043 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5tmhd" podStartSLOduration=3.096789471 podStartE2EDuration="14.648023706s" podCreationTimestamp="2026-03-09 13:33:17 +0000 UTC" firstStartedPulling="2026-03-09 13:33:19.439518432 +0000 UTC m=+2073.453985962" lastFinishedPulling="2026-03-09 13:33:30.990752657 +0000 UTC m=+2085.005220197" 
observedRunningTime="2026-03-09 13:33:31.636665224 +0000 UTC m=+2085.651132784" watchObservedRunningTime="2026-03-09 13:33:31.648023706 +0000 UTC m=+2085.662491256" Mar 09 13:33:37 crc kubenswrapper[4723]: I0309 13:33:37.555539 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5tmhd" Mar 09 13:33:37 crc kubenswrapper[4723]: I0309 13:33:37.556099 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5tmhd" Mar 09 13:33:38 crc kubenswrapper[4723]: I0309 13:33:38.606370 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5tmhd" podUID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerName="registry-server" probeResult="failure" output=< Mar 09 13:33:38 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 13:33:38 crc kubenswrapper[4723]: > Mar 09 13:33:38 crc kubenswrapper[4723]: I0309 13:33:38.858661 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dvt9z" podUID="50764dcb-5f95-4085-a631-da55fc65e1f1" containerName="registry-server" probeResult="failure" output=< Mar 09 13:33:38 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 13:33:38 crc kubenswrapper[4723]: > Mar 09 13:33:41 crc kubenswrapper[4723]: I0309 13:33:41.930183 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:33:42 crc kubenswrapper[4723]: I0309 13:33:42.730266 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"62e8c3719a2b94e2de1825a84035da872a22177f9c2e679f00aeaa34e1f4ffd0"} Mar 09 13:33:47 crc kubenswrapper[4723]: I0309 13:33:47.865771 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:47 crc kubenswrapper[4723]: I0309 13:33:47.926850 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:48 crc kubenswrapper[4723]: I0309 13:33:48.423800 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dvt9z"] Mar 09 13:33:48 crc kubenswrapper[4723]: I0309 13:33:48.618331 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5tmhd" podUID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerName="registry-server" probeResult="failure" output=< Mar 09 13:33:48 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 13:33:48 crc kubenswrapper[4723]: > Mar 09 13:33:49 crc kubenswrapper[4723]: I0309 13:33:49.800692 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dvt9z" podUID="50764dcb-5f95-4085-a631-da55fc65e1f1" containerName="registry-server" containerID="cri-o://a3ba7d259b490247634a52db130d36a99ef101142cd9615bbfba31bf4f31bd4c" gracePeriod=2 Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.377100 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dvt9z" Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.542416 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50764dcb-5f95-4085-a631-da55fc65e1f1-utilities\") pod \"50764dcb-5f95-4085-a631-da55fc65e1f1\" (UID: \"50764dcb-5f95-4085-a631-da55fc65e1f1\") " Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.542532 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2hzp\" (UniqueName: \"kubernetes.io/projected/50764dcb-5f95-4085-a631-da55fc65e1f1-kube-api-access-c2hzp\") pod \"50764dcb-5f95-4085-a631-da55fc65e1f1\" (UID: \"50764dcb-5f95-4085-a631-da55fc65e1f1\") " Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.542634 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50764dcb-5f95-4085-a631-da55fc65e1f1-catalog-content\") pod \"50764dcb-5f95-4085-a631-da55fc65e1f1\" (UID: \"50764dcb-5f95-4085-a631-da55fc65e1f1\") " Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.544316 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50764dcb-5f95-4085-a631-da55fc65e1f1-utilities" (OuterVolumeSpecName: "utilities") pod "50764dcb-5f95-4085-a631-da55fc65e1f1" (UID: "50764dcb-5f95-4085-a631-da55fc65e1f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.550424 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50764dcb-5f95-4085-a631-da55fc65e1f1-kube-api-access-c2hzp" (OuterVolumeSpecName: "kube-api-access-c2hzp") pod "50764dcb-5f95-4085-a631-da55fc65e1f1" (UID: "50764dcb-5f95-4085-a631-da55fc65e1f1"). InnerVolumeSpecName "kube-api-access-c2hzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.623502 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50764dcb-5f95-4085-a631-da55fc65e1f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50764dcb-5f95-4085-a631-da55fc65e1f1" (UID: "50764dcb-5f95-4085-a631-da55fc65e1f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.645891 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50764dcb-5f95-4085-a631-da55fc65e1f1-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.647070 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2hzp\" (UniqueName: \"kubernetes.io/projected/50764dcb-5f95-4085-a631-da55fc65e1f1-kube-api-access-c2hzp\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.647100 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50764dcb-5f95-4085-a631-da55fc65e1f1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.812079 4723 generic.go:334] "Generic (PLEG): container finished" podID="50764dcb-5f95-4085-a631-da55fc65e1f1" containerID="a3ba7d259b490247634a52db130d36a99ef101142cd9615bbfba31bf4f31bd4c" exitCode=0 Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.812126 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvt9z" event={"ID":"50764dcb-5f95-4085-a631-da55fc65e1f1","Type":"ContainerDied","Data":"a3ba7d259b490247634a52db130d36a99ef101142cd9615bbfba31bf4f31bd4c"} Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.812156 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvt9z" event={"ID":"50764dcb-5f95-4085-a631-da55fc65e1f1","Type":"ContainerDied","Data":"8123617b3159bc42d9d37f1722216e11f23059269ee052a1310a40e5930e87c8"} Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.812176 4723 scope.go:117] "RemoveContainer" containerID="a3ba7d259b490247634a52db130d36a99ef101142cd9615bbfba31bf4f31bd4c" Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.812180 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dvt9z"
Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.845408 4723 scope.go:117] "RemoveContainer" containerID="125fcc896aede561bbba36abf58f82cdddc91e34bf7547357748f31432501d5f"
Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.850297 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dvt9z"]
Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.867130 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dvt9z"]
Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.873765 4723 scope.go:117] "RemoveContainer" containerID="41e7222d104925fd8402c191f894f93910ed1a203af6e171846ef9bae670c6fe"
Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.901406 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50764dcb-5f95-4085-a631-da55fc65e1f1" path="/var/lib/kubelet/pods/50764dcb-5f95-4085-a631-da55fc65e1f1/volumes"
Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.930745 4723 scope.go:117] "RemoveContainer" containerID="a3ba7d259b490247634a52db130d36a99ef101142cd9615bbfba31bf4f31bd4c"
Mar 09 13:33:50 crc kubenswrapper[4723]: E0309 13:33:50.931183 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3ba7d259b490247634a52db130d36a99ef101142cd9615bbfba31bf4f31bd4c\": container with ID starting with a3ba7d259b490247634a52db130d36a99ef101142cd9615bbfba31bf4f31bd4c not found: ID does not exist" containerID="a3ba7d259b490247634a52db130d36a99ef101142cd9615bbfba31bf4f31bd4c"
Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.931222 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3ba7d259b490247634a52db130d36a99ef101142cd9615bbfba31bf4f31bd4c"} err="failed to get container status \"a3ba7d259b490247634a52db130d36a99ef101142cd9615bbfba31bf4f31bd4c\": rpc error: code = NotFound desc = could not find container \"a3ba7d259b490247634a52db130d36a99ef101142cd9615bbfba31bf4f31bd4c\": container with ID starting with a3ba7d259b490247634a52db130d36a99ef101142cd9615bbfba31bf4f31bd4c not found: ID does not exist"
Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.931245 4723 scope.go:117] "RemoveContainer" containerID="125fcc896aede561bbba36abf58f82cdddc91e34bf7547357748f31432501d5f"
Mar 09 13:33:50 crc kubenswrapper[4723]: E0309 13:33:50.931464 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"125fcc896aede561bbba36abf58f82cdddc91e34bf7547357748f31432501d5f\": container with ID starting with 125fcc896aede561bbba36abf58f82cdddc91e34bf7547357748f31432501d5f not found: ID does not exist" containerID="125fcc896aede561bbba36abf58f82cdddc91e34bf7547357748f31432501d5f"
Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.931489 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"125fcc896aede561bbba36abf58f82cdddc91e34bf7547357748f31432501d5f"} err="failed to get container status \"125fcc896aede561bbba36abf58f82cdddc91e34bf7547357748f31432501d5f\": rpc error: code = NotFound desc = could not find container \"125fcc896aede561bbba36abf58f82cdddc91e34bf7547357748f31432501d5f\": container with ID starting with 125fcc896aede561bbba36abf58f82cdddc91e34bf7547357748f31432501d5f not found: ID does not exist"
Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.931505 4723 scope.go:117] "RemoveContainer" containerID="41e7222d104925fd8402c191f894f93910ed1a203af6e171846ef9bae670c6fe"
Mar 09 13:33:50 crc kubenswrapper[4723]: E0309 13:33:50.931746 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41e7222d104925fd8402c191f894f93910ed1a203af6e171846ef9bae670c6fe\": container with ID starting with 41e7222d104925fd8402c191f894f93910ed1a203af6e171846ef9bae670c6fe not found: ID does not exist" containerID="41e7222d104925fd8402c191f894f93910ed1a203af6e171846ef9bae670c6fe"
Mar 09 13:33:50 crc kubenswrapper[4723]: I0309 13:33:50.931766 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41e7222d104925fd8402c191f894f93910ed1a203af6e171846ef9bae670c6fe"} err="failed to get container status \"41e7222d104925fd8402c191f894f93910ed1a203af6e171846ef9bae670c6fe\": rpc error: code = NotFound desc = could not find container \"41e7222d104925fd8402c191f894f93910ed1a203af6e171846ef9bae670c6fe\": container with ID starting with 41e7222d104925fd8402c191f894f93910ed1a203af6e171846ef9bae670c6fe not found: ID does not exist"
Mar 09 13:33:52 crc kubenswrapper[4723]: I0309 13:33:52.836295 4723 generic.go:334] "Generic (PLEG): container finished" podID="8459660b-c673-46c5-81de-8081c3545a15" containerID="dbfca45b1381b0ff8267dce9f32c7a86f5ce3de5230f2383189a2d22d8aca3a8" exitCode=0
Mar 09 13:33:52 crc kubenswrapper[4723]: I0309 13:33:52.836381 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" event={"ID":"8459660b-c673-46c5-81de-8081c3545a15","Type":"ContainerDied","Data":"dbfca45b1381b0ff8267dce9f32c7a86f5ce3de5230f2383189a2d22d8aca3a8"}
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.436704 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.541163 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8459660b-c673-46c5-81de-8081c3545a15-ssh-key-openstack-edpm-ipam\") pod \"8459660b-c673-46c5-81de-8081c3545a15\" (UID: \"8459660b-c673-46c5-81de-8081c3545a15\") "
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.541347 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qptmf\" (UniqueName: \"kubernetes.io/projected/8459660b-c673-46c5-81de-8081c3545a15-kube-api-access-qptmf\") pod \"8459660b-c673-46c5-81de-8081c3545a15\" (UID: \"8459660b-c673-46c5-81de-8081c3545a15\") "
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.541427 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8459660b-c673-46c5-81de-8081c3545a15-inventory\") pod \"8459660b-c673-46c5-81de-8081c3545a15\" (UID: \"8459660b-c673-46c5-81de-8081c3545a15\") "
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.558915 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8459660b-c673-46c5-81de-8081c3545a15-kube-api-access-qptmf" (OuterVolumeSpecName: "kube-api-access-qptmf") pod "8459660b-c673-46c5-81de-8081c3545a15" (UID: "8459660b-c673-46c5-81de-8081c3545a15"). InnerVolumeSpecName "kube-api-access-qptmf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.587790 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8459660b-c673-46c5-81de-8081c3545a15-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8459660b-c673-46c5-81de-8081c3545a15" (UID: "8459660b-c673-46c5-81de-8081c3545a15"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.600250 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8459660b-c673-46c5-81de-8081c3545a15-inventory" (OuterVolumeSpecName: "inventory") pod "8459660b-c673-46c5-81de-8081c3545a15" (UID: "8459660b-c673-46c5-81de-8081c3545a15"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.645150 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8459660b-c673-46c5-81de-8081c3545a15-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.645187 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qptmf\" (UniqueName: \"kubernetes.io/projected/8459660b-c673-46c5-81de-8081c3545a15-kube-api-access-qptmf\") on node \"crc\" DevicePath \"\""
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.645203 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8459660b-c673-46c5-81de-8081c3545a15-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.857703 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd" event={"ID":"8459660b-c673-46c5-81de-8081c3545a15","Type":"ContainerDied","Data":"7e835bb256a5100a31b786bee3e354ecdba54e76ec1e9577b5477da02f871854"}
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.857753 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e835bb256a5100a31b786bee3e354ecdba54e76ec1e9577b5477da02f871854"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.857814 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pwndd"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.962039 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"]
Mar 09 13:33:54 crc kubenswrapper[4723]: E0309 13:33:54.962823 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50764dcb-5f95-4085-a631-da55fc65e1f1" containerName="extract-utilities"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.962844 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="50764dcb-5f95-4085-a631-da55fc65e1f1" containerName="extract-utilities"
Mar 09 13:33:54 crc kubenswrapper[4723]: E0309 13:33:54.962902 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50764dcb-5f95-4085-a631-da55fc65e1f1" containerName="registry-server"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.962911 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="50764dcb-5f95-4085-a631-da55fc65e1f1" containerName="registry-server"
Mar 09 13:33:54 crc kubenswrapper[4723]: E0309 13:33:54.962934 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8459660b-c673-46c5-81de-8081c3545a15" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.962945 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="8459660b-c673-46c5-81de-8081c3545a15" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:33:54 crc kubenswrapper[4723]: E0309 13:33:54.962960 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50764dcb-5f95-4085-a631-da55fc65e1f1" containerName="extract-content"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.962967 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="50764dcb-5f95-4085-a631-da55fc65e1f1" containerName="extract-content"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.963238 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="50764dcb-5f95-4085-a631-da55fc65e1f1" containerName="registry-server"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.963293 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="8459660b-c673-46c5-81de-8081c3545a15" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.964338 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.967478 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.967522 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.967492 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.967796 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 13:33:54 crc kubenswrapper[4723]: I0309 13:33:54.979763 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"]
Mar 09 13:33:55 crc kubenswrapper[4723]: I0309 13:33:55.054311 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18f88a77-e904-4382-84ac-64567a2ef585-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf\" (UID: \"18f88a77-e904-4382-84ac-64567a2ef585\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"
Mar 09 13:33:55 crc kubenswrapper[4723]: I0309 13:33:55.054537 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5ndz\" (UniqueName: \"kubernetes.io/projected/18f88a77-e904-4382-84ac-64567a2ef585-kube-api-access-m5ndz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf\" (UID: \"18f88a77-e904-4382-84ac-64567a2ef585\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"
Mar 09 13:33:55 crc kubenswrapper[4723]: I0309 13:33:55.054578 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18f88a77-e904-4382-84ac-64567a2ef585-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf\" (UID: \"18f88a77-e904-4382-84ac-64567a2ef585\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"
Mar 09 13:33:55 crc kubenswrapper[4723]: I0309 13:33:55.156308 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18f88a77-e904-4382-84ac-64567a2ef585-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf\" (UID: \"18f88a77-e904-4382-84ac-64567a2ef585\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"
Mar 09 13:33:55 crc kubenswrapper[4723]: I0309 13:33:55.156406 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5ndz\" (UniqueName: \"kubernetes.io/projected/18f88a77-e904-4382-84ac-64567a2ef585-kube-api-access-m5ndz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf\" (UID: \"18f88a77-e904-4382-84ac-64567a2ef585\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"
Mar 09 13:33:55 crc kubenswrapper[4723]: I0309 13:33:55.156442 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18f88a77-e904-4382-84ac-64567a2ef585-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf\" (UID: \"18f88a77-e904-4382-84ac-64567a2ef585\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"
Mar 09 13:33:55 crc kubenswrapper[4723]: I0309 13:33:55.161681 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18f88a77-e904-4382-84ac-64567a2ef585-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf\" (UID: \"18f88a77-e904-4382-84ac-64567a2ef585\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"
Mar 09 13:33:55 crc kubenswrapper[4723]: I0309 13:33:55.165737 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18f88a77-e904-4382-84ac-64567a2ef585-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf\" (UID: \"18f88a77-e904-4382-84ac-64567a2ef585\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"
Mar 09 13:33:55 crc kubenswrapper[4723]: I0309 13:33:55.172522 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5ndz\" (UniqueName: \"kubernetes.io/projected/18f88a77-e904-4382-84ac-64567a2ef585-kube-api-access-m5ndz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf\" (UID: \"18f88a77-e904-4382-84ac-64567a2ef585\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"
Mar 09 13:33:55 crc kubenswrapper[4723]: I0309 13:33:55.296243 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"
Mar 09 13:33:55 crc kubenswrapper[4723]: W0309 13:33:55.874323 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18f88a77_e904_4382_84ac_64567a2ef585.slice/crio-4b79cbaa392739e9c63018b669d2c47d51a62104d7221e173d8dcab025199e1a WatchSource:0}: Error finding container 4b79cbaa392739e9c63018b669d2c47d51a62104d7221e173d8dcab025199e1a: Status 404 returned error can't find the container with id 4b79cbaa392739e9c63018b669d2c47d51a62104d7221e173d8dcab025199e1a
Mar 09 13:33:55 crc kubenswrapper[4723]: I0309 13:33:55.876315 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"]
Mar 09 13:33:56 crc kubenswrapper[4723]: I0309 13:33:56.878533 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf" event={"ID":"18f88a77-e904-4382-84ac-64567a2ef585","Type":"ContainerStarted","Data":"4b79cbaa392739e9c63018b669d2c47d51a62104d7221e173d8dcab025199e1a"}
Mar 09 13:33:57 crc kubenswrapper[4723]: I0309 13:33:57.917129 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf" event={"ID":"18f88a77-e904-4382-84ac-64567a2ef585","Type":"ContainerStarted","Data":"4a4a01aa927bf7a98bd6f524f0541cb970b61af7e4dba69586e65981f9d36242"}
Mar 09 13:33:57 crc kubenswrapper[4723]: I0309 13:33:57.937800 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf" podStartSLOduration=3.207515327 podStartE2EDuration="3.937774271s" podCreationTimestamp="2026-03-09 13:33:54 +0000 UTC" firstStartedPulling="2026-03-09 13:33:55.877812275 +0000 UTC m=+2109.892279815" lastFinishedPulling="2026-03-09 13:33:56.608071149 +0000 UTC m=+2110.622538759" observedRunningTime="2026-03-09 13:33:57.932978973 +0000 UTC m=+2111.947446543" watchObservedRunningTime="2026-03-09 13:33:57.937774271 +0000 UTC m=+2111.952241821"
Mar 09 13:33:58 crc kubenswrapper[4723]: I0309 13:33:58.609455 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5tmhd" podUID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerName="registry-server" probeResult="failure" output=<
Mar 09 13:33:58 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s
Mar 09 13:33:58 crc kubenswrapper[4723]: >
Mar 09 13:34:00 crc kubenswrapper[4723]: I0309 13:34:00.137298 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551054-wwv7q"]
Mar 09 13:34:00 crc kubenswrapper[4723]: I0309 13:34:00.139597 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-wwv7q"
Mar 09 13:34:00 crc kubenswrapper[4723]: I0309 13:34:00.141813 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:34:00 crc kubenswrapper[4723]: I0309 13:34:00.142247 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:34:00 crc kubenswrapper[4723]: I0309 13:34:00.142735 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 13:34:00 crc kubenswrapper[4723]: I0309 13:34:00.154811 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-wwv7q"]
Mar 09 13:34:00 crc kubenswrapper[4723]: I0309 13:34:00.274642 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv422\" (UniqueName: \"kubernetes.io/projected/b695feec-4723-4e51-9092-a1d537d90fee-kube-api-access-rv422\") pod \"auto-csr-approver-29551054-wwv7q\" (UID: \"b695feec-4723-4e51-9092-a1d537d90fee\") " pod="openshift-infra/auto-csr-approver-29551054-wwv7q"
Mar 09 13:34:00 crc kubenswrapper[4723]: I0309 13:34:00.376672 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv422\" (UniqueName: \"kubernetes.io/projected/b695feec-4723-4e51-9092-a1d537d90fee-kube-api-access-rv422\") pod \"auto-csr-approver-29551054-wwv7q\" (UID: \"b695feec-4723-4e51-9092-a1d537d90fee\") " pod="openshift-infra/auto-csr-approver-29551054-wwv7q"
Mar 09 13:34:00 crc kubenswrapper[4723]: I0309 13:34:00.410634 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv422\" (UniqueName: \"kubernetes.io/projected/b695feec-4723-4e51-9092-a1d537d90fee-kube-api-access-rv422\") pod \"auto-csr-approver-29551054-wwv7q\" (UID: \"b695feec-4723-4e51-9092-a1d537d90fee\") " pod="openshift-infra/auto-csr-approver-29551054-wwv7q"
Mar 09 13:34:00 crc kubenswrapper[4723]: I0309 13:34:00.465537 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-wwv7q"
Mar 09 13:34:00 crc kubenswrapper[4723]: I0309 13:34:00.977530 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-wwv7q"]
Mar 09 13:34:00 crc kubenswrapper[4723]: W0309 13:34:00.983299 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb695feec_4723_4e51_9092_a1d537d90fee.slice/crio-c1d4608c662031194b7940218b8133f87523dafe5a35357bc9f50d9e2ace24b7 WatchSource:0}: Error finding container c1d4608c662031194b7940218b8133f87523dafe5a35357bc9f50d9e2ace24b7: Status 404 returned error can't find the container with id c1d4608c662031194b7940218b8133f87523dafe5a35357bc9f50d9e2ace24b7
Mar 09 13:34:01 crc kubenswrapper[4723]: I0309 13:34:01.967404 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551054-wwv7q" event={"ID":"b695feec-4723-4e51-9092-a1d537d90fee","Type":"ContainerStarted","Data":"c1d4608c662031194b7940218b8133f87523dafe5a35357bc9f50d9e2ace24b7"}
Mar 09 13:34:02 crc kubenswrapper[4723]: I0309 13:34:02.982236 4723 generic.go:334] "Generic (PLEG): container finished" podID="b695feec-4723-4e51-9092-a1d537d90fee" containerID="fcd4dc04f85767ac905cecfb2a4819cd805845946801d1bdbb7ffe035c726385" exitCode=0
Mar 09 13:34:02 crc kubenswrapper[4723]: I0309 13:34:02.982295 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551054-wwv7q" event={"ID":"b695feec-4723-4e51-9092-a1d537d90fee","Type":"ContainerDied","Data":"fcd4dc04f85767ac905cecfb2a4819cd805845946801d1bdbb7ffe035c726385"}
Mar 09 13:34:04 crc kubenswrapper[4723]: I0309 13:34:04.442922 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-wwv7q"
Mar 09 13:34:04 crc kubenswrapper[4723]: I0309 13:34:04.486541 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv422\" (UniqueName: \"kubernetes.io/projected/b695feec-4723-4e51-9092-a1d537d90fee-kube-api-access-rv422\") pod \"b695feec-4723-4e51-9092-a1d537d90fee\" (UID: \"b695feec-4723-4e51-9092-a1d537d90fee\") "
Mar 09 13:34:04 crc kubenswrapper[4723]: I0309 13:34:04.495197 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b695feec-4723-4e51-9092-a1d537d90fee-kube-api-access-rv422" (OuterVolumeSpecName: "kube-api-access-rv422") pod "b695feec-4723-4e51-9092-a1d537d90fee" (UID: "b695feec-4723-4e51-9092-a1d537d90fee"). InnerVolumeSpecName "kube-api-access-rv422". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:34:04 crc kubenswrapper[4723]: I0309 13:34:04.590952 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv422\" (UniqueName: \"kubernetes.io/projected/b695feec-4723-4e51-9092-a1d537d90fee-kube-api-access-rv422\") on node \"crc\" DevicePath \"\""
Mar 09 13:34:05 crc kubenswrapper[4723]: I0309 13:34:05.008346 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551054-wwv7q" event={"ID":"b695feec-4723-4e51-9092-a1d537d90fee","Type":"ContainerDied","Data":"c1d4608c662031194b7940218b8133f87523dafe5a35357bc9f50d9e2ace24b7"}
Mar 09 13:34:05 crc kubenswrapper[4723]: I0309 13:34:05.008411 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1d4608c662031194b7940218b8133f87523dafe5a35357bc9f50d9e2ace24b7"
Mar 09 13:34:05 crc kubenswrapper[4723]: I0309 13:34:05.008430 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-wwv7q"
Mar 09 13:34:05 crc kubenswrapper[4723]: I0309 13:34:05.520480 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-g64gj"]
Mar 09 13:34:05 crc kubenswrapper[4723]: I0309 13:34:05.530852 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-g64gj"]
Mar 09 13:34:06 crc kubenswrapper[4723]: I0309 13:34:06.896144 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9702a066-3717-4a9a-8777-372960604154" path="/var/lib/kubelet/pods/9702a066-3717-4a9a-8777-372960604154/volumes"
Mar 09 13:34:07 crc kubenswrapper[4723]: I0309 13:34:07.608642 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5tmhd"
Mar 09 13:34:07 crc kubenswrapper[4723]: I0309 13:34:07.670210 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5tmhd"
Mar 09 13:34:07 crc kubenswrapper[4723]: I0309 13:34:07.857208 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tmhd"]
Mar 09 13:34:09 crc kubenswrapper[4723]: I0309 13:34:09.050503 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5tmhd" podUID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerName="registry-server" containerID="cri-o://5fde97eaf7367ebfdf2c36741d7ae58d79c8c634d070107533a2a7faca6851f5" gracePeriod=2
Mar 09 13:34:09 crc kubenswrapper[4723]: I0309 13:34:09.635733 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tmhd"
Mar 09 13:34:09 crc kubenswrapper[4723]: I0309 13:34:09.717563 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03153304-2923-4629-90c0-0f7f4f7cdac2-utilities\") pod \"03153304-2923-4629-90c0-0f7f4f7cdac2\" (UID: \"03153304-2923-4629-90c0-0f7f4f7cdac2\") "
Mar 09 13:34:09 crc kubenswrapper[4723]: I0309 13:34:09.717796 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6lnj\" (UniqueName: \"kubernetes.io/projected/03153304-2923-4629-90c0-0f7f4f7cdac2-kube-api-access-k6lnj\") pod \"03153304-2923-4629-90c0-0f7f4f7cdac2\" (UID: \"03153304-2923-4629-90c0-0f7f4f7cdac2\") "
Mar 09 13:34:09 crc kubenswrapper[4723]: I0309 13:34:09.718048 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03153304-2923-4629-90c0-0f7f4f7cdac2-catalog-content\") pod \"03153304-2923-4629-90c0-0f7f4f7cdac2\" (UID: \"03153304-2923-4629-90c0-0f7f4f7cdac2\") "
Mar 09 13:34:09 crc kubenswrapper[4723]: I0309 13:34:09.718699 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03153304-2923-4629-90c0-0f7f4f7cdac2-utilities" (OuterVolumeSpecName: "utilities") pod "03153304-2923-4629-90c0-0f7f4f7cdac2" (UID: "03153304-2923-4629-90c0-0f7f4f7cdac2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:34:09 crc kubenswrapper[4723]: I0309 13:34:09.741133 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03153304-2923-4629-90c0-0f7f4f7cdac2-kube-api-access-k6lnj" (OuterVolumeSpecName: "kube-api-access-k6lnj") pod "03153304-2923-4629-90c0-0f7f4f7cdac2" (UID: "03153304-2923-4629-90c0-0f7f4f7cdac2"). InnerVolumeSpecName "kube-api-access-k6lnj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:34:09 crc kubenswrapper[4723]: I0309 13:34:09.821333 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6lnj\" (UniqueName: \"kubernetes.io/projected/03153304-2923-4629-90c0-0f7f4f7cdac2-kube-api-access-k6lnj\") on node \"crc\" DevicePath \"\""
Mar 09 13:34:09 crc kubenswrapper[4723]: I0309 13:34:09.821592 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03153304-2923-4629-90c0-0f7f4f7cdac2-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:34:09 crc kubenswrapper[4723]: I0309 13:34:09.842376 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03153304-2923-4629-90c0-0f7f4f7cdac2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03153304-2923-4629-90c0-0f7f4f7cdac2" (UID: "03153304-2923-4629-90c0-0f7f4f7cdac2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:34:09 crc kubenswrapper[4723]: I0309 13:34:09.925712 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03153304-2923-4629-90c0-0f7f4f7cdac2-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.035778 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-schhb"]
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.048088 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-schhb"]
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.062717 4723 generic.go:334] "Generic (PLEG): container finished" podID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerID="5fde97eaf7367ebfdf2c36741d7ae58d79c8c634d070107533a2a7faca6851f5" exitCode=0
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.062766 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tmhd" event={"ID":"03153304-2923-4629-90c0-0f7f4f7cdac2","Type":"ContainerDied","Data":"5fde97eaf7367ebfdf2c36741d7ae58d79c8c634d070107533a2a7faca6851f5"}
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.062797 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tmhd" event={"ID":"03153304-2923-4629-90c0-0f7f4f7cdac2","Type":"ContainerDied","Data":"614fd4844aa30cb81181f0cc95c96afc93f627e8b24869e17090abc0e8b4c720"}
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.062820 4723 scope.go:117] "RemoveContainer" containerID="5fde97eaf7367ebfdf2c36741d7ae58d79c8c634d070107533a2a7faca6851f5"
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.062992 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tmhd"
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.102107 4723 scope.go:117] "RemoveContainer" containerID="f820d834d542904412ad0de9681cf647da7ed2a1311518de18a10a648ba5bb30"
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.106734 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tmhd"]
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.118518 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5tmhd"]
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.138877 4723 scope.go:117] "RemoveContainer" containerID="d6b508d6a9c3cd02cfe5742505aee3248e027ae8247450ca218f53412fc135ab"
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.179470 4723 scope.go:117] "RemoveContainer" containerID="5fde97eaf7367ebfdf2c36741d7ae58d79c8c634d070107533a2a7faca6851f5"
Mar 09 13:34:10 crc kubenswrapper[4723]: E0309 13:34:10.180494 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fde97eaf7367ebfdf2c36741d7ae58d79c8c634d070107533a2a7faca6851f5\": container with ID starting with 5fde97eaf7367ebfdf2c36741d7ae58d79c8c634d070107533a2a7faca6851f5 not found: ID does not exist" containerID="5fde97eaf7367ebfdf2c36741d7ae58d79c8c634d070107533a2a7faca6851f5"
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.180533 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fde97eaf7367ebfdf2c36741d7ae58d79c8c634d070107533a2a7faca6851f5"} err="failed to get container status \"5fde97eaf7367ebfdf2c36741d7ae58d79c8c634d070107533a2a7faca6851f5\": rpc error: code = NotFound desc = could not find container \"5fde97eaf7367ebfdf2c36741d7ae58d79c8c634d070107533a2a7faca6851f5\": container with ID starting with 5fde97eaf7367ebfdf2c36741d7ae58d79c8c634d070107533a2a7faca6851f5 not found: ID does not exist"
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.180592 4723 scope.go:117] "RemoveContainer" containerID="f820d834d542904412ad0de9681cf647da7ed2a1311518de18a10a648ba5bb30"
Mar 09 13:34:10 crc kubenswrapper[4723]: E0309 13:34:10.180946 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f820d834d542904412ad0de9681cf647da7ed2a1311518de18a10a648ba5bb30\": container with ID starting with f820d834d542904412ad0de9681cf647da7ed2a1311518de18a10a648ba5bb30 not found: ID does not exist" containerID="f820d834d542904412ad0de9681cf647da7ed2a1311518de18a10a648ba5bb30"
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.181000 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f820d834d542904412ad0de9681cf647da7ed2a1311518de18a10a648ba5bb30"} err="failed to get container status \"f820d834d542904412ad0de9681cf647da7ed2a1311518de18a10a648ba5bb30\": rpc error: code = NotFound desc = could not find container \"f820d834d542904412ad0de9681cf647da7ed2a1311518de18a10a648ba5bb30\": container with ID starting with f820d834d542904412ad0de9681cf647da7ed2a1311518de18a10a648ba5bb30 not found: ID does not exist"
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.181033 4723 scope.go:117] "RemoveContainer" containerID="d6b508d6a9c3cd02cfe5742505aee3248e027ae8247450ca218f53412fc135ab"
Mar 09 13:34:10 crc kubenswrapper[4723]: E0309 13:34:10.181296 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b508d6a9c3cd02cfe5742505aee3248e027ae8247450ca218f53412fc135ab\": container with ID starting with d6b508d6a9c3cd02cfe5742505aee3248e027ae8247450ca218f53412fc135ab not found: ID does not exist" containerID="d6b508d6a9c3cd02cfe5742505aee3248e027ae8247450ca218f53412fc135ab"
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.181321 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b508d6a9c3cd02cfe5742505aee3248e027ae8247450ca218f53412fc135ab"} err="failed to get container status \"d6b508d6a9c3cd02cfe5742505aee3248e027ae8247450ca218f53412fc135ab\": rpc error: code = NotFound desc = could not find container \"d6b508d6a9c3cd02cfe5742505aee3248e027ae8247450ca218f53412fc135ab\": container with ID starting with d6b508d6a9c3cd02cfe5742505aee3248e027ae8247450ca218f53412fc135ab not found: ID does not exist"
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.894237 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03153304-2923-4629-90c0-0f7f4f7cdac2" path="/var/lib/kubelet/pods/03153304-2923-4629-90c0-0f7f4f7cdac2/volumes"
Mar 09 13:34:10 crc kubenswrapper[4723]: I0309 13:34:10.895697 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df4024d7-4807-416a-883e-b36bdc7945b7" path="/var/lib/kubelet/pods/df4024d7-4807-416a-883e-b36bdc7945b7/volumes"
Mar 09 13:34:24 crc kubenswrapper[4723]: I0309 13:34:24.045174 4723 scope.go:117] "RemoveContainer" containerID="56eda3a218baef03a2d03f600372743aaa19368443fb06818b6fa4bb26fd9645"
Mar 09 13:34:24 crc kubenswrapper[4723]: I0309 13:34:24.082275 4723 scope.go:117] "RemoveContainer" containerID="2f856257f42c32ae4211493d61e0c0341c2a6e862631446bed630983f8fdcfbb"
Mar 09 13:34:24 crc kubenswrapper[4723]: I0309 13:34:24.155920 4723 scope.go:117] "RemoveContainer" containerID="929cf1f651d0585d8304f624021f967091871f4fe09a0b52b653fdded40e9086"
Mar 09 13:34:24 crc kubenswrapper[4723]: I0309 13:34:24.224970 4723 scope.go:117] "RemoveContainer" containerID="7a117ee67ee75bb4a7bae9848fb8f37e812495e259379c7b120d3f64f1648ac0"
Mar 09 13:34:45 crc kubenswrapper[4723]: I0309 13:34:45.466270 4723 generic.go:334] "Generic (PLEG): container finished" podID="18f88a77-e904-4382-84ac-64567a2ef585" containerID="4a4a01aa927bf7a98bd6f524f0541cb970b61af7e4dba69586e65981f9d36242" exitCode=0
Mar 09 13:34:45 crc kubenswrapper[4723]: I0309 13:34:45.466364 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf" event={"ID":"18f88a77-e904-4382-84ac-64567a2ef585","Type":"ContainerDied","Data":"4a4a01aa927bf7a98bd6f524f0541cb970b61af7e4dba69586e65981f9d36242"}
Mar 09 13:34:46 crc kubenswrapper[4723]: I0309 13:34:46.940946 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.081084 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18f88a77-e904-4382-84ac-64567a2ef585-inventory\") pod \"18f88a77-e904-4382-84ac-64567a2ef585\" (UID: \"18f88a77-e904-4382-84ac-64567a2ef585\") "
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.081135 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18f88a77-e904-4382-84ac-64567a2ef585-ssh-key-openstack-edpm-ipam\") pod \"18f88a77-e904-4382-84ac-64567a2ef585\" (UID: \"18f88a77-e904-4382-84ac-64567a2ef585\") "
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.081418 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5ndz\" (UniqueName: \"kubernetes.io/projected/18f88a77-e904-4382-84ac-64567a2ef585-kube-api-access-m5ndz\") pod \"18f88a77-e904-4382-84ac-64567a2ef585\" (UID: \"18f88a77-e904-4382-84ac-64567a2ef585\") "
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.088980 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18f88a77-e904-4382-84ac-64567a2ef585-kube-api-access-m5ndz" (OuterVolumeSpecName: "kube-api-access-m5ndz") pod "18f88a77-e904-4382-84ac-64567a2ef585" (UID: "18f88a77-e904-4382-84ac-64567a2ef585"). InnerVolumeSpecName "kube-api-access-m5ndz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.123514 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f88a77-e904-4382-84ac-64567a2ef585-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "18f88a77-e904-4382-84ac-64567a2ef585" (UID: "18f88a77-e904-4382-84ac-64567a2ef585"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.128283 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f88a77-e904-4382-84ac-64567a2ef585-inventory" (OuterVolumeSpecName: "inventory") pod "18f88a77-e904-4382-84ac-64567a2ef585" (UID: "18f88a77-e904-4382-84ac-64567a2ef585"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.184119 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5ndz\" (UniqueName: \"kubernetes.io/projected/18f88a77-e904-4382-84ac-64567a2ef585-kube-api-access-m5ndz\") on node \"crc\" DevicePath \"\""
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.184161 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18f88a77-e904-4382-84ac-64567a2ef585-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.184175 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18f88a77-e904-4382-84ac-64567a2ef585-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.489019 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf" event={"ID":"18f88a77-e904-4382-84ac-64567a2ef585","Type":"ContainerDied","Data":"4b79cbaa392739e9c63018b669d2c47d51a62104d7221e173d8dcab025199e1a"}
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.489071 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b79cbaa392739e9c63018b669d2c47d51a62104d7221e173d8dcab025199e1a"
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.489082 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf"
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.623778 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r6qpm"]
Mar 09 13:34:47 crc kubenswrapper[4723]: E0309 13:34:47.624466 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f88a77-e904-4382-84ac-64567a2ef585" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.624540 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f88a77-e904-4382-84ac-64567a2ef585" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:34:47 crc kubenswrapper[4723]: E0309 13:34:47.624601 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerName="extract-content"
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.624661 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerName="extract-content"
Mar 09 13:34:47 crc kubenswrapper[4723]: E0309 13:34:47.624712 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerName="registry-server"
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.624759 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerName="registry-server"
Mar 09 13:34:47 crc kubenswrapper[4723]: E0309 13:34:47.624819 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b695feec-4723-4e51-9092-a1d537d90fee" containerName="oc"
Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.624889 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="b695feec-4723-4e51-9092-a1d537d90fee" containerName="oc"
Mar 09 13:34:47 crc kubenswrapper[4723]: E0309 13:34:47.624953 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerName="extract-utilities"
"RemoveStaleState: removing container" podUID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerName="extract-utilities" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.625012 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerName="extract-utilities" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.625277 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="b695feec-4723-4e51-9092-a1d537d90fee" containerName="oc" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.625348 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f88a77-e904-4382-84ac-64567a2ef585" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.625405 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="03153304-2923-4629-90c0-0f7f4f7cdac2" containerName="registry-server" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.626308 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.628947 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.629240 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.629924 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.631234 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.643132 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r6qpm"] Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.696416 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r6qpm\" (UID: \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\") " pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.696680 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r6qpm\" (UID: \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\") " pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.696763 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n66nj\" (UniqueName: \"kubernetes.io/projected/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-kube-api-access-n66nj\") pod \"ssh-known-hosts-edpm-deployment-r6qpm\" (UID: \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\") " pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.799362 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r6qpm\" (UID: \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\") " pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.799519 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r6qpm\" (UID: \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\") " pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.799565 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n66nj\" (UniqueName: \"kubernetes.io/projected/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-kube-api-access-n66nj\") pod \"ssh-known-hosts-edpm-deployment-r6qpm\" (UID: \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\") " pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.806615 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r6qpm\" (UID: \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\") " pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.806676 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r6qpm\" (UID: \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\") " pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" Mar 09 13:34:47 crc kubenswrapper[4723]: I0309 13:34:47.820785 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n66nj\" (UniqueName: \"kubernetes.io/projected/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-kube-api-access-n66nj\") pod \"ssh-known-hosts-edpm-deployment-r6qpm\" (UID: \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\") " pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" Mar 09 13:34:48 crc kubenswrapper[4723]: I0309 13:34:48.000342 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" Mar 09 13:34:48 crc kubenswrapper[4723]: I0309 13:34:48.593211 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r6qpm"] Mar 09 13:34:49 crc kubenswrapper[4723]: I0309 13:34:49.520208 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" event={"ID":"ac5a7be9-9a6b-4595-9cd9-ddec8641c533","Type":"ContainerStarted","Data":"e22c25fb25b40a2916f6f148751ba0133e23b1adcefc001510228c5b055b8a61"} Mar 09 13:34:50 crc kubenswrapper[4723]: I0309 13:34:50.532945 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" event={"ID":"ac5a7be9-9a6b-4595-9cd9-ddec8641c533","Type":"ContainerStarted","Data":"03384deab6f63072c1d78855b0d714c5659143d3a7edffec3972db0824c100a0"} Mar 09 13:34:56 crc kubenswrapper[4723]: I0309 13:34:56.590440 4723 generic.go:334] "Generic (PLEG): container finished" podID="ac5a7be9-9a6b-4595-9cd9-ddec8641c533" containerID="03384deab6f63072c1d78855b0d714c5659143d3a7edffec3972db0824c100a0" exitCode=0 Mar 09 13:34:56 crc kubenswrapper[4723]: I0309 13:34:56.590534 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" event={"ID":"ac5a7be9-9a6b-4595-9cd9-ddec8641c533","Type":"ContainerDied","Data":"03384deab6f63072c1d78855b0d714c5659143d3a7edffec3972db0824c100a0"} Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.169822 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.258733 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n66nj\" (UniqueName: \"kubernetes.io/projected/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-kube-api-access-n66nj\") pod \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\" (UID: \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\") " Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.258874 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-inventory-0\") pod \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\" (UID: \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\") " Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.258959 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-ssh-key-openstack-edpm-ipam\") pod \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\" (UID: \"ac5a7be9-9a6b-4595-9cd9-ddec8641c533\") " Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.272092 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-kube-api-access-n66nj" (OuterVolumeSpecName: "kube-api-access-n66nj") pod "ac5a7be9-9a6b-4595-9cd9-ddec8641c533" (UID: "ac5a7be9-9a6b-4595-9cd9-ddec8641c533"). InnerVolumeSpecName "kube-api-access-n66nj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.344424 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ac5a7be9-9a6b-4595-9cd9-ddec8641c533" (UID: "ac5a7be9-9a6b-4595-9cd9-ddec8641c533"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.358086 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "ac5a7be9-9a6b-4595-9cd9-ddec8641c533" (UID: "ac5a7be9-9a6b-4595-9cd9-ddec8641c533"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.366245 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n66nj\" (UniqueName: \"kubernetes.io/projected/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-kube-api-access-n66nj\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.366276 4723 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.366288 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac5a7be9-9a6b-4595-9cd9-ddec8641c533-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.627539 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" event={"ID":"ac5a7be9-9a6b-4595-9cd9-ddec8641c533","Type":"ContainerDied","Data":"e22c25fb25b40a2916f6f148751ba0133e23b1adcefc001510228c5b055b8a61"} Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.627581 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e22c25fb25b40a2916f6f148751ba0133e23b1adcefc001510228c5b055b8a61" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.627639 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r6qpm" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.674974 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg"] Mar 09 13:34:58 crc kubenswrapper[4723]: E0309 13:34:58.675454 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5a7be9-9a6b-4595-9cd9-ddec8641c533" containerName="ssh-known-hosts-edpm-deployment" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.675478 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5a7be9-9a6b-4595-9cd9-ddec8641c533" containerName="ssh-known-hosts-edpm-deployment" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.675767 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5a7be9-9a6b-4595-9cd9-ddec8641c533" containerName="ssh-known-hosts-edpm-deployment" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.676689 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.679190 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.679426 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.679492 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.681381 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.692270 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg"] Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.776592 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkb75\" (UniqueName: \"kubernetes.io/projected/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-kube-api-access-nkb75\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tjglg\" (UID: \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.776723 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tjglg\" (UID: \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.776755 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tjglg\" (UID: \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.879401 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tjglg\" (UID: \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.879450 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tjglg\" (UID: \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.879620 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkb75\" (UniqueName: \"kubernetes.io/projected/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-kube-api-access-nkb75\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-tjglg\" (UID: \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.883337 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tjglg\" (UID: \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.884077 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tjglg\" (UID: \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" Mar 09 13:34:58 crc kubenswrapper[4723]: I0309 13:34:58.898974 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkb75\" (UniqueName: \"kubernetes.io/projected/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-kube-api-access-nkb75\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tjglg\" (UID: \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" Mar 09 13:34:59 crc kubenswrapper[4723]: I0309 13:34:59.016382 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" Mar 09 13:34:59 crc kubenswrapper[4723]: I0309 13:34:59.625465 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg"] Mar 09 13:34:59 crc kubenswrapper[4723]: I0309 13:34:59.634351 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:34:59 crc kubenswrapper[4723]: I0309 13:34:59.640557 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" event={"ID":"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8","Type":"ContainerStarted","Data":"f1b6e25e96a026fac096e674ac23dea4490c468ff89eaaf1ba3532efabc376ca"} Mar 09 13:35:00 crc kubenswrapper[4723]: I0309 13:35:00.653578 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" event={"ID":"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8","Type":"ContainerStarted","Data":"8468a4415f8bf06be6bc630e61db7def1ddf42ee0921663264b48f5c7cb18625"} Mar 09 13:35:00 crc kubenswrapper[4723]: I0309 13:35:00.675896 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" podStartSLOduration=2.145034816 podStartE2EDuration="2.675876735s" podCreationTimestamp="2026-03-09 13:34:58 +0000 UTC" firstStartedPulling="2026-03-09 13:34:59.634086017 +0000 UTC m=+2173.648553557" lastFinishedPulling="2026-03-09 13:35:00.164927936 +0000 UTC m=+2174.179395476" observedRunningTime="2026-03-09 13:35:00.668819637 +0000 UTC m=+2174.683287187" watchObservedRunningTime="2026-03-09 13:35:00.675876735 +0000 UTC m=+2174.690344285" Mar 09 13:35:07 crc kubenswrapper[4723]: I0309 13:35:07.781294 4723 generic.go:334] "Generic (PLEG): container finished" podID="59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8" 
containerID="8468a4415f8bf06be6bc630e61db7def1ddf42ee0921663264b48f5c7cb18625" exitCode=0 Mar 09 13:35:07 crc kubenswrapper[4723]: I0309 13:35:07.781379 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" event={"ID":"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8","Type":"ContainerDied","Data":"8468a4415f8bf06be6bc630e61db7def1ddf42ee0921663264b48f5c7cb18625"} Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.349244 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.441755 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkb75\" (UniqueName: \"kubernetes.io/projected/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-kube-api-access-nkb75\") pod \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\" (UID: \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\") " Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.441970 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-inventory\") pod \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\" (UID: \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\") " Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.442182 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-ssh-key-openstack-edpm-ipam\") pod \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\" (UID: \"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8\") " Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.452020 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-kube-api-access-nkb75" (OuterVolumeSpecName: "kube-api-access-nkb75") pod "59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8" (UID: "59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8"). InnerVolumeSpecName "kube-api-access-nkb75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.494746 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-inventory" (OuterVolumeSpecName: "inventory") pod "59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8" (UID: "59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.540067 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8" (UID: "59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.546264 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.546558 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkb75\" (UniqueName: \"kubernetes.io/projected/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-kube-api-access-nkb75\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.546665 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.806681 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" event={"ID":"59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8","Type":"ContainerDied","Data":"f1b6e25e96a026fac096e674ac23dea4490c468ff89eaaf1ba3532efabc376ca"} Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.807149 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1b6e25e96a026fac096e674ac23dea4490c468ff89eaaf1ba3532efabc376ca" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.806760 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tjglg" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.944447 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d"] Mar 09 13:35:09 crc kubenswrapper[4723]: E0309 13:35:09.944991 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.945008 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.945221 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.946124 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.947993 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.948631 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.948696 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.948906 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:35:09 crc kubenswrapper[4723]: I0309 13:35:09.971945 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d"] Mar 09 13:35:10 crc kubenswrapper[4723]: I0309 13:35:10.059061 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d\" (UID: \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" Mar 09 13:35:10 crc kubenswrapper[4723]: I0309 13:35:10.059348 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq5pz\" (UniqueName: \"kubernetes.io/projected/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-kube-api-access-qq5pz\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d\" (UID: \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" Mar 09 13:35:10 crc kubenswrapper[4723]: I0309 13:35:10.059379 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d\" (UID: \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" Mar 09 13:35:10 crc kubenswrapper[4723]: I0309 13:35:10.162252 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq5pz\" (UniqueName: \"kubernetes.io/projected/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-kube-api-access-qq5pz\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d\" (UID: \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" Mar 09 13:35:10 crc kubenswrapper[4723]: I0309 13:35:10.162328 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d\" (UID: \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" Mar 09 13:35:10 crc kubenswrapper[4723]: I0309 13:35:10.162399 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-ssh-key-openstack-edpm-ipam\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d\" (UID: \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" Mar 09 13:35:10 crc kubenswrapper[4723]: I0309 13:35:10.166574 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d\" (UID: \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" Mar 09 13:35:10 crc kubenswrapper[4723]: I0309 13:35:10.170764 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d\" (UID: \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" Mar 09 13:35:10 crc kubenswrapper[4723]: I0309 13:35:10.179177 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq5pz\" (UniqueName: \"kubernetes.io/projected/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-kube-api-access-qq5pz\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d\" (UID: \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" Mar 09 13:35:10 crc kubenswrapper[4723]: I0309 13:35:10.277185 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" Mar 09 13:35:10 crc kubenswrapper[4723]: I0309 13:35:10.822695 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d"] Mar 09 13:35:11 crc kubenswrapper[4723]: I0309 13:35:11.829835 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" event={"ID":"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef","Type":"ContainerStarted","Data":"a3c1642e75c50ce7d46585c940170845d7ca2d3ae8344b8be52bd28a48bd6852"} Mar 09 13:35:11 crc kubenswrapper[4723]: I0309 13:35:11.830208 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" event={"ID":"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef","Type":"ContainerStarted","Data":"3ced70e3e5d586d4b3d64a4fc63b2ceadc590a00122359d46c3a91c0fe598070"} Mar 09 13:35:11 crc kubenswrapper[4723]: I0309 13:35:11.849014 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" podStartSLOduration=2.364202981 podStartE2EDuration="2.848994962s" podCreationTimestamp="2026-03-09 13:35:09 +0000 UTC" firstStartedPulling="2026-03-09 13:35:10.831696617 +0000 UTC m=+2184.846164157" lastFinishedPulling="2026-03-09 13:35:11.316488588 +0000 UTC m=+2185.330956138" observedRunningTime="2026-03-09 13:35:11.846811784 +0000 UTC m=+2185.861279324" watchObservedRunningTime="2026-03-09 13:35:11.848994962 +0000 UTC m=+2185.863462502" Mar 09 13:35:20 crc kubenswrapper[4723]: I0309 13:35:20.921033 4723 generic.go:334] "Generic (PLEG): container finished" podID="2b7aa25e-6eb5-4559-a5e9-ae08b27093ef" containerID="a3c1642e75c50ce7d46585c940170845d7ca2d3ae8344b8be52bd28a48bd6852" exitCode=0 Mar 09 13:35:20 crc kubenswrapper[4723]: I0309 13:35:20.921098 4723 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" event={"ID":"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef","Type":"ContainerDied","Data":"a3c1642e75c50ce7d46585c940170845d7ca2d3ae8344b8be52bd28a48bd6852"} Mar 09 13:35:22 crc kubenswrapper[4723]: I0309 13:35:22.446659 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" Mar 09 13:35:22 crc kubenswrapper[4723]: I0309 13:35:22.569541 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq5pz\" (UniqueName: \"kubernetes.io/projected/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-kube-api-access-qq5pz\") pod \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\" (UID: \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\") " Mar 09 13:35:22 crc kubenswrapper[4723]: I0309 13:35:22.569874 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-ssh-key-openstack-edpm-ipam\") pod \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\" (UID: \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\") " Mar 09 13:35:22 crc kubenswrapper[4723]: I0309 13:35:22.569933 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-inventory\") pod \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\" (UID: \"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef\") " Mar 09 13:35:22 crc kubenswrapper[4723]: I0309 13:35:22.574969 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-kube-api-access-qq5pz" (OuterVolumeSpecName: "kube-api-access-qq5pz") pod "2b7aa25e-6eb5-4559-a5e9-ae08b27093ef" (UID: "2b7aa25e-6eb5-4559-a5e9-ae08b27093ef"). InnerVolumeSpecName "kube-api-access-qq5pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:35:22 crc kubenswrapper[4723]: I0309 13:35:22.607498 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2b7aa25e-6eb5-4559-a5e9-ae08b27093ef" (UID: "2b7aa25e-6eb5-4559-a5e9-ae08b27093ef"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:35:22 crc kubenswrapper[4723]: I0309 13:35:22.618606 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-inventory" (OuterVolumeSpecName: "inventory") pod "2b7aa25e-6eb5-4559-a5e9-ae08b27093ef" (UID: "2b7aa25e-6eb5-4559-a5e9-ae08b27093ef"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:35:22 crc kubenswrapper[4723]: I0309 13:35:22.672980 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:22 crc kubenswrapper[4723]: I0309 13:35:22.673027 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:22 crc kubenswrapper[4723]: I0309 13:35:22.673040 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq5pz\" (UniqueName: \"kubernetes.io/projected/2b7aa25e-6eb5-4559-a5e9-ae08b27093ef-kube-api-access-qq5pz\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:22 crc kubenswrapper[4723]: I0309 13:35:22.946378 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" event={"ID":"2b7aa25e-6eb5-4559-a5e9-ae08b27093ef","Type":"ContainerDied","Data":"3ced70e3e5d586d4b3d64a4fc63b2ceadc590a00122359d46c3a91c0fe598070"} Mar 09 13:35:22 crc kubenswrapper[4723]: I0309 13:35:22.946688 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ced70e3e5d586d4b3d64a4fc63b2ceadc590a00122359d46c3a91c0fe598070" Mar 09 13:35:22 crc kubenswrapper[4723]: I0309 13:35:22.946450 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.036098 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh"] Mar 09 13:35:23 crc kubenswrapper[4723]: E0309 13:35:23.036672 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7aa25e-6eb5-4559-a5e9-ae08b27093ef" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.036699 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7aa25e-6eb5-4559-a5e9-ae08b27093ef" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.036974 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b7aa25e-6eb5-4559-a5e9-ae08b27093ef" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.037763 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.040966 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.041044 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.041332 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.041414 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.041769 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.041795 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.041816 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.041915 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.042192 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.066542 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh"] Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184190 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcq9n\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-kube-api-access-wcq9n\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184289 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184340 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184381 4723 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184423 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184463 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184491 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184522 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184577 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184628 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184675 4723 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184709 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184741 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184767 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184810 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.184895 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.288180 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.288256 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.288289 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.288326 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.288345 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.288379 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.288431 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.293034 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcq9n\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-kube-api-access-wcq9n\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.293241 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.293326 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.293422 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.293508 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.293592 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.293642 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.293694 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.293773 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.299488 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.300664 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.305916 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.308902 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.309186 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.311140 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.312312 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.312486 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcq9n\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-kube-api-access-wcq9n\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.312637 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.312734 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.313262 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.313457 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.314450 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.314639 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.315172 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.315396 4723 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.357794 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.925178 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh"] Mar 09 13:35:23 crc kubenswrapper[4723]: W0309 13:35:23.930125 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod746fd331_118e_4649_bd60_d93fa3cb0f12.slice/crio-5a45c69e6195d7ab94926b6bf09a7939913f68a3a1a6faee55d8284234bf1075 WatchSource:0}: Error finding container 5a45c69e6195d7ab94926b6bf09a7939913f68a3a1a6faee55d8284234bf1075: Status 404 returned error can't find the container with id 5a45c69e6195d7ab94926b6bf09a7939913f68a3a1a6faee55d8284234bf1075 Mar 09 13:35:23 crc kubenswrapper[4723]: I0309 13:35:23.957816 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" event={"ID":"746fd331-118e-4649-bd60-d93fa3cb0f12","Type":"ContainerStarted","Data":"5a45c69e6195d7ab94926b6bf09a7939913f68a3a1a6faee55d8284234bf1075"} Mar 09 13:35:24 crc kubenswrapper[4723]: I0309 13:35:24.970146 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" event={"ID":"746fd331-118e-4649-bd60-d93fa3cb0f12","Type":"ContainerStarted","Data":"6ca9c8968d469182101db8cc1fffbace437dd5ae32358376d8a9806a8de0c8d1"} Mar 09 13:35:25 crc kubenswrapper[4723]: I0309 13:35:25.012824 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" podStartSLOduration=1.436886898 podStartE2EDuration="2.012801848s" podCreationTimestamp="2026-03-09 13:35:23 +0000 UTC" firstStartedPulling="2026-03-09 13:35:23.933132841 +0000 UTC m=+2197.947600391" lastFinishedPulling="2026-03-09 13:35:24.509047791 +0000 UTC m=+2198.523515341" observedRunningTime="2026-03-09 13:35:25.004302832 +0000 UTC m=+2199.018770382" watchObservedRunningTime="2026-03-09 13:35:25.012801848 +0000 UTC m=+2199.027269388" Mar 09 13:35:39 crc kubenswrapper[4723]: I0309 13:35:39.058375 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-flx99"] Mar 09 13:35:39 crc kubenswrapper[4723]: I0309 13:35:39.068779 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-flx99"] Mar 09 13:35:40 crc kubenswrapper[4723]: I0309 13:35:40.895228 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a0d87c-3e58-4839-95fb-7c964152ed7c" path="/var/lib/kubelet/pods/09a0d87c-3e58-4839-95fb-7c964152ed7c/volumes" Mar 09 13:36:00 crc kubenswrapper[4723]: I0309 13:36:00.149647 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551056-rsmtp"] Mar 09 13:36:00 crc kubenswrapper[4723]: I0309 13:36:00.152575 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-rsmtp" Mar 09 13:36:00 crc kubenswrapper[4723]: I0309 13:36:00.155129 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:36:00 crc kubenswrapper[4723]: I0309 13:36:00.155315 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:36:00 crc kubenswrapper[4723]: I0309 13:36:00.155523 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:36:00 crc kubenswrapper[4723]: I0309 13:36:00.174422 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-rsmtp"] Mar 09 13:36:00 crc kubenswrapper[4723]: I0309 13:36:00.293290 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82hs8\" (UniqueName: \"kubernetes.io/projected/d23ab36b-5283-431a-a5e3-b909c9ff2918-kube-api-access-82hs8\") pod \"auto-csr-approver-29551056-rsmtp\" (UID: \"d23ab36b-5283-431a-a5e3-b909c9ff2918\") " pod="openshift-infra/auto-csr-approver-29551056-rsmtp" Mar 09 13:36:00 crc kubenswrapper[4723]: I0309 13:36:00.395290 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82hs8\" (UniqueName: \"kubernetes.io/projected/d23ab36b-5283-431a-a5e3-b909c9ff2918-kube-api-access-82hs8\") pod \"auto-csr-approver-29551056-rsmtp\" (UID: \"d23ab36b-5283-431a-a5e3-b909c9ff2918\") " pod="openshift-infra/auto-csr-approver-29551056-rsmtp" Mar 09 13:36:00 crc kubenswrapper[4723]: I0309 13:36:00.415823 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82hs8\" (UniqueName: \"kubernetes.io/projected/d23ab36b-5283-431a-a5e3-b909c9ff2918-kube-api-access-82hs8\") pod \"auto-csr-approver-29551056-rsmtp\" (UID: \"d23ab36b-5283-431a-a5e3-b909c9ff2918\") " pod="openshift-infra/auto-csr-approver-29551056-rsmtp" Mar 09 13:36:00 crc kubenswrapper[4723]: I0309 13:36:00.503024 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-rsmtp" Mar 09 13:36:00 crc kubenswrapper[4723]: I0309 13:36:00.975807 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-rsmtp"] Mar 09 13:36:01 crc kubenswrapper[4723]: I0309 13:36:01.362158 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551056-rsmtp" event={"ID":"d23ab36b-5283-431a-a5e3-b909c9ff2918","Type":"ContainerStarted","Data":"30264cda905209caa8635a211d66f970894ba44d04d96739ab253ef6f76962cb"} Mar 09 13:36:03 crc kubenswrapper[4723]: I0309 13:36:03.396208 4723 generic.go:334] "Generic (PLEG): container finished" podID="d23ab36b-5283-431a-a5e3-b909c9ff2918" containerID="f253f31fb5be5fdf20bd5874577193a4b52ac11e5afaeb5994228e3aebd0e86a" exitCode=0 Mar 09 13:36:03 crc kubenswrapper[4723]: I0309 13:36:03.396295 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551056-rsmtp" event={"ID":"d23ab36b-5283-431a-a5e3-b909c9ff2918","Type":"ContainerDied","Data":"f253f31fb5be5fdf20bd5874577193a4b52ac11e5afaeb5994228e3aebd0e86a"} Mar 09 13:36:03 crc kubenswrapper[4723]: I0309 13:36:03.947190 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:36:03 crc kubenswrapper[4723]: I0309 13:36:03.947515 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:36:04 crc kubenswrapper[4723]: I0309 13:36:04.908842 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-rsmtp" Mar 09 13:36:05 crc kubenswrapper[4723]: I0309 13:36:05.025210 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82hs8\" (UniqueName: \"kubernetes.io/projected/d23ab36b-5283-431a-a5e3-b909c9ff2918-kube-api-access-82hs8\") pod \"d23ab36b-5283-431a-a5e3-b909c9ff2918\" (UID: \"d23ab36b-5283-431a-a5e3-b909c9ff2918\") " Mar 09 13:36:05 crc kubenswrapper[4723]: I0309 13:36:05.032250 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d23ab36b-5283-431a-a5e3-b909c9ff2918-kube-api-access-82hs8" (OuterVolumeSpecName: "kube-api-access-82hs8") pod "d23ab36b-5283-431a-a5e3-b909c9ff2918" (UID: "d23ab36b-5283-431a-a5e3-b909c9ff2918"). InnerVolumeSpecName "kube-api-access-82hs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:36:05 crc kubenswrapper[4723]: I0309 13:36:05.127940 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82hs8\" (UniqueName: \"kubernetes.io/projected/d23ab36b-5283-431a-a5e3-b909c9ff2918-kube-api-access-82hs8\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:05 crc kubenswrapper[4723]: I0309 13:36:05.421234 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-rsmtp" Mar 09 13:36:05 crc kubenswrapper[4723]: I0309 13:36:05.421233 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551056-rsmtp" event={"ID":"d23ab36b-5283-431a-a5e3-b909c9ff2918","Type":"ContainerDied","Data":"30264cda905209caa8635a211d66f970894ba44d04d96739ab253ef6f76962cb"} Mar 09 13:36:05 crc kubenswrapper[4723]: I0309 13:36:05.421368 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30264cda905209caa8635a211d66f970894ba44d04d96739ab253ef6f76962cb" Mar 09 13:36:05 crc kubenswrapper[4723]: I0309 13:36:05.423255 4723 generic.go:334] "Generic (PLEG): container finished" podID="746fd331-118e-4649-bd60-d93fa3cb0f12" containerID="6ca9c8968d469182101db8cc1fffbace437dd5ae32358376d8a9806a8de0c8d1" exitCode=0 Mar 09 13:36:05 crc kubenswrapper[4723]: I0309 13:36:05.423307 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" event={"ID":"746fd331-118e-4649-bd60-d93fa3cb0f12","Type":"ContainerDied","Data":"6ca9c8968d469182101db8cc1fffbace437dd5ae32358376d8a9806a8de0c8d1"} Mar 09 13:36:05 crc kubenswrapper[4723]: I0309 13:36:05.989793 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-dct54"] Mar 09 13:36:06 crc kubenswrapper[4723]: I0309 13:36:06.006639 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-dct54"] Mar 09 13:36:06 crc kubenswrapper[4723]: I0309 13:36:06.893078 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aed8e659-d57f-41bb-a6a5-ce888991208a" path="/var/lib/kubelet/pods/aed8e659-d57f-41bb-a6a5-ce888991208a/volumes" Mar 09 13:36:06 crc kubenswrapper[4723]: I0309 13:36:06.928473 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.079354 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-ssh-key-openstack-edpm-ipam\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.079420 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.079457 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.079484 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-telemetry-power-monitoring-combined-ca-bundle\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.079508 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-bootstrap-combined-ca-bundle\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.079532 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.079599 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-nova-combined-ca-bundle\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.079662 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-telemetry-combined-ca-bundle\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.080695 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-ovn-combined-ca-bundle\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.080747 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-ovn-default-certs-0\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.080790 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-libvirt-combined-ca-bundle\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.080818 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-inventory\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.080908 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.081071 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcq9n\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-kube-api-access-wcq9n\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.081098 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-neutron-metadata-combined-ca-bundle\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.081123 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-repo-setup-combined-ca-bundle\") pod \"746fd331-118e-4649-bd60-d93fa3cb0f12\" (UID: \"746fd331-118e-4649-bd60-d93fa3cb0f12\") " Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.087620 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.087706 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.088247 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.088298 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.088458 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.088531 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.090670 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.092773 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.094351 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.095087 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.095688 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.096377 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.098111 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-kube-api-access-wcq9n" (OuterVolumeSpecName: "kube-api-access-wcq9n") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "kube-api-access-wcq9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.099082 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.122440 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.131259 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-inventory" (OuterVolumeSpecName: "inventory") pod "746fd331-118e-4649-bd60-d93fa3cb0f12" (UID: "746fd331-118e-4649-bd60-d93fa3cb0f12"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.185936 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcq9n\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-kube-api-access-wcq9n\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.185973 4723 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.185988 4723 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.185998 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.186009 4723 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.186020 4723 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.186029 4723 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.186038 4723 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.186048 4723 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.186058 4723 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.186067 4723 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.186084 4723 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.186092 4723 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.186102 4723 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.186113 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/746fd331-118e-4649-bd60-d93fa3cb0f12-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.186123 4723 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/746fd331-118e-4649-bd60-d93fa3cb0f12-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.450540 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" event={"ID":"746fd331-118e-4649-bd60-d93fa3cb0f12","Type":"ContainerDied","Data":"5a45c69e6195d7ab94926b6bf09a7939913f68a3a1a6faee55d8284234bf1075"} Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.450589 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a45c69e6195d7ab94926b6bf09a7939913f68a3a1a6faee55d8284234bf1075" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.450596 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.572214 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw"] Mar 09 13:36:07 crc kubenswrapper[4723]: E0309 13:36:07.572882 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="746fd331-118e-4649-bd60-d93fa3cb0f12" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.572905 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="746fd331-118e-4649-bd60-d93fa3cb0f12" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 13:36:07 crc kubenswrapper[4723]: E0309 13:36:07.572935 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d23ab36b-5283-431a-a5e3-b909c9ff2918" containerName="oc" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.572946 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23ab36b-5283-431a-a5e3-b909c9ff2918" containerName="oc" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.573217 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="746fd331-118e-4649-bd60-d93fa3cb0f12" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.573245 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="d23ab36b-5283-431a-a5e3-b909c9ff2918" containerName="oc" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.574189 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.576366 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.576547 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.576712 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.576828 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.577240 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.583612 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw"] Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.618616 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.618881 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/940da565-ce26-42f3-a35d-dbaa3efd8521-ovncontroller-config-0\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.619019 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.619099 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntgkq\" (UniqueName: \"kubernetes.io/projected/940da565-ce26-42f3-a35d-dbaa3efd8521-kube-api-access-ntgkq\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.619180 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.721471 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.721966 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/940da565-ce26-42f3-a35d-dbaa3efd8521-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.722052 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.722084 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntgkq\" (UniqueName: \"kubernetes.io/projected/940da565-ce26-42f3-a35d-dbaa3efd8521-kube-api-access-ntgkq\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.722118 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-ovn-combined-ca-bundle\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.722844 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/940da565-ce26-42f3-a35d-dbaa3efd8521-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.726106 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.726141 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.727147 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.737828 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntgkq\" (UniqueName: \"kubernetes.io/projected/940da565-ce26-42f3-a35d-dbaa3efd8521-kube-api-access-ntgkq\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b8zjw\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:07 crc kubenswrapper[4723]: I0309 13:36:07.942369 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:36:08 crc kubenswrapper[4723]: I0309 13:36:08.498620 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw"] Mar 09 13:36:09 crc kubenswrapper[4723]: I0309 13:36:09.476889 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" event={"ID":"940da565-ce26-42f3-a35d-dbaa3efd8521","Type":"ContainerStarted","Data":"8fca6b986936c22493e8d65ad53f7aa9c0c53cb6ad98a6433c32bf028e65638d"} Mar 09 13:36:09 crc kubenswrapper[4723]: I0309 13:36:09.477313 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" event={"ID":"940da565-ce26-42f3-a35d-dbaa3efd8521","Type":"ContainerStarted","Data":"b3fafbf5967618a6f7cfd3437abf0147433161f76b5e3d2a59a76f3a59fbc9de"} Mar 09 13:36:09 crc kubenswrapper[4723]: I0309 13:36:09.502621 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" podStartSLOduration=1.989996621 podStartE2EDuration="2.502602434s" podCreationTimestamp="2026-03-09 13:36:07 +0000 UTC" firstStartedPulling="2026-03-09 13:36:08.501138872 +0000 UTC m=+2242.515606412" lastFinishedPulling="2026-03-09 13:36:09.013744675 +0000 UTC m=+2243.028212225" observedRunningTime="2026-03-09 13:36:09.496182513 +0000 UTC m=+2243.510650063" watchObservedRunningTime="2026-03-09 13:36:09.502602434 +0000 UTC m=+2243.517069964" Mar 09 13:36:24 crc kubenswrapper[4723]: I0309 13:36:24.412287 4723 scope.go:117] "RemoveContainer" containerID="ddf76b174c171ce02f933683ef40712510e279024507bd1418f8e581a6808299" Mar 09 13:36:24 crc kubenswrapper[4723]: I0309 13:36:24.445500 4723 scope.go:117] "RemoveContainer" containerID="507b0498e46ad902f17716e4c68a407d92e63614717839d6b40e8fc0a1df26a9" Mar 09 13:36:25 crc kubenswrapper[4723]: I0309 13:36:25.212064 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b2qw6"] Mar 09 13:36:25 crc kubenswrapper[4723]: I0309 13:36:25.214501 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:25 crc kubenswrapper[4723]: I0309 13:36:25.230543 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2qw6"] Mar 09 13:36:25 crc kubenswrapper[4723]: I0309 13:36:25.367449 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291dce87-7d45-474d-b2a9-c7c21eea25aa-catalog-content\") pod \"community-operators-b2qw6\" (UID: \"291dce87-7d45-474d-b2a9-c7c21eea25aa\") " pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:25 crc kubenswrapper[4723]: I0309 13:36:25.367689 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvn27\" (UniqueName: \"kubernetes.io/projected/291dce87-7d45-474d-b2a9-c7c21eea25aa-kube-api-access-vvn27\") pod \"community-operators-b2qw6\" (UID: \"291dce87-7d45-474d-b2a9-c7c21eea25aa\") " pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:25 crc kubenswrapper[4723]: I0309 13:36:25.367756 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291dce87-7d45-474d-b2a9-c7c21eea25aa-utilities\") pod \"community-operators-b2qw6\" (UID: \"291dce87-7d45-474d-b2a9-c7c21eea25aa\") " pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:25 crc kubenswrapper[4723]: I0309 13:36:25.470594 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291dce87-7d45-474d-b2a9-c7c21eea25aa-utilities\") pod \"community-operators-b2qw6\" (UID: \"291dce87-7d45-474d-b2a9-c7c21eea25aa\") " pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:25 crc kubenswrapper[4723]: I0309 13:36:25.471417 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291dce87-7d45-474d-b2a9-c7c21eea25aa-catalog-content\") pod \"community-operators-b2qw6\" (UID: \"291dce87-7d45-474d-b2a9-c7c21eea25aa\") " pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:25 crc kubenswrapper[4723]: I0309 13:36:25.471191 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291dce87-7d45-474d-b2a9-c7c21eea25aa-utilities\") pod \"community-operators-b2qw6\" (UID: \"291dce87-7d45-474d-b2a9-c7c21eea25aa\") " pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:25 crc kubenswrapper[4723]: I0309 13:36:25.471635 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291dce87-7d45-474d-b2a9-c7c21eea25aa-catalog-content\") pod \"community-operators-b2qw6\" (UID: \"291dce87-7d45-474d-b2a9-c7c21eea25aa\") " pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:25 crc kubenswrapper[4723]: I0309 13:36:25.471778 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvn27\" (UniqueName: \"kubernetes.io/projected/291dce87-7d45-474d-b2a9-c7c21eea25aa-kube-api-access-vvn27\") pod \"community-operators-b2qw6\" (UID: \"291dce87-7d45-474d-b2a9-c7c21eea25aa\") " pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:25 crc kubenswrapper[4723]: I0309 13:36:25.489404 4723 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vvn27\" (UniqueName: \"kubernetes.io/projected/291dce87-7d45-474d-b2a9-c7c21eea25aa-kube-api-access-vvn27\") pod \"community-operators-b2qw6\" (UID: \"291dce87-7d45-474d-b2a9-c7c21eea25aa\") " pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:25 crc kubenswrapper[4723]: I0309 13:36:25.538379 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:26 crc kubenswrapper[4723]: I0309 13:36:26.040235 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-js2tm"] Mar 09 13:36:26 crc kubenswrapper[4723]: I0309 13:36:26.053506 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-js2tm"] Mar 09 13:36:26 crc kubenswrapper[4723]: I0309 13:36:26.082786 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2qw6"] Mar 09 13:36:26 crc kubenswrapper[4723]: I0309 13:36:26.662169 4723 generic.go:334] "Generic (PLEG): container finished" podID="291dce87-7d45-474d-b2a9-c7c21eea25aa" containerID="a48b5469ffef23f11398ea822218aafa4f8b60a025a54d55730130233196c4c1" exitCode=0 Mar 09 13:36:26 crc kubenswrapper[4723]: I0309 13:36:26.662222 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2qw6" event={"ID":"291dce87-7d45-474d-b2a9-c7c21eea25aa","Type":"ContainerDied","Data":"a48b5469ffef23f11398ea822218aafa4f8b60a025a54d55730130233196c4c1"} Mar 09 13:36:26 crc kubenswrapper[4723]: I0309 13:36:26.662516 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2qw6" event={"ID":"291dce87-7d45-474d-b2a9-c7c21eea25aa","Type":"ContainerStarted","Data":"a7008e0a849906c56894ce5d001d98729fccb4971bee6698b97ba62a6881f266"} Mar 09 13:36:26 crc kubenswrapper[4723]: I0309 13:36:26.894220 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d" path="/var/lib/kubelet/pods/ebd5cf0f-83bf-448a-9fe6-1ddfb4195a1d/volumes" Mar 09 13:36:28 crc kubenswrapper[4723]: I0309 13:36:28.685132 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2qw6" event={"ID":"291dce87-7d45-474d-b2a9-c7c21eea25aa","Type":"ContainerStarted","Data":"050a6b21d9d723fcf408b85eeaf6322c4c9e3ff0a7a35114da634634a9acc752"} Mar 09 13:36:30 crc kubenswrapper[4723]: I0309 13:36:30.719600 4723 generic.go:334] "Generic (PLEG): container finished" podID="291dce87-7d45-474d-b2a9-c7c21eea25aa" containerID="050a6b21d9d723fcf408b85eeaf6322c4c9e3ff0a7a35114da634634a9acc752" exitCode=0 Mar 09 13:36:30 crc kubenswrapper[4723]: I0309 13:36:30.720000 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2qw6" event={"ID":"291dce87-7d45-474d-b2a9-c7c21eea25aa","Type":"ContainerDied","Data":"050a6b21d9d723fcf408b85eeaf6322c4c9e3ff0a7a35114da634634a9acc752"} Mar 09 13:36:31 crc kubenswrapper[4723]: I0309 13:36:31.732393 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2qw6" event={"ID":"291dce87-7d45-474d-b2a9-c7c21eea25aa","Type":"ContainerStarted","Data":"d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0"} Mar 09 13:36:31 crc kubenswrapper[4723]: I0309 13:36:31.757791 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b2qw6" 
podStartSLOduration=2.240999182 podStartE2EDuration="6.75776726s" podCreationTimestamp="2026-03-09 13:36:25 +0000 UTC" firstStartedPulling="2026-03-09 13:36:26.663981762 +0000 UTC m=+2260.678449302" lastFinishedPulling="2026-03-09 13:36:31.18074983 +0000 UTC m=+2265.195217380" observedRunningTime="2026-03-09 13:36:31.747408974 +0000 UTC m=+2265.761876524" watchObservedRunningTime="2026-03-09 13:36:31.75776726 +0000 UTC m=+2265.772234800" Mar 09 13:36:33 crc kubenswrapper[4723]: I0309 13:36:33.946979 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:36:33 crc kubenswrapper[4723]: I0309 13:36:33.947512 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:36:35 crc kubenswrapper[4723]: I0309 13:36:35.539531 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:35 crc kubenswrapper[4723]: I0309 13:36:35.540651 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:36 crc kubenswrapper[4723]: I0309 13:36:36.590018 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-b2qw6" podUID="291dce87-7d45-474d-b2a9-c7c21eea25aa" containerName="registry-server" probeResult="failure" output=< Mar 09 13:36:36 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 13:36:36 crc kubenswrapper[4723]: > Mar 09 13:36:37 crc kubenswrapper[4723]: I0309 13:36:37.246151 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p2bmm"] Mar 09 13:36:37 crc kubenswrapper[4723]: I0309 13:36:37.248751 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:37 crc kubenswrapper[4723]: I0309 13:36:37.265077 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2bmm"] Mar 09 13:36:37 crc kubenswrapper[4723]: I0309 13:36:37.426553 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4158b731-ab92-4158-8755-85f1984d0939-utilities\") pod \"redhat-marketplace-p2bmm\" (UID: \"4158b731-ab92-4158-8755-85f1984d0939\") " pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:37 crc kubenswrapper[4723]: I0309 13:36:37.426592 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4158b731-ab92-4158-8755-85f1984d0939-catalog-content\") pod \"redhat-marketplace-p2bmm\" (UID: \"4158b731-ab92-4158-8755-85f1984d0939\") " pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:37 crc kubenswrapper[4723]: I0309 13:36:37.426734 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gjlk\" (UniqueName: \"kubernetes.io/projected/4158b731-ab92-4158-8755-85f1984d0939-kube-api-access-4gjlk\") pod \"redhat-marketplace-p2bmm\" (UID: \"4158b731-ab92-4158-8755-85f1984d0939\") " pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:37 crc kubenswrapper[4723]: I0309 13:36:37.529795 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4158b731-ab92-4158-8755-85f1984d0939-utilities\") pod \"redhat-marketplace-p2bmm\" (UID: \"4158b731-ab92-4158-8755-85f1984d0939\") " pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:37 crc kubenswrapper[4723]: I0309 13:36:37.530208 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4158b731-ab92-4158-8755-85f1984d0939-catalog-content\") pod \"redhat-marketplace-p2bmm\" (UID: \"4158b731-ab92-4158-8755-85f1984d0939\") " pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:37 crc kubenswrapper[4723]: I0309 13:36:37.530338 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gjlk\" (UniqueName: \"kubernetes.io/projected/4158b731-ab92-4158-8755-85f1984d0939-kube-api-access-4gjlk\") pod \"redhat-marketplace-p2bmm\" (UID: \"4158b731-ab92-4158-8755-85f1984d0939\") " pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:37 crc kubenswrapper[4723]: I0309 13:36:37.530429 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4158b731-ab92-4158-8755-85f1984d0939-utilities\") pod \"redhat-marketplace-p2bmm\" (UID: \"4158b731-ab92-4158-8755-85f1984d0939\") " pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:37 crc kubenswrapper[4723]: I0309 13:36:37.530559 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4158b731-ab92-4158-8755-85f1984d0939-catalog-content\") pod \"redhat-marketplace-p2bmm\" (UID: \"4158b731-ab92-4158-8755-85f1984d0939\") " pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:37 crc kubenswrapper[4723]: I0309 13:36:37.553204 4723 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4gjlk\" (UniqueName: \"kubernetes.io/projected/4158b731-ab92-4158-8755-85f1984d0939-kube-api-access-4gjlk\") pod \"redhat-marketplace-p2bmm\" (UID: \"4158b731-ab92-4158-8755-85f1984d0939\") " pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:37 crc kubenswrapper[4723]: I0309 13:36:37.568574 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:38 crc kubenswrapper[4723]: I0309 13:36:38.139845 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2bmm"] Mar 09 13:36:38 crc kubenswrapper[4723]: I0309 13:36:38.806284 4723 generic.go:334] "Generic (PLEG): container finished" podID="4158b731-ab92-4158-8755-85f1984d0939" containerID="6d4ab3b9ccabab7e56c6402e7ac773eb990af505f0f75e61faa84dc317318ceb" exitCode=0 Mar 09 13:36:38 crc kubenswrapper[4723]: I0309 13:36:38.806550 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2bmm" event={"ID":"4158b731-ab92-4158-8755-85f1984d0939","Type":"ContainerDied","Data":"6d4ab3b9ccabab7e56c6402e7ac773eb990af505f0f75e61faa84dc317318ceb"} Mar 09 13:36:38 crc kubenswrapper[4723]: I0309 13:36:38.806880 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2bmm" event={"ID":"4158b731-ab92-4158-8755-85f1984d0939","Type":"ContainerStarted","Data":"fa53e210fdfabf536d863642d3bf8d504dc7b7b5a3167741b155f48ccee3efb0"} Mar 09 13:36:39 crc kubenswrapper[4723]: I0309 13:36:39.817912 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2bmm" event={"ID":"4158b731-ab92-4158-8755-85f1984d0939","Type":"ContainerStarted","Data":"0ace0a16326a8b934669b60d15291829593af6c87b4478384ce59ed4bdcb7428"} Mar 09 13:36:41 crc kubenswrapper[4723]: I0309 13:36:41.841454 4723 generic.go:334] "Generic (PLEG): container finished" podID="4158b731-ab92-4158-8755-85f1984d0939" containerID="0ace0a16326a8b934669b60d15291829593af6c87b4478384ce59ed4bdcb7428" exitCode=0 Mar 09 13:36:41 crc kubenswrapper[4723]: I0309 13:36:41.841528 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2bmm" event={"ID":"4158b731-ab92-4158-8755-85f1984d0939","Type":"ContainerDied","Data":"0ace0a16326a8b934669b60d15291829593af6c87b4478384ce59ed4bdcb7428"} Mar 09 13:36:42 crc kubenswrapper[4723]: I0309 13:36:42.858744 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2bmm" event={"ID":"4158b731-ab92-4158-8755-85f1984d0939","Type":"ContainerStarted","Data":"90a06777a640113d2f1828b21257a8810a9fac0643688293784fa95c33df12b0"} Mar 09 13:36:42 crc kubenswrapper[4723]: I0309 13:36:42.900727 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p2bmm" podStartSLOduration=2.3490519770000002 podStartE2EDuration="5.900670452s" podCreationTimestamp="2026-03-09 13:36:37 +0000 UTC" firstStartedPulling="2026-03-09 13:36:38.810052621 +0000 UTC m=+2272.824520161" lastFinishedPulling="2026-03-09 13:36:42.361671086 +0000 UTC m=+2276.376138636" observedRunningTime="2026-03-09 13:36:42.885737304 +0000 UTC m=+2276.900204874" watchObservedRunningTime="2026-03-09 13:36:42.900670452 +0000 UTC m=+2276.915137992" Mar 09 13:36:45 crc kubenswrapper[4723]: I0309 13:36:45.590230 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:45 crc kubenswrapper[4723]: I0309 13:36:45.645604 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:45 crc kubenswrapper[4723]: I0309 13:36:45.835940 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b2qw6"] Mar 09 13:36:46 crc kubenswrapper[4723]: I0309 13:36:46.918198 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b2qw6" podUID="291dce87-7d45-474d-b2a9-c7c21eea25aa" containerName="registry-server" containerID="cri-o://d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0" gracePeriod=2 Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.467048 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.569635 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.570906 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.572614 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291dce87-7d45-474d-b2a9-c7c21eea25aa-catalog-content\") pod \"291dce87-7d45-474d-b2a9-c7c21eea25aa\" (UID: \"291dce87-7d45-474d-b2a9-c7c21eea25aa\") " Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.572748 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291dce87-7d45-474d-b2a9-c7c21eea25aa-utilities\") pod \"291dce87-7d45-474d-b2a9-c7c21eea25aa\" (UID: \"291dce87-7d45-474d-b2a9-c7c21eea25aa\") " Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.573493 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/291dce87-7d45-474d-b2a9-c7c21eea25aa-utilities" (OuterVolumeSpecName: "utilities") pod "291dce87-7d45-474d-b2a9-c7c21eea25aa" (UID: "291dce87-7d45-474d-b2a9-c7c21eea25aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.573592 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvn27\" (UniqueName: \"kubernetes.io/projected/291dce87-7d45-474d-b2a9-c7c21eea25aa-kube-api-access-vvn27\") pod \"291dce87-7d45-474d-b2a9-c7c21eea25aa\" (UID: \"291dce87-7d45-474d-b2a9-c7c21eea25aa\") " Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.574588 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291dce87-7d45-474d-b2a9-c7c21eea25aa-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.585159 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/291dce87-7d45-474d-b2a9-c7c21eea25aa-kube-api-access-vvn27" (OuterVolumeSpecName: "kube-api-access-vvn27") pod "291dce87-7d45-474d-b2a9-c7c21eea25aa" (UID: "291dce87-7d45-474d-b2a9-c7c21eea25aa"). InnerVolumeSpecName "kube-api-access-vvn27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.617905 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/291dce87-7d45-474d-b2a9-c7c21eea25aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "291dce87-7d45-474d-b2a9-c7c21eea25aa" (UID: "291dce87-7d45-474d-b2a9-c7c21eea25aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.676182 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvn27\" (UniqueName: \"kubernetes.io/projected/291dce87-7d45-474d-b2a9-c7c21eea25aa-kube-api-access-vvn27\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.676219 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291dce87-7d45-474d-b2a9-c7c21eea25aa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.929282 4723 generic.go:334] "Generic (PLEG): container finished" podID="291dce87-7d45-474d-b2a9-c7c21eea25aa" containerID="d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0" exitCode=0 Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.930395 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2qw6" Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.933037 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2qw6" event={"ID":"291dce87-7d45-474d-b2a9-c7c21eea25aa","Type":"ContainerDied","Data":"d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0"} Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.933097 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2qw6" event={"ID":"291dce87-7d45-474d-b2a9-c7c21eea25aa","Type":"ContainerDied","Data":"a7008e0a849906c56894ce5d001d98729fccb4971bee6698b97ba62a6881f266"} Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.933121 4723 scope.go:117] "RemoveContainer" containerID="d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0" Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.969133 4723 scope.go:117] "RemoveContainer" containerID="050a6b21d9d723fcf408b85eeaf6322c4c9e3ff0a7a35114da634634a9acc752" Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.974888 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b2qw6"] Mar 09 13:36:47 crc kubenswrapper[4723]: I0309 13:36:47.994040 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b2qw6"] Mar 09 13:36:48 crc kubenswrapper[4723]: I0309 13:36:48.004179 4723 scope.go:117] "RemoveContainer" containerID="a48b5469ffef23f11398ea822218aafa4f8b60a025a54d55730130233196c4c1" Mar 09 13:36:48 crc kubenswrapper[4723]: I0309 13:36:48.057427 4723 scope.go:117] "RemoveContainer" containerID="d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0" Mar 09 13:36:48 crc kubenswrapper[4723]: E0309 13:36:48.058080 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0\": container with ID starting with 
d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0 not found: ID does not exist" containerID="d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0" Mar 09 13:36:48 crc kubenswrapper[4723]: I0309 13:36:48.058124 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0"} err="failed to get container status \"d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0\": rpc error: code = NotFound desc = could not find container \"d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0\": container with ID starting with d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0 not found: ID does not exist" Mar 09 13:36:48 crc kubenswrapper[4723]: I0309 13:36:48.058155 4723 scope.go:117] "RemoveContainer" containerID="050a6b21d9d723fcf408b85eeaf6322c4c9e3ff0a7a35114da634634a9acc752" Mar 09 13:36:48 crc kubenswrapper[4723]: E0309 13:36:48.058565 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"050a6b21d9d723fcf408b85eeaf6322c4c9e3ff0a7a35114da634634a9acc752\": container with ID starting with 050a6b21d9d723fcf408b85eeaf6322c4c9e3ff0a7a35114da634634a9acc752 not found: ID does not exist" containerID="050a6b21d9d723fcf408b85eeaf6322c4c9e3ff0a7a35114da634634a9acc752" Mar 09 13:36:48 crc kubenswrapper[4723]: I0309 13:36:48.058588 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050a6b21d9d723fcf408b85eeaf6322c4c9e3ff0a7a35114da634634a9acc752"} err="failed to get container status \"050a6b21d9d723fcf408b85eeaf6322c4c9e3ff0a7a35114da634634a9acc752\": rpc error: code = NotFound desc = could not find container \"050a6b21d9d723fcf408b85eeaf6322c4c9e3ff0a7a35114da634634a9acc752\": container with ID starting with 050a6b21d9d723fcf408b85eeaf6322c4c9e3ff0a7a35114da634634a9acc752 not found: ID does not exist" Mar 09 13:36:48 crc kubenswrapper[4723]: I0309 13:36:48.058602 4723 scope.go:117] "RemoveContainer" containerID="a48b5469ffef23f11398ea822218aafa4f8b60a025a54d55730130233196c4c1" Mar 09 13:36:48 crc kubenswrapper[4723]: E0309 13:36:48.059211 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a48b5469ffef23f11398ea822218aafa4f8b60a025a54d55730130233196c4c1\": container with ID starting with a48b5469ffef23f11398ea822218aafa4f8b60a025a54d55730130233196c4c1 not found: ID does not exist" containerID="a48b5469ffef23f11398ea822218aafa4f8b60a025a54d55730130233196c4c1" Mar 09 13:36:48 crc kubenswrapper[4723]: I0309 13:36:48.059236 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a48b5469ffef23f11398ea822218aafa4f8b60a025a54d55730130233196c4c1"} err="failed to get container status \"a48b5469ffef23f11398ea822218aafa4f8b60a025a54d55730130233196c4c1\": rpc error: code = NotFound desc = could not find container \"a48b5469ffef23f11398ea822218aafa4f8b60a025a54d55730130233196c4c1\": container with ID starting with a48b5469ffef23f11398ea822218aafa4f8b60a025a54d55730130233196c4c1 not found: ID does not exist" Mar 09 13:36:48 crc kubenswrapper[4723]: E0309 13:36:48.105008 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291dce87_7d45_474d_b2a9_c7c21eea25aa.slice/crio-d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:36:48 crc kubenswrapper[4723]: E0309 13:36:48.107679 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291dce87_7d45_474d_b2a9_c7c21eea25aa.slice/crio-d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:36:48 crc kubenswrapper[4723]: I0309 13:36:48.630909 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-p2bmm" podUID="4158b731-ab92-4158-8755-85f1984d0939" containerName="registry-server" probeResult="failure" output=< Mar 09 13:36:48 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 13:36:48 crc kubenswrapper[4723]: > Mar 09 13:36:48 crc kubenswrapper[4723]: I0309 13:36:48.897682 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="291dce87-7d45-474d-b2a9-c7c21eea25aa" path="/var/lib/kubelet/pods/291dce87-7d45-474d-b2a9-c7c21eea25aa/volumes" Mar 09 13:36:51 crc kubenswrapper[4723]: E0309 13:36:51.818185 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291dce87_7d45_474d_b2a9_c7c21eea25aa.slice/crio-d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:36:54 crc kubenswrapper[4723]: E0309 13:36:54.993891 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291dce87_7d45_474d_b2a9_c7c21eea25aa.slice/crio-d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:36:57 crc kubenswrapper[4723]: I0309 13:36:57.665469 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:57 crc kubenswrapper[4723]: I0309 13:36:57.723220 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:57 crc kubenswrapper[4723]: I0309 13:36:57.908583 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2bmm"] Mar 09 13:36:59 crc kubenswrapper[4723]: I0309 13:36:59.053145 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p2bmm" podUID="4158b731-ab92-4158-8755-85f1984d0939" containerName="registry-server" containerID="cri-o://90a06777a640113d2f1828b21257a8810a9fac0643688293784fa95c33df12b0" gracePeriod=2 Mar 09 13:36:59 crc kubenswrapper[4723]: I0309 13:36:59.561748 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:36:59 crc kubenswrapper[4723]: I0309 13:36:59.600097 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gjlk\" (UniqueName: \"kubernetes.io/projected/4158b731-ab92-4158-8755-85f1984d0939-kube-api-access-4gjlk\") pod \"4158b731-ab92-4158-8755-85f1984d0939\" (UID: \"4158b731-ab92-4158-8755-85f1984d0939\") " Mar 09 13:36:59 crc kubenswrapper[4723]: I0309 13:36:59.600286 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4158b731-ab92-4158-8755-85f1984d0939-utilities\") pod \"4158b731-ab92-4158-8755-85f1984d0939\" (UID: \"4158b731-ab92-4158-8755-85f1984d0939\") " Mar 09 13:36:59 crc kubenswrapper[4723]: I0309 13:36:59.600417 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4158b731-ab92-4158-8755-85f1984d0939-catalog-content\") pod \"4158b731-ab92-4158-8755-85f1984d0939\" (UID: \"4158b731-ab92-4158-8755-85f1984d0939\") " Mar 09 13:36:59 crc kubenswrapper[4723]: I0309 13:36:59.604669 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4158b731-ab92-4158-8755-85f1984d0939-utilities" (OuterVolumeSpecName: "utilities") pod "4158b731-ab92-4158-8755-85f1984d0939" (UID: "4158b731-ab92-4158-8755-85f1984d0939"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:36:59 crc kubenswrapper[4723]: I0309 13:36:59.610686 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4158b731-ab92-4158-8755-85f1984d0939-kube-api-access-4gjlk" (OuterVolumeSpecName: "kube-api-access-4gjlk") pod "4158b731-ab92-4158-8755-85f1984d0939" (UID: "4158b731-ab92-4158-8755-85f1984d0939"). InnerVolumeSpecName "kube-api-access-4gjlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:36:59 crc kubenswrapper[4723]: I0309 13:36:59.635849 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4158b731-ab92-4158-8755-85f1984d0939-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4158b731-ab92-4158-8755-85f1984d0939" (UID: "4158b731-ab92-4158-8755-85f1984d0939"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:36:59 crc kubenswrapper[4723]: I0309 13:36:59.703628 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gjlk\" (UniqueName: \"kubernetes.io/projected/4158b731-ab92-4158-8755-85f1984d0939-kube-api-access-4gjlk\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:59 crc kubenswrapper[4723]: I0309 13:36:59.703660 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4158b731-ab92-4158-8755-85f1984d0939-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:59 crc kubenswrapper[4723]: I0309 13:36:59.703669 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4158b731-ab92-4158-8755-85f1984d0939-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.064350 4723 generic.go:334] "Generic (PLEG): container finished" podID="4158b731-ab92-4158-8755-85f1984d0939" containerID="90a06777a640113d2f1828b21257a8810a9fac0643688293784fa95c33df12b0" exitCode=0 Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.064416 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2bmm" Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.064421 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2bmm" event={"ID":"4158b731-ab92-4158-8755-85f1984d0939","Type":"ContainerDied","Data":"90a06777a640113d2f1828b21257a8810a9fac0643688293784fa95c33df12b0"} Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.064792 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2bmm" event={"ID":"4158b731-ab92-4158-8755-85f1984d0939","Type":"ContainerDied","Data":"fa53e210fdfabf536d863642d3bf8d504dc7b7b5a3167741b155f48ccee3efb0"} Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.064816 4723 scope.go:117] "RemoveContainer" containerID="90a06777a640113d2f1828b21257a8810a9fac0643688293784fa95c33df12b0" Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.088829 4723 scope.go:117] "RemoveContainer" containerID="0ace0a16326a8b934669b60d15291829593af6c87b4478384ce59ed4bdcb7428" Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.104125 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2bmm"] Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.113308 4723 scope.go:117] "RemoveContainer" containerID="6d4ab3b9ccabab7e56c6402e7ac773eb990af505f0f75e61faa84dc317318ceb" Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.115576 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2bmm"] Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.184332 4723 scope.go:117] "RemoveContainer" containerID="90a06777a640113d2f1828b21257a8810a9fac0643688293784fa95c33df12b0" Mar 09 13:37:00 crc kubenswrapper[4723]: E0309 13:37:00.184745 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90a06777a640113d2f1828b21257a8810a9fac0643688293784fa95c33df12b0\": container with ID starting with 90a06777a640113d2f1828b21257a8810a9fac0643688293784fa95c33df12b0 not found: ID does not exist" containerID="90a06777a640113d2f1828b21257a8810a9fac0643688293784fa95c33df12b0" Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.184775 4723 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a06777a640113d2f1828b21257a8810a9fac0643688293784fa95c33df12b0"} err="failed to get container status \"90a06777a640113d2f1828b21257a8810a9fac0643688293784fa95c33df12b0\": rpc error: code = NotFound desc = could not find container \"90a06777a640113d2f1828b21257a8810a9fac0643688293784fa95c33df12b0\": container with ID starting with 90a06777a640113d2f1828b21257a8810a9fac0643688293784fa95c33df12b0 not found: ID does not exist" Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.184797 4723 scope.go:117] "RemoveContainer" containerID="0ace0a16326a8b934669b60d15291829593af6c87b4478384ce59ed4bdcb7428" Mar 09 13:37:00 crc kubenswrapper[4723]: E0309 13:37:00.185158 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ace0a16326a8b934669b60d15291829593af6c87b4478384ce59ed4bdcb7428\": container with ID starting with 0ace0a16326a8b934669b60d15291829593af6c87b4478384ce59ed4bdcb7428 not found: ID does not exist" containerID="0ace0a16326a8b934669b60d15291829593af6c87b4478384ce59ed4bdcb7428" Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.185182 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ace0a16326a8b934669b60d15291829593af6c87b4478384ce59ed4bdcb7428"} err="failed to get container status \"0ace0a16326a8b934669b60d15291829593af6c87b4478384ce59ed4bdcb7428\": rpc error: code = NotFound desc = could not find container \"0ace0a16326a8b934669b60d15291829593af6c87b4478384ce59ed4bdcb7428\": container with ID starting with 0ace0a16326a8b934669b60d15291829593af6c87b4478384ce59ed4bdcb7428 not found: ID does not exist" Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.185197 4723 scope.go:117] "RemoveContainer" containerID="6d4ab3b9ccabab7e56c6402e7ac773eb990af505f0f75e61faa84dc317318ceb" Mar 09 13:37:00 crc kubenswrapper[4723]: E0309 13:37:00.185460 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4ab3b9ccabab7e56c6402e7ac773eb990af505f0f75e61faa84dc317318ceb\": container with ID starting with 6d4ab3b9ccabab7e56c6402e7ac773eb990af505f0f75e61faa84dc317318ceb not found: ID does not exist" containerID="6d4ab3b9ccabab7e56c6402e7ac773eb990af505f0f75e61faa84dc317318ceb" Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.185479 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4ab3b9ccabab7e56c6402e7ac773eb990af505f0f75e61faa84dc317318ceb"} err="failed to get container status \"6d4ab3b9ccabab7e56c6402e7ac773eb990af505f0f75e61faa84dc317318ceb\": rpc error: code = NotFound desc = could not find container \"6d4ab3b9ccabab7e56c6402e7ac773eb990af505f0f75e61faa84dc317318ceb\": container with ID starting with 6d4ab3b9ccabab7e56c6402e7ac773eb990af505f0f75e61faa84dc317318ceb not found: ID does not exist" Mar 09 13:37:00 crc kubenswrapper[4723]: I0309 13:37:00.901653 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4158b731-ab92-4158-8755-85f1984d0939" path="/var/lib/kubelet/pods/4158b731-ab92-4158-8755-85f1984d0939/volumes" Mar 09 13:37:03 crc kubenswrapper[4723]: I0309 13:37:03.946431 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:37:03 crc kubenswrapper[4723]: I0309 13:37:03.946950 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:37:03 crc kubenswrapper[4723]: I0309 13:37:03.947005 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:37:03 crc kubenswrapper[4723]: I0309 13:37:03.948018 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"62e8c3719a2b94e2de1825a84035da872a22177f9c2e679f00aeaa34e1f4ffd0"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:37:03 crc kubenswrapper[4723]: I0309 13:37:03.948074 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://62e8c3719a2b94e2de1825a84035da872a22177f9c2e679f00aeaa34e1f4ffd0" gracePeriod=600 Mar 09 13:37:04 crc kubenswrapper[4723]: I0309 13:37:04.112020 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="62e8c3719a2b94e2de1825a84035da872a22177f9c2e679f00aeaa34e1f4ffd0" exitCode=0 Mar 09 13:37:04 crc kubenswrapper[4723]: I0309 13:37:04.112074 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"62e8c3719a2b94e2de1825a84035da872a22177f9c2e679f00aeaa34e1f4ffd0"} Mar 09 13:37:04 crc kubenswrapper[4723]: I0309 13:37:04.112116 4723 scope.go:117] "RemoveContainer" containerID="a02e51157ce32a3f4545e91341ee80bb0f1a0c3eaea82ea1beb5091e2f5cef4b" Mar 09 13:37:05 crc kubenswrapper[4723]: E0309 13:37:05.036078 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291dce87_7d45_474d_b2a9_c7c21eea25aa.slice/crio-d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:37:05 crc kubenswrapper[4723]: I0309 13:37:05.123100 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a"} Mar 09 13:37:06 crc kubenswrapper[4723]: E0309 13:37:06.571222 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291dce87_7d45_474d_b2a9_c7c21eea25aa.slice/crio-d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:37:08 crc kubenswrapper[4723]: I0309 13:37:08.156025 4723 generic.go:334] "Generic (PLEG): container finished" 
podID="940da565-ce26-42f3-a35d-dbaa3efd8521" containerID="8fca6b986936c22493e8d65ad53f7aa9c0c53cb6ad98a6433c32bf028e65638d" exitCode=0 Mar 09 13:37:08 crc kubenswrapper[4723]: I0309 13:37:08.156123 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" event={"ID":"940da565-ce26-42f3-a35d-dbaa3efd8521","Type":"ContainerDied","Data":"8fca6b986936c22493e8d65ad53f7aa9c0c53cb6ad98a6433c32bf028e65638d"} Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.642749 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.665827 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-ovn-combined-ca-bundle\") pod \"940da565-ce26-42f3-a35d-dbaa3efd8521\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.666615 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/940da565-ce26-42f3-a35d-dbaa3efd8521-ovncontroller-config-0\") pod \"940da565-ce26-42f3-a35d-dbaa3efd8521\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.666944 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-ssh-key-openstack-edpm-ipam\") pod \"940da565-ce26-42f3-a35d-dbaa3efd8521\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.667088 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-inventory\") pod \"940da565-ce26-42f3-a35d-dbaa3efd8521\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.667184 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntgkq\" (UniqueName: \"kubernetes.io/projected/940da565-ce26-42f3-a35d-dbaa3efd8521-kube-api-access-ntgkq\") pod \"940da565-ce26-42f3-a35d-dbaa3efd8521\" (UID: \"940da565-ce26-42f3-a35d-dbaa3efd8521\") " Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.674711 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/940da565-ce26-42f3-a35d-dbaa3efd8521-kube-api-access-ntgkq" (OuterVolumeSpecName: "kube-api-access-ntgkq") pod "940da565-ce26-42f3-a35d-dbaa3efd8521" (UID: "940da565-ce26-42f3-a35d-dbaa3efd8521"). InnerVolumeSpecName "kube-api-access-ntgkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.686110 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "940da565-ce26-42f3-a35d-dbaa3efd8521" (UID: "940da565-ce26-42f3-a35d-dbaa3efd8521"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.707241 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "940da565-ce26-42f3-a35d-dbaa3efd8521" (UID: "940da565-ce26-42f3-a35d-dbaa3efd8521"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.713579 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-inventory" (OuterVolumeSpecName: "inventory") pod "940da565-ce26-42f3-a35d-dbaa3efd8521" (UID: "940da565-ce26-42f3-a35d-dbaa3efd8521"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.717573 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/940da565-ce26-42f3-a35d-dbaa3efd8521-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "940da565-ce26-42f3-a35d-dbaa3efd8521" (UID: "940da565-ce26-42f3-a35d-dbaa3efd8521"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.770853 4723 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/940da565-ce26-42f3-a35d-dbaa3efd8521-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.770911 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.770928 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.770938 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntgkq\" (UniqueName: \"kubernetes.io/projected/940da565-ce26-42f3-a35d-dbaa3efd8521-kube-api-access-ntgkq\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:09 crc kubenswrapper[4723]: I0309 13:37:09.770951 4723 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940da565-ce26-42f3-a35d-dbaa3efd8521-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.176008 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" event={"ID":"940da565-ce26-42f3-a35d-dbaa3efd8521","Type":"ContainerDied","Data":"b3fafbf5967618a6f7cfd3437abf0147433161f76b5e3d2a59a76f3a59fbc9de"} Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.176355 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3fafbf5967618a6f7cfd3437abf0147433161f76b5e3d2a59a76f3a59fbc9de" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.176085 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b8zjw" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.277566 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q"] Mar 09 13:37:10 crc kubenswrapper[4723]: E0309 13:37:10.278173 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291dce87-7d45-474d-b2a9-c7c21eea25aa" containerName="registry-server" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.278195 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="291dce87-7d45-474d-b2a9-c7c21eea25aa" containerName="registry-server" Mar 09 13:37:10 crc kubenswrapper[4723]: E0309 13:37:10.278220 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4158b731-ab92-4158-8755-85f1984d0939" containerName="extract-utilities" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.278228 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="4158b731-ab92-4158-8755-85f1984d0939" containerName="extract-utilities" Mar 09 13:37:10 crc kubenswrapper[4723]: E0309 13:37:10.278242 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4158b731-ab92-4158-8755-85f1984d0939" containerName="registry-server" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.278248 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="4158b731-ab92-4158-8755-85f1984d0939" containerName="registry-server" Mar 09 13:37:10 crc kubenswrapper[4723]: E0309 13:37:10.278262 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940da565-ce26-42f3-a35d-dbaa3efd8521" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.278268 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="940da565-ce26-42f3-a35d-dbaa3efd8521" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 13:37:10 crc kubenswrapper[4723]: E0309 13:37:10.278290 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291dce87-7d45-474d-b2a9-c7c21eea25aa" containerName="extract-content" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.278297 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="291dce87-7d45-474d-b2a9-c7c21eea25aa" containerName="extract-content" Mar 09 13:37:10 crc kubenswrapper[4723]: E0309 13:37:10.278313 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291dce87-7d45-474d-b2a9-c7c21eea25aa" containerName="extract-utilities" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.278320 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="291dce87-7d45-474d-b2a9-c7c21eea25aa" containerName="extract-utilities" Mar 09 13:37:10 crc kubenswrapper[4723]: E0309 13:37:10.278356 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4158b731-ab92-4158-8755-85f1984d0939" containerName="extract-content" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.278363 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="4158b731-ab92-4158-8755-85f1984d0939" containerName="extract-content" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.278607 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="940da565-ce26-42f3-a35d-dbaa3efd8521" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.278624 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="291dce87-7d45-474d-b2a9-c7c21eea25aa" containerName="registry-server" Mar 09 
13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.278640 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="4158b731-ab92-4158-8755-85f1984d0939" containerName="registry-server" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.279486 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.283733 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.283890 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.283948 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.284031 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.284067 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.284166 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.293746 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q"] Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.383500 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k44sf\" (UniqueName: \"kubernetes.io/projected/eccdafec-6101-40db-8d3a-a7141546b0b5-kube-api-access-k44sf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.383553 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.383597 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.383623 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" 
(UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.383661 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.383711 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.486066 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k44sf\" (UniqueName: \"kubernetes.io/projected/eccdafec-6101-40db-8d3a-a7141546b0b5-kube-api-access-k44sf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.486121 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.486170 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.486190 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.486222 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.486275 4723 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.491120 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.492016 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.492154 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.492626 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.506689 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.508788 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k44sf\" (UniqueName: \"kubernetes.io/projected/eccdafec-6101-40db-8d3a-a7141546b0b5-kube-api-access-k44sf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:10 crc kubenswrapper[4723]: I0309 13:37:10.596347 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:11 crc kubenswrapper[4723]: I0309 13:37:11.138677 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q"] Mar 09 13:37:11 crc kubenswrapper[4723]: I0309 13:37:11.193238 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" event={"ID":"eccdafec-6101-40db-8d3a-a7141546b0b5","Type":"ContainerStarted","Data":"ddfa776e0a1d2325913d155eca2650a2b2a6ecac70571f57b924e315257cf463"} Mar 09 13:37:12 crc kubenswrapper[4723]: I0309 13:37:12.213176 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" event={"ID":"eccdafec-6101-40db-8d3a-a7141546b0b5","Type":"ContainerStarted","Data":"dc4391f99c4110f0be5bba7c38f26fcd5fe91bf835a304496121261e7fd505d1"} Mar 09 13:37:12 crc kubenswrapper[4723]: I0309 13:37:12.236157 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" podStartSLOduration=1.6198407289999999 podStartE2EDuration="2.236139896s" podCreationTimestamp="2026-03-09 13:37:10 +0000 UTC" firstStartedPulling="2026-03-09 13:37:11.14300381 +0000 UTC m=+2305.157471350" lastFinishedPulling="2026-03-09 13:37:11.759302967 +0000 UTC m=+2305.773770517" observedRunningTime="2026-03-09 13:37:12.234267016 +0000 UTC m=+2306.248734566" watchObservedRunningTime="2026-03-09 13:37:12.236139896 +0000 UTC m=+2306.250607436" Mar 09 13:37:15 crc kubenswrapper[4723]: E0309 13:37:15.372840 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291dce87_7d45_474d_b2a9_c7c21eea25aa.slice/crio-d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:37:21 crc kubenswrapper[4723]: E0309 13:37:21.817388 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291dce87_7d45_474d_b2a9_c7c21eea25aa.slice/crio-d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:37:24 crc kubenswrapper[4723]: I0309 13:37:24.579603 4723 scope.go:117] "RemoveContainer" containerID="57130dd725d7cc21c2a89d5bda87ab79dfee89089bd9e3287144c0588f8dd44b" Mar 09 13:37:25 crc kubenswrapper[4723]: E0309 13:37:25.430476 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291dce87_7d45_474d_b2a9_c7c21eea25aa.slice/crio-d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:37:35 crc kubenswrapper[4723]: E0309 13:37:35.700340 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291dce87_7d45_474d_b2a9_c7c21eea25aa.slice/crio-d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:37:36 crc kubenswrapper[4723]: E0309 13:37:36.567193 4723 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291dce87_7d45_474d_b2a9_c7c21eea25aa.slice/crio-d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:37:46 crc kubenswrapper[4723]: E0309 13:37:46.018794 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291dce87_7d45_474d_b2a9_c7c21eea25aa.slice/crio-d74141cd0c956ee569c582e1d8af396ce8b0871ea518f91e73718a7dc31c91f0.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:37:55 crc kubenswrapper[4723]: I0309 13:37:55.686766 4723 generic.go:334] "Generic (PLEG): container finished" podID="eccdafec-6101-40db-8d3a-a7141546b0b5" containerID="dc4391f99c4110f0be5bba7c38f26fcd5fe91bf835a304496121261e7fd505d1" exitCode=0 Mar 09 13:37:55 crc kubenswrapper[4723]: I0309 13:37:55.686840 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" event={"ID":"eccdafec-6101-40db-8d3a-a7141546b0b5","Type":"ContainerDied","Data":"dc4391f99c4110f0be5bba7c38f26fcd5fe91bf835a304496121261e7fd505d1"} Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.199232 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.336816 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-nova-metadata-neutron-config-0\") pod \"eccdafec-6101-40db-8d3a-a7141546b0b5\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.336965 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-inventory\") pod \"eccdafec-6101-40db-8d3a-a7141546b0b5\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.337094 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"eccdafec-6101-40db-8d3a-a7141546b0b5\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.337183 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-neutron-metadata-combined-ca-bundle\") pod \"eccdafec-6101-40db-8d3a-a7141546b0b5\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.337214 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-ssh-key-openstack-edpm-ipam\") pod \"eccdafec-6101-40db-8d3a-a7141546b0b5\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.337245 4723 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k44sf\" (UniqueName: \"kubernetes.io/projected/eccdafec-6101-40db-8d3a-a7141546b0b5-kube-api-access-k44sf\") pod \"eccdafec-6101-40db-8d3a-a7141546b0b5\" (UID: \"eccdafec-6101-40db-8d3a-a7141546b0b5\") " Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.342217 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "eccdafec-6101-40db-8d3a-a7141546b0b5" (UID: "eccdafec-6101-40db-8d3a-a7141546b0b5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.344241 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eccdafec-6101-40db-8d3a-a7141546b0b5-kube-api-access-k44sf" (OuterVolumeSpecName: "kube-api-access-k44sf") pod "eccdafec-6101-40db-8d3a-a7141546b0b5" (UID: "eccdafec-6101-40db-8d3a-a7141546b0b5"). InnerVolumeSpecName "kube-api-access-k44sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.376932 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-inventory" (OuterVolumeSpecName: "inventory") pod "eccdafec-6101-40db-8d3a-a7141546b0b5" (UID: "eccdafec-6101-40db-8d3a-a7141546b0b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.378875 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "eccdafec-6101-40db-8d3a-a7141546b0b5" (UID: "eccdafec-6101-40db-8d3a-a7141546b0b5"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.381193 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "eccdafec-6101-40db-8d3a-a7141546b0b5" (UID: "eccdafec-6101-40db-8d3a-a7141546b0b5"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.392371 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eccdafec-6101-40db-8d3a-a7141546b0b5" (UID: "eccdafec-6101-40db-8d3a-a7141546b0b5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.439788 4723 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.440061 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.440074 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k44sf\" (UniqueName: \"kubernetes.io/projected/eccdafec-6101-40db-8d3a-a7141546b0b5-kube-api-access-k44sf\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.440083 4723 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.440092 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.440102 4723 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eccdafec-6101-40db-8d3a-a7141546b0b5-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.708822 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" event={"ID":"eccdafec-6101-40db-8d3a-a7141546b0b5","Type":"ContainerDied","Data":"ddfa776e0a1d2325913d155eca2650a2b2a6ecac70571f57b924e315257cf463"} Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.708911 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddfa776e0a1d2325913d155eca2650a2b2a6ecac70571f57b924e315257cf463" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.708905 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.846351 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7"] Mar 09 13:37:57 crc kubenswrapper[4723]: E0309 13:37:57.846795 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eccdafec-6101-40db-8d3a-a7141546b0b5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.846810 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="eccdafec-6101-40db-8d3a-a7141546b0b5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.847055 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="eccdafec-6101-40db-8d3a-a7141546b0b5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.847786 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.850322 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.850581 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.850786 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.851711 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.851916 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.908242 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7"] Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.950222 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.950293 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.950812 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.950856 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:57 crc kubenswrapper[4723]: I0309 13:37:57.951069 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r28f4\" (UniqueName: \"kubernetes.io/projected/e7260959-2422-4f47-b153-63bba9a58875-kube-api-access-r28f4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:58 crc kubenswrapper[4723]: I0309 13:37:58.052825 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:58 crc kubenswrapper[4723]: I0309 13:37:58.052909 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:58 crc kubenswrapper[4723]: I0309 13:37:58.053102 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:58 crc kubenswrapper[4723]: I0309 13:37:58.053132 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:58 crc kubenswrapper[4723]: I0309 13:37:58.053235 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r28f4\" (UniqueName: \"kubernetes.io/projected/e7260959-2422-4f47-b153-63bba9a58875-kube-api-access-r28f4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:58 crc kubenswrapper[4723]: I0309 13:37:58.057264 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: 
\"e7260959-2422-4f47-b153-63bba9a58875\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:58 crc kubenswrapper[4723]: I0309 13:37:58.059548 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:58 crc kubenswrapper[4723]: I0309 13:37:58.059552 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:58 crc kubenswrapper[4723]: I0309 13:37:58.072243 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:58 crc kubenswrapper[4723]: I0309 13:37:58.079564 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r28f4\" (UniqueName: \"kubernetes.io/projected/e7260959-2422-4f47-b153-63bba9a58875-kube-api-access-r28f4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:58 crc kubenswrapper[4723]: I0309 13:37:58.188480 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:37:58 crc kubenswrapper[4723]: I0309 13:37:58.795176 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7"] Mar 09 13:37:59 crc kubenswrapper[4723]: I0309 13:37:59.729037 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" event={"ID":"e7260959-2422-4f47-b153-63bba9a58875","Type":"ContainerStarted","Data":"45dea2a39e04b3bf4ad6ac7104aad6384376e5e1ecb74e56d5c9a3b04d9caf3f"} Mar 09 13:38:00 crc kubenswrapper[4723]: I0309 13:38:00.141223 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551058-zclwx"] Mar 09 13:38:00 crc kubenswrapper[4723]: I0309 13:38:00.143078 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-zclwx" Mar 09 13:38:00 crc kubenswrapper[4723]: I0309 13:38:00.145420 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:38:00 crc kubenswrapper[4723]: I0309 13:38:00.145735 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:38:00 crc kubenswrapper[4723]: I0309 13:38:00.145935 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:38:00 crc kubenswrapper[4723]: I0309 13:38:00.151802 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-zclwx"] Mar 09 13:38:00 crc kubenswrapper[4723]: I0309 13:38:00.220818 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6c7x\" (UniqueName: \"kubernetes.io/projected/90bf326b-8f21-4736-8ab3-5265edda19ff-kube-api-access-z6c7x\") pod \"auto-csr-approver-29551058-zclwx\" (UID: \"90bf326b-8f21-4736-8ab3-5265edda19ff\") " pod="openshift-infra/auto-csr-approver-29551058-zclwx" Mar 09 13:38:00 crc kubenswrapper[4723]: I0309 13:38:00.325080 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6c7x\" (UniqueName: \"kubernetes.io/projected/90bf326b-8f21-4736-8ab3-5265edda19ff-kube-api-access-z6c7x\") pod \"auto-csr-approver-29551058-zclwx\" (UID: \"90bf326b-8f21-4736-8ab3-5265edda19ff\") " pod="openshift-infra/auto-csr-approver-29551058-zclwx" Mar 09 13:38:00 crc kubenswrapper[4723]: I0309 13:38:00.343547 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6c7x\" (UniqueName: \"kubernetes.io/projected/90bf326b-8f21-4736-8ab3-5265edda19ff-kube-api-access-z6c7x\") pod \"auto-csr-approver-29551058-zclwx\" (UID: \"90bf326b-8f21-4736-8ab3-5265edda19ff\") " pod="openshift-infra/auto-csr-approver-29551058-zclwx" Mar 09 13:38:00 crc kubenswrapper[4723]: I0309 13:38:00.515049 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-zclwx" Mar 09 13:38:00 crc kubenswrapper[4723]: I0309 13:38:00.747157 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" event={"ID":"e7260959-2422-4f47-b153-63bba9a58875","Type":"ContainerStarted","Data":"90c9bc96b10a4afbc9dc30e34f380099663ac53cd51faf931d81685f664169e8"} Mar 09 13:38:00 crc kubenswrapper[4723]: I0309 13:38:00.770704 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" podStartSLOduration=2.3180169250000002 podStartE2EDuration="3.770685012s" podCreationTimestamp="2026-03-09 13:37:57 +0000 UTC" firstStartedPulling="2026-03-09 13:37:58.807783742 +0000 UTC m=+2352.822251282" lastFinishedPulling="2026-03-09 13:38:00.260451829 +0000 UTC m=+2354.274919369" observedRunningTime="2026-03-09 13:38:00.761421015 +0000 UTC m=+2354.775888555" watchObservedRunningTime="2026-03-09 13:38:00.770685012 +0000 UTC m=+2354.785152552" Mar 09 13:38:01 crc kubenswrapper[4723]: W0309 13:38:01.008605 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90bf326b_8f21_4736_8ab3_5265edda19ff.slice/crio-63345517e3718b12bd9c94d77a06db0052e104be5cfd219d01794018e3a3468e WatchSource:0}: Error finding container 63345517e3718b12bd9c94d77a06db0052e104be5cfd219d01794018e3a3468e: Status 404 returned error can't find the container with id 63345517e3718b12bd9c94d77a06db0052e104be5cfd219d01794018e3a3468e Mar 09 13:38:01 crc kubenswrapper[4723]: I0309 13:38:01.015512 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-zclwx"] Mar 09 13:38:01 crc kubenswrapper[4723]: I0309 13:38:01.760513 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-zclwx" event={"ID":"90bf326b-8f21-4736-8ab3-5265edda19ff","Type":"ContainerStarted","Data":"63345517e3718b12bd9c94d77a06db0052e104be5cfd219d01794018e3a3468e"} Mar 09 13:38:02 crc kubenswrapper[4723]: I0309 13:38:02.775121 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-zclwx" event={"ID":"90bf326b-8f21-4736-8ab3-5265edda19ff","Type":"ContainerStarted","Data":"ca2e8c64945ada93e1c0cb567226db661b96341bc037f6faa04fee026aad8285"} Mar 09 13:38:02 crc kubenswrapper[4723]: I0309 13:38:02.792389 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551058-zclwx" podStartSLOduration=1.534942971 podStartE2EDuration="2.792372988s" podCreationTimestamp="2026-03-09 13:38:00 +0000 UTC" firstStartedPulling="2026-03-09 13:38:01.012134144 +0000 UTC m=+2355.026601684" lastFinishedPulling="2026-03-09 13:38:02.269564151 +0000 UTC m=+2356.284031701" observedRunningTime="2026-03-09 13:38:02.790461187 +0000 UTC m=+2356.804928737" watchObservedRunningTime="2026-03-09 13:38:02.792372988 +0000 UTC m=+2356.806840528" Mar 09 13:38:03 crc kubenswrapper[4723]: I0309 13:38:03.790709 4723 generic.go:334] "Generic (PLEG): container finished" podID="90bf326b-8f21-4736-8ab3-5265edda19ff" containerID="ca2e8c64945ada93e1c0cb567226db661b96341bc037f6faa04fee026aad8285" exitCode=0 Mar 09 13:38:03 crc kubenswrapper[4723]: I0309 13:38:03.790818 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-zclwx" 
event={"ID":"90bf326b-8f21-4736-8ab3-5265edda19ff","Type":"ContainerDied","Data":"ca2e8c64945ada93e1c0cb567226db661b96341bc037f6faa04fee026aad8285"} Mar 09 13:38:05 crc kubenswrapper[4723]: I0309 13:38:05.184998 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-zclwx" Mar 09 13:38:05 crc kubenswrapper[4723]: I0309 13:38:05.253720 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6c7x\" (UniqueName: \"kubernetes.io/projected/90bf326b-8f21-4736-8ab3-5265edda19ff-kube-api-access-z6c7x\") pod \"90bf326b-8f21-4736-8ab3-5265edda19ff\" (UID: \"90bf326b-8f21-4736-8ab3-5265edda19ff\") " Mar 09 13:38:05 crc kubenswrapper[4723]: I0309 13:38:05.260132 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90bf326b-8f21-4736-8ab3-5265edda19ff-kube-api-access-z6c7x" (OuterVolumeSpecName: "kube-api-access-z6c7x") pod "90bf326b-8f21-4736-8ab3-5265edda19ff" (UID: "90bf326b-8f21-4736-8ab3-5265edda19ff"). InnerVolumeSpecName "kube-api-access-z6c7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:38:05 crc kubenswrapper[4723]: I0309 13:38:05.357179 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6c7x\" (UniqueName: \"kubernetes.io/projected/90bf326b-8f21-4736-8ab3-5265edda19ff-kube-api-access-z6c7x\") on node \"crc\" DevicePath \"\"" Mar 09 13:38:05 crc kubenswrapper[4723]: I0309 13:38:05.813287 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-zclwx" event={"ID":"90bf326b-8f21-4736-8ab3-5265edda19ff","Type":"ContainerDied","Data":"63345517e3718b12bd9c94d77a06db0052e104be5cfd219d01794018e3a3468e"} Mar 09 13:38:05 crc kubenswrapper[4723]: I0309 13:38:05.813323 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63345517e3718b12bd9c94d77a06db0052e104be5cfd219d01794018e3a3468e" Mar 09 13:38:05 crc kubenswrapper[4723]: I0309 13:38:05.813332 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-zclwx" Mar 09 13:38:05 crc kubenswrapper[4723]: I0309 13:38:05.908540 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-h547x"] Mar 09 13:38:05 crc kubenswrapper[4723]: I0309 13:38:05.919014 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-h547x"] Mar 09 13:38:06 crc kubenswrapper[4723]: I0309 13:38:06.894768 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d1805eb-e0eb-46e7-936f-5d6cc76753ae" path="/var/lib/kubelet/pods/8d1805eb-e0eb-46e7-936f-5d6cc76753ae/volumes" Mar 09 13:38:24 crc kubenswrapper[4723]: I0309 13:38:24.694016 4723 scope.go:117] "RemoveContainer" containerID="5dc3959b3c7ef0edd50a086a00ddd8e4fbd763b4bf13dcdd107bce49c2a4f70d" Mar 09 13:39:33 crc kubenswrapper[4723]: I0309 13:39:33.946751 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:39:33 crc kubenswrapper[4723]: I0309 13:39:33.947341 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:40:00 crc kubenswrapper[4723]: I0309 13:40:00.143583 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551060-gppnd"] Mar 09 13:40:00 crc kubenswrapper[4723]: E0309 13:40:00.144767 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90bf326b-8f21-4736-8ab3-5265edda19ff" containerName="oc" Mar 09 13:40:00 crc kubenswrapper[4723]: I0309 13:40:00.144786 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="90bf326b-8f21-4736-8ab3-5265edda19ff" containerName="oc" Mar 09 13:40:00 crc kubenswrapper[4723]: I0309 13:40:00.145331 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="90bf326b-8f21-4736-8ab3-5265edda19ff" containerName="oc" Mar 09 13:40:00 crc kubenswrapper[4723]: I0309 13:40:00.146384 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-gppnd" Mar 09 13:40:00 crc kubenswrapper[4723]: I0309 13:40:00.147831 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:40:00 crc kubenswrapper[4723]: I0309 13:40:00.148242 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:40:00 crc kubenswrapper[4723]: I0309 13:40:00.150697 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:40:00 crc kubenswrapper[4723]: I0309 13:40:00.167392 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-gppnd"] Mar 09 13:40:00 crc kubenswrapper[4723]: I0309 13:40:00.339688 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swbbc\" (UniqueName: \"kubernetes.io/projected/6c87d396-1f7f-4388-a238-686ce16cfc80-kube-api-access-swbbc\") pod \"auto-csr-approver-29551060-gppnd\" (UID: \"6c87d396-1f7f-4388-a238-686ce16cfc80\") " pod="openshift-infra/auto-csr-approver-29551060-gppnd" Mar 09 13:40:00 crc kubenswrapper[4723]: I0309 13:40:00.443797 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swbbc\" (UniqueName: \"kubernetes.io/projected/6c87d396-1f7f-4388-a238-686ce16cfc80-kube-api-access-swbbc\") pod \"auto-csr-approver-29551060-gppnd\" (UID: \"6c87d396-1f7f-4388-a238-686ce16cfc80\") " pod="openshift-infra/auto-csr-approver-29551060-gppnd" Mar 09 13:40:00 crc kubenswrapper[4723]: I0309 13:40:00.462187 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swbbc\" (UniqueName: \"kubernetes.io/projected/6c87d396-1f7f-4388-a238-686ce16cfc80-kube-api-access-swbbc\") pod \"auto-csr-approver-29551060-gppnd\" (UID: \"6c87d396-1f7f-4388-a238-686ce16cfc80\") " pod="openshift-infra/auto-csr-approver-29551060-gppnd" Mar 09 13:40:00 crc kubenswrapper[4723]: I0309 13:40:00.470205 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-gppnd" Mar 09 13:40:00 crc kubenswrapper[4723]: I0309 13:40:00.955173 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-gppnd"] Mar 09 13:40:00 crc kubenswrapper[4723]: I0309 13:40:00.956469 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:40:01 crc kubenswrapper[4723]: I0309 13:40:01.025807 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551060-gppnd" event={"ID":"6c87d396-1f7f-4388-a238-686ce16cfc80","Type":"ContainerStarted","Data":"205b478ae3f83350bae0684f7f9bf6fda8574c929b2c0eaadd95ac5b5b421a84"} Mar 09 13:40:03 crc kubenswrapper[4723]: I0309 13:40:03.050007 4723 generic.go:334] "Generic (PLEG): container finished" podID="6c87d396-1f7f-4388-a238-686ce16cfc80" containerID="526c7407dbe1cea2e487ded638c31a27b44b03b9caa4897314f20f1a6d7ab913" exitCode=0 Mar 09 13:40:03 crc kubenswrapper[4723]: I0309 13:40:03.050113 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551060-gppnd" event={"ID":"6c87d396-1f7f-4388-a238-686ce16cfc80","Type":"ContainerDied","Data":"526c7407dbe1cea2e487ded638c31a27b44b03b9caa4897314f20f1a6d7ab913"} Mar 09 13:40:03 crc kubenswrapper[4723]: I0309 13:40:03.947979 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:40:03 crc kubenswrapper[4723]: I0309 13:40:03.948345 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:40:04 crc kubenswrapper[4723]: I0309 13:40:04.442667 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-gppnd" Mar 09 13:40:04 crc kubenswrapper[4723]: I0309 13:40:04.467302 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swbbc\" (UniqueName: \"kubernetes.io/projected/6c87d396-1f7f-4388-a238-686ce16cfc80-kube-api-access-swbbc\") pod \"6c87d396-1f7f-4388-a238-686ce16cfc80\" (UID: \"6c87d396-1f7f-4388-a238-686ce16cfc80\") " Mar 09 13:40:04 crc kubenswrapper[4723]: I0309 13:40:04.473633 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c87d396-1f7f-4388-a238-686ce16cfc80-kube-api-access-swbbc" (OuterVolumeSpecName: "kube-api-access-swbbc") pod "6c87d396-1f7f-4388-a238-686ce16cfc80" (UID: "6c87d396-1f7f-4388-a238-686ce16cfc80"). InnerVolumeSpecName "kube-api-access-swbbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:04 crc kubenswrapper[4723]: I0309 13:40:04.570630 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swbbc\" (UniqueName: \"kubernetes.io/projected/6c87d396-1f7f-4388-a238-686ce16cfc80-kube-api-access-swbbc\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:05 crc kubenswrapper[4723]: I0309 13:40:05.070852 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551060-gppnd" event={"ID":"6c87d396-1f7f-4388-a238-686ce16cfc80","Type":"ContainerDied","Data":"205b478ae3f83350bae0684f7f9bf6fda8574c929b2c0eaadd95ac5b5b421a84"} Mar 09 13:40:05 crc kubenswrapper[4723]: I0309 13:40:05.071343 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="205b478ae3f83350bae0684f7f9bf6fda8574c929b2c0eaadd95ac5b5b421a84" Mar 09 13:40:05 crc kubenswrapper[4723]: I0309 13:40:05.071097 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-gppnd" Mar 09 13:40:05 crc kubenswrapper[4723]: I0309 13:40:05.520087 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-wwv7q"] Mar 09 13:40:05 crc kubenswrapper[4723]: I0309 13:40:05.531294 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-wwv7q"] Mar 09 13:40:06 crc kubenswrapper[4723]: I0309 13:40:06.899752 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b695feec-4723-4e51-9092-a1d537d90fee" path="/var/lib/kubelet/pods/b695feec-4723-4e51-9092-a1d537d90fee/volumes" Mar 09 13:40:24 crc kubenswrapper[4723]: I0309 13:40:24.778239 4723 scope.go:117] "RemoveContainer" containerID="fcd4dc04f85767ac905cecfb2a4819cd805845946801d1bdbb7ffe035c726385" Mar 09 13:40:33 crc kubenswrapper[4723]: I0309 13:40:33.947625 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:40:33 crc kubenswrapper[4723]: I0309 13:40:33.948258 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:40:33 crc kubenswrapper[4723]: I0309 13:40:33.948315 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:40:33 crc kubenswrapper[4723]: I0309 13:40:33.949349 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:40:33 crc kubenswrapper[4723]: I0309 13:40:33.949415 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" 
containerID="cri-o://ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" gracePeriod=600 Mar 09 13:40:35 crc kubenswrapper[4723]: E0309 13:40:35.022660 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:40:35 crc kubenswrapper[4723]: I0309 13:40:35.392749 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" exitCode=0 Mar 09 13:40:35 crc kubenswrapper[4723]: I0309 13:40:35.392802 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a"} Mar 09 13:40:35 crc kubenswrapper[4723]: I0309 13:40:35.392846 4723 scope.go:117] "RemoveContainer" containerID="62e8c3719a2b94e2de1825a84035da872a22177f9c2e679f00aeaa34e1f4ffd0" Mar 09 13:40:35 crc kubenswrapper[4723]: I0309 13:40:35.393851 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:40:35 crc kubenswrapper[4723]: E0309 13:40:35.394532 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:40:46 crc kubenswrapper[4723]: I0309 13:40:46.883600 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:40:46 crc kubenswrapper[4723]: E0309 13:40:46.885391 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:41:01 crc kubenswrapper[4723]: I0309 13:41:01.881828 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:41:01 crc kubenswrapper[4723]: E0309 13:41:01.883389 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:41:15 crc kubenswrapper[4723]: I0309 13:41:15.881698 4723 scope.go:117] "RemoveContainer" 
containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:41:15 crc kubenswrapper[4723]: E0309 13:41:15.882716 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:41:26 crc kubenswrapper[4723]: I0309 13:41:26.889991 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:41:26 crc kubenswrapper[4723]: E0309 13:41:26.890627 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:41:40 crc kubenswrapper[4723]: I0309 13:41:40.882114 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:41:40 crc kubenswrapper[4723]: E0309 13:41:40.883078 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:41:41 crc kubenswrapper[4723]: I0309 13:41:41.129337 4723 generic.go:334] "Generic (PLEG): container finished" podID="e7260959-2422-4f47-b153-63bba9a58875" containerID="90c9bc96b10a4afbc9dc30e34f380099663ac53cd51faf931d81685f664169e8" exitCode=0 Mar 09 13:41:41 crc kubenswrapper[4723]: I0309 13:41:41.129462 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" event={"ID":"e7260959-2422-4f47-b153-63bba9a58875","Type":"ContainerDied","Data":"90c9bc96b10a4afbc9dc30e34f380099663ac53cd51faf931d81685f664169e8"} Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.618503 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.809736 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-ssh-key-openstack-edpm-ipam\") pod \"e7260959-2422-4f47-b153-63bba9a58875\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.810707 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-libvirt-combined-ca-bundle\") pod \"e7260959-2422-4f47-b153-63bba9a58875\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.810777 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r28f4\" (UniqueName: \"kubernetes.io/projected/e7260959-2422-4f47-b153-63bba9a58875-kube-api-access-r28f4\") pod \"e7260959-2422-4f47-b153-63bba9a58875\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.810890 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-inventory\") pod \"e7260959-2422-4f47-b153-63bba9a58875\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.811010 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-libvirt-secret-0\") pod \"e7260959-2422-4f47-b153-63bba9a58875\" (UID: \"e7260959-2422-4f47-b153-63bba9a58875\") " Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.815232 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7260959-2422-4f47-b153-63bba9a58875-kube-api-access-r28f4" (OuterVolumeSpecName: "kube-api-access-r28f4") pod "e7260959-2422-4f47-b153-63bba9a58875" (UID: "e7260959-2422-4f47-b153-63bba9a58875"). InnerVolumeSpecName "kube-api-access-r28f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.815815 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e7260959-2422-4f47-b153-63bba9a58875" (UID: "e7260959-2422-4f47-b153-63bba9a58875"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.844508 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e7260959-2422-4f47-b153-63bba9a58875" (UID: "e7260959-2422-4f47-b153-63bba9a58875"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.845469 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-inventory" (OuterVolumeSpecName: "inventory") pod "e7260959-2422-4f47-b153-63bba9a58875" (UID: "e7260959-2422-4f47-b153-63bba9a58875"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.852930 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e7260959-2422-4f47-b153-63bba9a58875" (UID: "e7260959-2422-4f47-b153-63bba9a58875"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.914116 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.914158 4723 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.914173 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.914184 4723 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7260959-2422-4f47-b153-63bba9a58875-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:42 crc kubenswrapper[4723]: I0309 13:41:42.914196 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r28f4\" (UniqueName: \"kubernetes.io/projected/e7260959-2422-4f47-b153-63bba9a58875-kube-api-access-r28f4\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.151095 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" event={"ID":"e7260959-2422-4f47-b153-63bba9a58875","Type":"ContainerDied","Data":"45dea2a39e04b3bf4ad6ac7104aad6384376e5e1ecb74e56d5c9a3b04d9caf3f"} Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.151454 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45dea2a39e04b3bf4ad6ac7104aad6384376e5e1ecb74e56d5c9a3b04d9caf3f" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.151147 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.248177 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv"] Mar 09 13:41:43 crc kubenswrapper[4723]: E0309 13:41:43.249142 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7260959-2422-4f47-b153-63bba9a58875" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.249167 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7260959-2422-4f47-b153-63bba9a58875" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 13:41:43 crc kubenswrapper[4723]: E0309 13:41:43.249219 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c87d396-1f7f-4388-a238-686ce16cfc80" containerName="oc" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.249228 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c87d396-1f7f-4388-a238-686ce16cfc80" containerName="oc" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.251999 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7260959-2422-4f47-b153-63bba9a58875" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.252041 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c87d396-1f7f-4388-a238-686ce16cfc80" containerName="oc" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.253091 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.255802 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.256249 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.263732 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv"] Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.265217 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.265307 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.265456 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.265871 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.283373 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.425922 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.425974 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.425998 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.426542 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.426590 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.426610 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.426785 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmgrx\" (UniqueName: \"kubernetes.io/projected/bd750bc2-9d58-4d3d-9d73-644c1bce9804-kube-api-access-pmgrx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.426832 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.426950 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.427063 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.427094 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.529115 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.529171 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.529193 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.529218 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.529244 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.529260 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.530187 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.530392 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmgrx\" (UniqueName: \"kubernetes.io/projected/bd750bc2-9d58-4d3d-9d73-644c1bce9804-kube-api-access-pmgrx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.530425 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.531105 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.531225 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.531252 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.533954 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.534474 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.535344 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.535422 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.537144 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.537588 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.539328 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.547373 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.547495 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.558358 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmgrx\" (UniqueName: 
\"kubernetes.io/projected/bd750bc2-9d58-4d3d-9d73-644c1bce9804-kube-api-access-pmgrx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sxtxv\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:43 crc kubenswrapper[4723]: I0309 13:41:43.584558 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:41:44 crc kubenswrapper[4723]: I0309 13:41:44.111618 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv"] Mar 09 13:41:44 crc kubenswrapper[4723]: I0309 13:41:44.167218 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" event={"ID":"bd750bc2-9d58-4d3d-9d73-644c1bce9804","Type":"ContainerStarted","Data":"7f1c6c7b3406be3f319268ad2732f7238d511c0a1e2bd6bce163b135c93b5d50"} Mar 09 13:41:45 crc kubenswrapper[4723]: I0309 13:41:45.178792 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" event={"ID":"bd750bc2-9d58-4d3d-9d73-644c1bce9804","Type":"ContainerStarted","Data":"5296953a4dae0054cada3c501535af6a9340f882f4c76a219690492b66dfa108"} Mar 09 13:41:45 crc kubenswrapper[4723]: I0309 13:41:45.206450 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" podStartSLOduration=1.58159735 podStartE2EDuration="2.206427075s" podCreationTimestamp="2026-03-09 13:41:43 +0000 UTC" firstStartedPulling="2026-03-09 13:41:44.105318703 +0000 UTC m=+2578.119786253" lastFinishedPulling="2026-03-09 13:41:44.730148438 +0000 UTC m=+2578.744615978" observedRunningTime="2026-03-09 13:41:45.197685503 +0000 UTC m=+2579.212153043" watchObservedRunningTime="2026-03-09 13:41:45.206427075 +0000 UTC m=+2579.220894625" Mar 09 13:41:55 crc kubenswrapper[4723]: I0309 13:41:55.881518 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:41:55 crc kubenswrapper[4723]: E0309 13:41:55.882276 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:42:00 crc kubenswrapper[4723]: I0309 13:42:00.137807 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551062-6nggw"] Mar 09 13:42:00 crc kubenswrapper[4723]: I0309 13:42:00.141673 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-6nggw" Mar 09 13:42:00 crc kubenswrapper[4723]: I0309 13:42:00.146311 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:42:00 crc kubenswrapper[4723]: I0309 13:42:00.146364 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:42:00 crc kubenswrapper[4723]: I0309 13:42:00.146612 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:42:00 crc kubenswrapper[4723]: I0309 13:42:00.151708 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-6nggw"] Mar 09 13:42:00 crc kubenswrapper[4723]: I0309 13:42:00.245204 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwvtz\" (UniqueName: \"kubernetes.io/projected/07a9bd7d-dc03-4b2c-9d9d-673da5110b61-kube-api-access-lwvtz\") pod \"auto-csr-approver-29551062-6nggw\" (UID: \"07a9bd7d-dc03-4b2c-9d9d-673da5110b61\") " pod="openshift-infra/auto-csr-approver-29551062-6nggw" Mar 09 13:42:00 crc kubenswrapper[4723]: I0309 13:42:00.348010 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwvtz\" (UniqueName: \"kubernetes.io/projected/07a9bd7d-dc03-4b2c-9d9d-673da5110b61-kube-api-access-lwvtz\") pod \"auto-csr-approver-29551062-6nggw\" (UID: \"07a9bd7d-dc03-4b2c-9d9d-673da5110b61\") " pod="openshift-infra/auto-csr-approver-29551062-6nggw" Mar 09 13:42:00 crc kubenswrapper[4723]: I0309 13:42:00.368314 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwvtz\" (UniqueName: \"kubernetes.io/projected/07a9bd7d-dc03-4b2c-9d9d-673da5110b61-kube-api-access-lwvtz\") pod \"auto-csr-approver-29551062-6nggw\" (UID: \"07a9bd7d-dc03-4b2c-9d9d-673da5110b61\") " pod="openshift-infra/auto-csr-approver-29551062-6nggw" Mar 09 13:42:00 crc kubenswrapper[4723]: I0309 13:42:00.465770 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-6nggw" Mar 09 13:42:00 crc kubenswrapper[4723]: I0309 13:42:00.923278 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-6nggw"] Mar 09 13:42:00 crc kubenswrapper[4723]: W0309 13:42:00.926692 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07a9bd7d_dc03_4b2c_9d9d_673da5110b61.slice/crio-e040d0c410fce031faae9a423fdb9c8987f057a43c10949f0b3f739d700859f5 WatchSource:0}: Error finding container e040d0c410fce031faae9a423fdb9c8987f057a43c10949f0b3f739d700859f5: Status 404 returned error can't find the container with id e040d0c410fce031faae9a423fdb9c8987f057a43c10949f0b3f739d700859f5 Mar 09 13:42:01 crc kubenswrapper[4723]: I0309 13:42:01.340758 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-6nggw" event={"ID":"07a9bd7d-dc03-4b2c-9d9d-673da5110b61","Type":"ContainerStarted","Data":"e040d0c410fce031faae9a423fdb9c8987f057a43c10949f0b3f739d700859f5"} Mar 09 13:42:02 crc kubenswrapper[4723]: I0309 13:42:02.365136 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-6nggw" event={"ID":"07a9bd7d-dc03-4b2c-9d9d-673da5110b61","Type":"ContainerStarted","Data":"c2eef59af587150b8d74827db9f407add65326c4259d10f929c1f0c8e2e7cc01"} Mar 09 13:42:02 crc kubenswrapper[4723]: I0309 13:42:02.383141 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551062-6nggw" podStartSLOduration=1.311091776 podStartE2EDuration="2.383115174s" podCreationTimestamp="2026-03-09 13:42:00 +0000 UTC" firstStartedPulling="2026-03-09 13:42:00.929976073 +0000 UTC m=+2594.944443643" lastFinishedPulling="2026-03-09 13:42:02.001999511 +0000 UTC m=+2596.016467041" observedRunningTime="2026-03-09 13:42:02.377679339 +0000 UTC m=+2596.392146879" watchObservedRunningTime="2026-03-09 13:42:02.383115174 +0000 UTC m=+2596.397582714" Mar 09 13:42:03 crc kubenswrapper[4723]: I0309 13:42:03.375344 4723 generic.go:334] "Generic (PLEG): container finished" podID="07a9bd7d-dc03-4b2c-9d9d-673da5110b61" containerID="c2eef59af587150b8d74827db9f407add65326c4259d10f929c1f0c8e2e7cc01" exitCode=0 Mar 09 13:42:03 crc kubenswrapper[4723]: I0309 13:42:03.375402 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-6nggw" event={"ID":"07a9bd7d-dc03-4b2c-9d9d-673da5110b61","Type":"ContainerDied","Data":"c2eef59af587150b8d74827db9f407add65326c4259d10f929c1f0c8e2e7cc01"} Mar 09 13:42:04 crc kubenswrapper[4723]: I0309 13:42:04.790802 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-6nggw" Mar 09 13:42:04 crc kubenswrapper[4723]: I0309 13:42:04.858636 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwvtz\" (UniqueName: \"kubernetes.io/projected/07a9bd7d-dc03-4b2c-9d9d-673da5110b61-kube-api-access-lwvtz\") pod \"07a9bd7d-dc03-4b2c-9d9d-673da5110b61\" (UID: \"07a9bd7d-dc03-4b2c-9d9d-673da5110b61\") " Mar 09 13:42:04 crc kubenswrapper[4723]: I0309 13:42:04.865064 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a9bd7d-dc03-4b2c-9d9d-673da5110b61-kube-api-access-lwvtz" (OuterVolumeSpecName: "kube-api-access-lwvtz") pod "07a9bd7d-dc03-4b2c-9d9d-673da5110b61" (UID: "07a9bd7d-dc03-4b2c-9d9d-673da5110b61"). InnerVolumeSpecName "kube-api-access-lwvtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:04 crc kubenswrapper[4723]: I0309 13:42:04.962115 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwvtz\" (UniqueName: \"kubernetes.io/projected/07a9bd7d-dc03-4b2c-9d9d-673da5110b61-kube-api-access-lwvtz\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:05 crc kubenswrapper[4723]: I0309 13:42:05.397445 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-6nggw" event={"ID":"07a9bd7d-dc03-4b2c-9d9d-673da5110b61","Type":"ContainerDied","Data":"e040d0c410fce031faae9a423fdb9c8987f057a43c10949f0b3f739d700859f5"} Mar 09 13:42:05 crc kubenswrapper[4723]: I0309 13:42:05.397487 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e040d0c410fce031faae9a423fdb9c8987f057a43c10949f0b3f739d700859f5" Mar 09 13:42:05 crc kubenswrapper[4723]: I0309 13:42:05.397544 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-6nggw" Mar 09 13:42:05 crc kubenswrapper[4723]: I0309 13:42:05.446936 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-rsmtp"] Mar 09 13:42:05 crc kubenswrapper[4723]: I0309 13:42:05.456961 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-rsmtp"] Mar 09 13:42:06 crc kubenswrapper[4723]: I0309 13:42:06.902135 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:42:06 crc kubenswrapper[4723]: I0309 13:42:06.902438 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d23ab36b-5283-431a-a5e3-b909c9ff2918" path="/var/lib/kubelet/pods/d23ab36b-5283-431a-a5e3-b909c9ff2918/volumes" Mar 09 13:42:06 crc kubenswrapper[4723]: E0309 13:42:06.902759 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:42:19 crc kubenswrapper[4723]: I0309 13:42:19.881595 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:42:19 crc kubenswrapper[4723]: E0309 13:42:19.882414 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:42:24 crc kubenswrapper[4723]: I0309 13:42:24.903902 4723 scope.go:117] "RemoveContainer" containerID="f253f31fb5be5fdf20bd5874577193a4b52ac11e5afaeb5994228e3aebd0e86a" Mar 09 13:42:32 crc kubenswrapper[4723]: I0309 13:42:32.882642 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:42:32 crc kubenswrapper[4723]: E0309 13:42:32.883718 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:42:43 crc kubenswrapper[4723]: I0309 13:42:43.881325 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:42:43 crc kubenswrapper[4723]: E0309 13:42:43.882212 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 
Mar 09 13:42:56 crc kubenswrapper[4723]: I0309 13:42:56.891550 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a"
Mar 09 13:42:56 crc kubenswrapper[4723]: E0309 13:42:56.892572 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:43:10 crc kubenswrapper[4723]: I0309 13:43:10.882741 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a"
Mar 09 13:43:10 crc kubenswrapper[4723]: E0309 13:43:10.886203 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:43:23 crc kubenswrapper[4723]: I0309 13:43:23.881388 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a"
Mar 09 13:43:23 crc kubenswrapper[4723]: E0309 13:43:23.882555 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:43:38 crc kubenswrapper[4723]: I0309 13:43:38.881563 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a"
Mar 09 13:43:38 crc kubenswrapper[4723]: E0309 13:43:38.882548 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:43:53 crc kubenswrapper[4723]: I0309 13:43:53.881601 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a"
Mar 09 13:43:53 crc kubenswrapper[4723]: E0309 13:43:53.882394 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:44:00 crc kubenswrapper[4723]: I0309 13:44:00.207244 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551064-vff59"]
Mar 09 13:44:00 crc kubenswrapper[4723]: E0309 13:44:00.208392 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a9bd7d-dc03-4b2c-9d9d-673da5110b61" containerName="oc"
Mar 09 13:44:00 crc kubenswrapper[4723]: I0309 13:44:00.208411 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a9bd7d-dc03-4b2c-9d9d-673da5110b61" containerName="oc"
Mar 09 13:44:00 crc kubenswrapper[4723]: I0309 13:44:00.208695 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a9bd7d-dc03-4b2c-9d9d-673da5110b61" containerName="oc"
Mar 09 13:44:00 crc kubenswrapper[4723]: I0309 13:44:00.209698 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-vff59"
Mar 09 13:44:00 crc kubenswrapper[4723]: I0309 13:44:00.212659 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:44:00 crc kubenswrapper[4723]: I0309 13:44:00.212789 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 13:44:00 crc kubenswrapper[4723]: I0309 13:44:00.212984 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:44:00 crc kubenswrapper[4723]: I0309 13:44:00.220734 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551064-vff59"]
Mar 09 13:44:00 crc kubenswrapper[4723]: I0309 13:44:00.356299 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snt4j\" (UniqueName: \"kubernetes.io/projected/bc9de4ea-678a-4fc1-ad89-09afe0581584-kube-api-access-snt4j\") pod \"auto-csr-approver-29551064-vff59\" (UID: \"bc9de4ea-678a-4fc1-ad89-09afe0581584\") " pod="openshift-infra/auto-csr-approver-29551064-vff59"
Mar 09 13:44:00 crc kubenswrapper[4723]: I0309 13:44:00.459798 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snt4j\" (UniqueName: \"kubernetes.io/projected/bc9de4ea-678a-4fc1-ad89-09afe0581584-kube-api-access-snt4j\") pod \"auto-csr-approver-29551064-vff59\" (UID: \"bc9de4ea-678a-4fc1-ad89-09afe0581584\") " pod="openshift-infra/auto-csr-approver-29551064-vff59"
Mar 09 13:44:00 crc kubenswrapper[4723]: I0309 13:44:00.485005 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snt4j\" (UniqueName: \"kubernetes.io/projected/bc9de4ea-678a-4fc1-ad89-09afe0581584-kube-api-access-snt4j\") pod \"auto-csr-approver-29551064-vff59\" (UID: \"bc9de4ea-678a-4fc1-ad89-09afe0581584\") " pod="openshift-infra/auto-csr-approver-29551064-vff59"
Mar 09 13:44:00 crc kubenswrapper[4723]: I0309 13:44:00.531373 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-vff59"
Mar 09 13:44:01 crc kubenswrapper[4723]: I0309 13:44:01.030834 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551064-vff59"]
Mar 09 13:44:01 crc kubenswrapper[4723]: I0309 13:44:01.630487 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551064-vff59" event={"ID":"bc9de4ea-678a-4fc1-ad89-09afe0581584","Type":"ContainerStarted","Data":"0c2eaa6d96c26189a26a30dbec45d1390c27e673bcbd629004a4da320b28ff8f"}
Mar 09 13:44:02 crc kubenswrapper[4723]: I0309 13:44:02.642338 4723 generic.go:334] "Generic (PLEG): container finished" podID="bc9de4ea-678a-4fc1-ad89-09afe0581584" containerID="ea165a7614d5ccfad94bd5ad5343af2df093a51c50c89bbe998b7a79f284d26b" exitCode=0
Mar 09 13:44:02 crc kubenswrapper[4723]: I0309 13:44:02.642390 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551064-vff59" event={"ID":"bc9de4ea-678a-4fc1-ad89-09afe0581584","Type":"ContainerDied","Data":"ea165a7614d5ccfad94bd5ad5343af2df093a51c50c89bbe998b7a79f284d26b"}
Mar 09 13:44:04 crc kubenswrapper[4723]: I0309 13:44:04.058827 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-vff59"
Mar 09 13:44:04 crc kubenswrapper[4723]: I0309 13:44:04.258002 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snt4j\" (UniqueName: \"kubernetes.io/projected/bc9de4ea-678a-4fc1-ad89-09afe0581584-kube-api-access-snt4j\") pod \"bc9de4ea-678a-4fc1-ad89-09afe0581584\" (UID: \"bc9de4ea-678a-4fc1-ad89-09afe0581584\") "
Mar 09 13:44:04 crc kubenswrapper[4723]: I0309 13:44:04.267308 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9de4ea-678a-4fc1-ad89-09afe0581584-kube-api-access-snt4j" (OuterVolumeSpecName: "kube-api-access-snt4j") pod "bc9de4ea-678a-4fc1-ad89-09afe0581584" (UID: "bc9de4ea-678a-4fc1-ad89-09afe0581584"). InnerVolumeSpecName "kube-api-access-snt4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:44:04 crc kubenswrapper[4723]: I0309 13:44:04.360960 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snt4j\" (UniqueName: \"kubernetes.io/projected/bc9de4ea-678a-4fc1-ad89-09afe0581584-kube-api-access-snt4j\") on node \"crc\" DevicePath \"\""
Mar 09 13:44:04 crc kubenswrapper[4723]: I0309 13:44:04.665188 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551064-vff59" event={"ID":"bc9de4ea-678a-4fc1-ad89-09afe0581584","Type":"ContainerDied","Data":"0c2eaa6d96c26189a26a30dbec45d1390c27e673bcbd629004a4da320b28ff8f"}
Mar 09 13:44:04 crc kubenswrapper[4723]: I0309 13:44:04.665235 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c2eaa6d96c26189a26a30dbec45d1390c27e673bcbd629004a4da320b28ff8f"
Mar 09 13:44:04 crc kubenswrapper[4723]: I0309 13:44:04.665240 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-vff59"
Mar 09 13:44:05 crc kubenswrapper[4723]: I0309 13:44:05.127355 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-zclwx"]
Mar 09 13:44:05 crc kubenswrapper[4723]: I0309 13:44:05.143140 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-zclwx"]
Mar 09 13:44:06 crc kubenswrapper[4723]: I0309 13:44:06.686099 4723 generic.go:334] "Generic (PLEG): container finished" podID="bd750bc2-9d58-4d3d-9d73-644c1bce9804" containerID="5296953a4dae0054cada3c501535af6a9340f882f4c76a219690492b66dfa108" exitCode=0
Mar 09 13:44:06 crc kubenswrapper[4723]: I0309 13:44:06.686221 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" event={"ID":"bd750bc2-9d58-4d3d-9d73-644c1bce9804","Type":"ContainerDied","Data":"5296953a4dae0054cada3c501535af6a9340f882f4c76a219690492b66dfa108"}
Mar 09 13:44:06 crc kubenswrapper[4723]: I0309 13:44:06.896364 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a"
Mar 09 13:44:06 crc kubenswrapper[4723]: E0309 13:44:06.896937 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:44:06 crc kubenswrapper[4723]: I0309 13:44:06.897915 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90bf326b-8f21-4736-8ab3-5265edda19ff" path="/var/lib/kubelet/pods/90bf326b-8f21-4736-8ab3-5265edda19ff/volumes"
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.194434 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv"
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.350817 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-combined-ca-bundle\") pod \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") "
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.350981 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmgrx\" (UniqueName: \"kubernetes.io/projected/bd750bc2-9d58-4d3d-9d73-644c1bce9804-kube-api-access-pmgrx\") pod \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") "
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.351047 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-3\") pod \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") "
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.351078 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-2\") pod \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") "
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.351134 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-migration-ssh-key-1\") pod \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") "
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.351267 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-extra-config-0\") pod \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") "
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.351351 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-ssh-key-openstack-edpm-ipam\") pod \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") "
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.351387 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-0\") pod \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") "
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.351414 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-inventory\") pod \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") "
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.351507 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-migration-ssh-key-0\") pod \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") "
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.351582 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-1\") pod \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\" (UID: \"bd750bc2-9d58-4d3d-9d73-644c1bce9804\") "
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.356806 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "bd750bc2-9d58-4d3d-9d73-644c1bce9804" (UID: "bd750bc2-9d58-4d3d-9d73-644c1bce9804"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.357340 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd750bc2-9d58-4d3d-9d73-644c1bce9804-kube-api-access-pmgrx" (OuterVolumeSpecName: "kube-api-access-pmgrx") pod "bd750bc2-9d58-4d3d-9d73-644c1bce9804" (UID: "bd750bc2-9d58-4d3d-9d73-644c1bce9804"). InnerVolumeSpecName "kube-api-access-pmgrx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.394615 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "bd750bc2-9d58-4d3d-9d73-644c1bce9804" (UID: "bd750bc2-9d58-4d3d-9d73-644c1bce9804"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.396493 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "bd750bc2-9d58-4d3d-9d73-644c1bce9804" (UID: "bd750bc2-9d58-4d3d-9d73-644c1bce9804"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.396548 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bd750bc2-9d58-4d3d-9d73-644c1bce9804" (UID: "bd750bc2-9d58-4d3d-9d73-644c1bce9804"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.396615 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "bd750bc2-9d58-4d3d-9d73-644c1bce9804" (UID: "bd750bc2-9d58-4d3d-9d73-644c1bce9804"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.396675 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "bd750bc2-9d58-4d3d-9d73-644c1bce9804" (UID: "bd750bc2-9d58-4d3d-9d73-644c1bce9804"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.397618 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "bd750bc2-9d58-4d3d-9d73-644c1bce9804" (UID: "bd750bc2-9d58-4d3d-9d73-644c1bce9804"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.401064 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "bd750bc2-9d58-4d3d-9d73-644c1bce9804" (UID: "bd750bc2-9d58-4d3d-9d73-644c1bce9804"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.411097 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-inventory" (OuterVolumeSpecName: "inventory") pod "bd750bc2-9d58-4d3d-9d73-644c1bce9804" (UID: "bd750bc2-9d58-4d3d-9d73-644c1bce9804"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.411378 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "bd750bc2-9d58-4d3d-9d73-644c1bce9804" (UID: "bd750bc2-9d58-4d3d-9d73-644c1bce9804"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.455182 4723 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.455220 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmgrx\" (UniqueName: \"kubernetes.io/projected/bd750bc2-9d58-4d3d-9d73-644c1bce9804-kube-api-access-pmgrx\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.455230 4723 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.455241 4723 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.455252 4723 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.455263 4723 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.455272 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.455280 4723 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.455289 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.455299 4723 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.455307 4723 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bd750bc2-9d58-4d3d-9d73-644c1bce9804-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.721409 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" event={"ID":"bd750bc2-9d58-4d3d-9d73-644c1bce9804","Type":"ContainerDied","Data":"7f1c6c7b3406be3f319268ad2732f7238d511c0a1e2bd6bce163b135c93b5d50"} Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 
13:44:08.721455 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f1c6c7b3406be3f319268ad2732f7238d511c0a1e2bd6bce163b135c93b5d50" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.721523 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sxtxv" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.810233 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"] Mar 09 13:44:08 crc kubenswrapper[4723]: E0309 13:44:08.810812 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9de4ea-678a-4fc1-ad89-09afe0581584" containerName="oc" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.810830 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9de4ea-678a-4fc1-ad89-09afe0581584" containerName="oc" Mar 09 13:44:08 crc kubenswrapper[4723]: E0309 13:44:08.810853 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd750bc2-9d58-4d3d-9d73-644c1bce9804" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.810876 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd750bc2-9d58-4d3d-9d73-644c1bce9804" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.811174 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9de4ea-678a-4fc1-ad89-09afe0581584" containerName="oc" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.811194 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd750bc2-9d58-4d3d-9d73-644c1bce9804" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.812110 4723 util.go:30] "No sandbox for pod can be found. 
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.814892 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.815073 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.815925 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt"
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.816441 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.816918 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.824808 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"]
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.967102 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtg5m\" (UniqueName: \"kubernetes.io/projected/76b26f74-a654-4507-a416-617f8fec3d89-kube-api-access-qtg5m\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.967327 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.967412 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.967543 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.967982 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.968119 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:08 crc kubenswrapper[4723]: I0309 13:44:08.968221 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.073107 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.074571 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.074797 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.075014 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.075194 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtg5m\" (UniqueName: \"kubernetes.io/projected/76b26f74-a654-4507-a416-617f8fec3d89-kube-api-access-qtg5m\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.075373 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.075483 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.088431 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.091393 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.092688 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.094330 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.097370 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.101154 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.106357 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtg5m\" (UniqueName: \"kubernetes.io/projected/76b26f74-a654-4507-a416-617f8fec3d89-kube-api-access-qtg5m\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
\"kubernetes.io/projected/76b26f74-a654-4507-a416-617f8fec3d89-kube-api-access-qtg5m\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb" Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.135797 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb" Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.718043 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"] Mar 09 13:44:09 crc kubenswrapper[4723]: I0309 13:44:09.733012 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb" event={"ID":"76b26f74-a654-4507-a416-617f8fec3d89","Type":"ContainerStarted","Data":"6777ffa62689f686c2fbddfb306682ca0d836988649172d5e95ad07e4ea713c8"} Mar 09 13:44:10 crc kubenswrapper[4723]: I0309 13:44:10.747847 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb" event={"ID":"76b26f74-a654-4507-a416-617f8fec3d89","Type":"ContainerStarted","Data":"c7c4fcf237bfe36dc971d6eb8bfda2d7d5fa71791e08ad06e10dc704a2c57028"} Mar 09 13:44:10 crc kubenswrapper[4723]: I0309 13:44:10.773959 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb" podStartSLOduration=2.225543029 podStartE2EDuration="2.773934428s" podCreationTimestamp="2026-03-09 13:44:08 +0000 UTC" firstStartedPulling="2026-03-09 13:44:09.724764639 +0000 UTC m=+2723.739232179" lastFinishedPulling="2026-03-09 13:44:10.273156038 +0000 UTC m=+2724.287623578" observedRunningTime="2026-03-09 13:44:10.767972609 +0000 UTC m=+2724.782440149" watchObservedRunningTime="2026-03-09 13:44:10.773934428 +0000 UTC m=+2724.788401968" Mar 09 13:44:15 crc kubenswrapper[4723]: I0309 13:44:15.927436 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j6nk8"] Mar 09 13:44:15 crc kubenswrapper[4723]: I0309 13:44:15.930498 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:15 crc kubenswrapper[4723]: I0309 13:44:15.940369 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6nk8"] Mar 09 13:44:15 crc kubenswrapper[4723]: I0309 13:44:15.951205 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a247a2-3e98-473e-ac12-587785fb8f5b-utilities\") pod \"redhat-operators-j6nk8\" (UID: \"f9a247a2-3e98-473e-ac12-587785fb8f5b\") " pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:15 crc kubenswrapper[4723]: I0309 13:44:15.951452 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a247a2-3e98-473e-ac12-587785fb8f5b-catalog-content\") pod \"redhat-operators-j6nk8\" (UID: \"f9a247a2-3e98-473e-ac12-587785fb8f5b\") " pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:15 crc kubenswrapper[4723]: I0309 13:44:15.951652 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q976l\" (UniqueName: \"kubernetes.io/projected/f9a247a2-3e98-473e-ac12-587785fb8f5b-kube-api-access-q976l\") pod \"redhat-operators-j6nk8\" (UID: \"f9a247a2-3e98-473e-ac12-587785fb8f5b\") " pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:16 crc kubenswrapper[4723]: I0309 13:44:16.053401 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a247a2-3e98-473e-ac12-587785fb8f5b-utilities\") pod \"redhat-operators-j6nk8\" (UID: \"f9a247a2-3e98-473e-ac12-587785fb8f5b\") " pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:16 crc kubenswrapper[4723]: I0309 13:44:16.053534 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a247a2-3e98-473e-ac12-587785fb8f5b-catalog-content\") pod \"redhat-operators-j6nk8\" (UID: \"f9a247a2-3e98-473e-ac12-587785fb8f5b\") " pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:16 crc kubenswrapper[4723]: I0309 13:44:16.053596 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q976l\" (UniqueName: \"kubernetes.io/projected/f9a247a2-3e98-473e-ac12-587785fb8f5b-kube-api-access-q976l\") pod \"redhat-operators-j6nk8\" (UID: \"f9a247a2-3e98-473e-ac12-587785fb8f5b\") " pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:16 crc kubenswrapper[4723]: I0309 13:44:16.053929 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a247a2-3e98-473e-ac12-587785fb8f5b-utilities\") pod \"redhat-operators-j6nk8\" (UID: \"f9a247a2-3e98-473e-ac12-587785fb8f5b\") " pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:16 crc kubenswrapper[4723]: I0309 13:44:16.054030 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a247a2-3e98-473e-ac12-587785fb8f5b-catalog-content\") pod \"redhat-operators-j6nk8\" (UID: \"f9a247a2-3e98-473e-ac12-587785fb8f5b\") " pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:16 crc kubenswrapper[4723]: I0309 13:44:16.073595 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-q976l\" (UniqueName: \"kubernetes.io/projected/f9a247a2-3e98-473e-ac12-587785fb8f5b-kube-api-access-q976l\") pod \"redhat-operators-j6nk8\" (UID: \"f9a247a2-3e98-473e-ac12-587785fb8f5b\") " pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:16 crc kubenswrapper[4723]: I0309 13:44:16.277715 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:16 crc kubenswrapper[4723]: I0309 13:44:16.809780 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6nk8"] Mar 09 13:44:17 crc kubenswrapper[4723]: I0309 13:44:17.817416 4723 generic.go:334] "Generic (PLEG): container finished" podID="f9a247a2-3e98-473e-ac12-587785fb8f5b" containerID="26b6f229e6f608dcd2b90e164211c8c3e5c5dbdf56ffb5aed7d81672418c06ff" exitCode=0 Mar 09 13:44:17 crc kubenswrapper[4723]: I0309 13:44:17.817724 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6nk8" event={"ID":"f9a247a2-3e98-473e-ac12-587785fb8f5b","Type":"ContainerDied","Data":"26b6f229e6f608dcd2b90e164211c8c3e5c5dbdf56ffb5aed7d81672418c06ff"} Mar 09 13:44:17 crc kubenswrapper[4723]: I0309 13:44:17.817749 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6nk8" event={"ID":"f9a247a2-3e98-473e-ac12-587785fb8f5b","Type":"ContainerStarted","Data":"fb231bd921926e3d6839a94660988e3a8b89d5c057a7b06191573efb71fac090"} Mar 09 13:44:19 crc kubenswrapper[4723]: I0309 13:44:19.853803 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6nk8" event={"ID":"f9a247a2-3e98-473e-ac12-587785fb8f5b","Type":"ContainerStarted","Data":"fd8ef6987505ce59b652d5bdc30cd1fa278a863f81d4f7e450b73d8780162d78"} Mar 09 13:44:21 crc kubenswrapper[4723]: I0309 13:44:21.880660 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:44:21 crc kubenswrapper[4723]: E0309 13:44:21.881161 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:44:22 crc kubenswrapper[4723]: I0309 13:44:22.982118 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xwkz8"] Mar 09 13:44:22 crc kubenswrapper[4723]: I0309 13:44:22.985915 4723 util.go:30] "No sandbox for pod can be found. 
Mar 09 13:44:22 crc kubenswrapper[4723]: I0309 13:44:22.995080 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xwkz8"]
Mar 09 13:44:23 crc kubenswrapper[4723]: I0309 13:44:23.142463 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860c77c5-ebd9-4c93-8136-6f06ef270d7c-utilities\") pod \"certified-operators-xwkz8\" (UID: \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\") " pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:23 crc kubenswrapper[4723]: I0309 13:44:23.142512 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qlrn\" (UniqueName: \"kubernetes.io/projected/860c77c5-ebd9-4c93-8136-6f06ef270d7c-kube-api-access-5qlrn\") pod \"certified-operators-xwkz8\" (UID: \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\") " pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:23 crc kubenswrapper[4723]: I0309 13:44:23.142587 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860c77c5-ebd9-4c93-8136-6f06ef270d7c-catalog-content\") pod \"certified-operators-xwkz8\" (UID: \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\") " pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:23 crc kubenswrapper[4723]: I0309 13:44:23.245198 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qlrn\" (UniqueName: \"kubernetes.io/projected/860c77c5-ebd9-4c93-8136-6f06ef270d7c-kube-api-access-5qlrn\") pod \"certified-operators-xwkz8\" (UID: \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\") " pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:23 crc kubenswrapper[4723]: I0309 13:44:23.245333 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860c77c5-ebd9-4c93-8136-6f06ef270d7c-catalog-content\") pod \"certified-operators-xwkz8\" (UID: \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\") " pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:23 crc kubenswrapper[4723]: I0309 13:44:23.245505 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860c77c5-ebd9-4c93-8136-6f06ef270d7c-utilities\") pod \"certified-operators-xwkz8\" (UID: \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\") " pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:23 crc kubenswrapper[4723]: I0309 13:44:23.246162 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860c77c5-ebd9-4c93-8136-6f06ef270d7c-catalog-content\") pod \"certified-operators-xwkz8\" (UID: \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\") " pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:23 crc kubenswrapper[4723]: I0309 13:44:23.246194 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860c77c5-ebd9-4c93-8136-6f06ef270d7c-utilities\") pod \"certified-operators-xwkz8\" (UID: \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\") " pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:23 crc kubenswrapper[4723]: I0309 13:44:23.265779 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qlrn\" (UniqueName: \"kubernetes.io/projected/860c77c5-ebd9-4c93-8136-6f06ef270d7c-kube-api-access-5qlrn\") pod \"certified-operators-xwkz8\" (UID: \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\") " pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:23 crc kubenswrapper[4723]: I0309 13:44:23.325484 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:23 crc kubenswrapper[4723]: I0309 13:44:23.948001 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xwkz8"]
Mar 09 13:44:24 crc kubenswrapper[4723]: I0309 13:44:24.901574 4723 generic.go:334] "Generic (PLEG): container finished" podID="860c77c5-ebd9-4c93-8136-6f06ef270d7c" containerID="2f3768ee02403d6d25575608fdb66325198e64f35f0c32ea775dc8d98c4beda4" exitCode=0
Mar 09 13:44:24 crc kubenswrapper[4723]: I0309 13:44:24.901946 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwkz8" event={"ID":"860c77c5-ebd9-4c93-8136-6f06ef270d7c","Type":"ContainerDied","Data":"2f3768ee02403d6d25575608fdb66325198e64f35f0c32ea775dc8d98c4beda4"}
Mar 09 13:44:24 crc kubenswrapper[4723]: I0309 13:44:24.901981 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwkz8" event={"ID":"860c77c5-ebd9-4c93-8136-6f06ef270d7c","Type":"ContainerStarted","Data":"9c6933f4ef20598edbaba242813dc1857ca6a82f06bac6a289fb8f0b76989724"}
Mar 09 13:44:24 crc kubenswrapper[4723]: I0309 13:44:24.906570 4723 generic.go:334] "Generic (PLEG): container finished" podID="f9a247a2-3e98-473e-ac12-587785fb8f5b" containerID="fd8ef6987505ce59b652d5bdc30cd1fa278a863f81d4f7e450b73d8780162d78" exitCode=0
Mar 09 13:44:24 crc kubenswrapper[4723]: I0309 13:44:24.906602 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6nk8" event={"ID":"f9a247a2-3e98-473e-ac12-587785fb8f5b","Type":"ContainerDied","Data":"fd8ef6987505ce59b652d5bdc30cd1fa278a863f81d4f7e450b73d8780162d78"}
Mar 09 13:44:25 crc kubenswrapper[4723]: I0309 13:44:25.006459 4723 scope.go:117] "RemoveContainer" containerID="ca2e8c64945ada93e1c0cb567226db661b96341bc037f6faa04fee026aad8285"
Mar 09 13:44:26 crc kubenswrapper[4723]: I0309 13:44:26.931521 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6nk8" event={"ID":"f9a247a2-3e98-473e-ac12-587785fb8f5b","Type":"ContainerStarted","Data":"84543e77eb9e7e606c449c75382a7c37f1dddc999cdaa2f2d95c264c14dff388"}
Mar 09 13:44:26 crc kubenswrapper[4723]: I0309 13:44:26.934557 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwkz8" event={"ID":"860c77c5-ebd9-4c93-8136-6f06ef270d7c","Type":"ContainerStarted","Data":"960bbf2b4e902b332fddd14b22c7c6c93dbe8300fdaee6e900b5539f5d0d9bfc"}
Mar 09 13:44:26 crc kubenswrapper[4723]: I0309 13:44:26.963588 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j6nk8" podStartSLOduration=4.086950218 podStartE2EDuration="11.963568102s" podCreationTimestamp="2026-03-09 13:44:15 +0000 UTC" firstStartedPulling="2026-03-09 13:44:17.819740411 +0000 UTC m=+2731.834207951" lastFinishedPulling="2026-03-09 13:44:25.696358295 +0000 UTC m=+2739.710825835" observedRunningTime="2026-03-09 13:44:26.951167522 +0000 UTC m=+2740.965635062" watchObservedRunningTime="2026-03-09 13:44:26.963568102 +0000 UTC m=+2740.978035642"
Mar 09 13:44:28 crc kubenswrapper[4723]: I0309 13:44:28.988613 4723 generic.go:334] "Generic (PLEG): container finished" podID="860c77c5-ebd9-4c93-8136-6f06ef270d7c" containerID="960bbf2b4e902b332fddd14b22c7c6c93dbe8300fdaee6e900b5539f5d0d9bfc" exitCode=0
Mar 09 13:44:28 crc kubenswrapper[4723]: I0309 13:44:28.988685 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwkz8" event={"ID":"860c77c5-ebd9-4c93-8136-6f06ef270d7c","Type":"ContainerDied","Data":"960bbf2b4e902b332fddd14b22c7c6c93dbe8300fdaee6e900b5539f5d0d9bfc"}
Mar 09 13:44:30 crc kubenswrapper[4723]: I0309 13:44:30.001649 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwkz8" event={"ID":"860c77c5-ebd9-4c93-8136-6f06ef270d7c","Type":"ContainerStarted","Data":"e96402c316e8ece68a76121d806bd1f1e97b6f1240111a3d8a8bc72566904f58"}
Mar 09 13:44:30 crc kubenswrapper[4723]: I0309 13:44:30.021868 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xwkz8" podStartSLOduration=3.46567059 podStartE2EDuration="8.021839232s" podCreationTimestamp="2026-03-09 13:44:22 +0000 UTC" firstStartedPulling="2026-03-09 13:44:24.903778972 +0000 UTC m=+2738.918246512" lastFinishedPulling="2026-03-09 13:44:29.459947614 +0000 UTC m=+2743.474415154" observedRunningTime="2026-03-09 13:44:30.01838429 +0000 UTC m=+2744.032851830" watchObservedRunningTime="2026-03-09 13:44:30.021839232 +0000 UTC m=+2744.036306772"
Mar 09 13:44:33 crc kubenswrapper[4723]: I0309 13:44:33.326706 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:33 crc kubenswrapper[4723]: I0309 13:44:33.327444 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:33 crc kubenswrapper[4723]: I0309 13:44:33.392413 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:33 crc kubenswrapper[4723]: I0309 13:44:33.881391 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a"
Mar 09 13:44:33 crc kubenswrapper[4723]: E0309 13:44:33.881702 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:44:34 crc kubenswrapper[4723]: I0309 13:44:34.113067 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:34 crc kubenswrapper[4723]: I0309 13:44:34.167093 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xwkz8"]
Mar 09 13:44:36 crc kubenswrapper[4723]: I0309 13:44:36.075914 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xwkz8" podUID="860c77c5-ebd9-4c93-8136-6f06ef270d7c" containerName="registry-server" containerID="cri-o://e96402c316e8ece68a76121d806bd1f1e97b6f1240111a3d8a8bc72566904f58" gracePeriod=2
Mar 09 13:44:36 crc kubenswrapper[4723]: I0309 13:44:36.279358 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j6nk8"
Mar 09 13:44:36 crc kubenswrapper[4723]: I0309 13:44:36.279671 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j6nk8"
Mar 09 13:44:36 crc kubenswrapper[4723]: I0309 13:44:36.607610 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:36 crc kubenswrapper[4723]: I0309 13:44:36.694707 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860c77c5-ebd9-4c93-8136-6f06ef270d7c-catalog-content\") pod \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\" (UID: \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\") "
Mar 09 13:44:36 crc kubenswrapper[4723]: I0309 13:44:36.694927 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qlrn\" (UniqueName: \"kubernetes.io/projected/860c77c5-ebd9-4c93-8136-6f06ef270d7c-kube-api-access-5qlrn\") pod \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\" (UID: \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\") "
Mar 09 13:44:36 crc kubenswrapper[4723]: I0309 13:44:36.696139 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860c77c5-ebd9-4c93-8136-6f06ef270d7c-utilities\") pod \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\" (UID: \"860c77c5-ebd9-4c93-8136-6f06ef270d7c\") "
Mar 09 13:44:36 crc kubenswrapper[4723]: I0309 13:44:36.696948 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860c77c5-ebd9-4c93-8136-6f06ef270d7c-utilities" (OuterVolumeSpecName: "utilities") pod "860c77c5-ebd9-4c93-8136-6f06ef270d7c" (UID: "860c77c5-ebd9-4c93-8136-6f06ef270d7c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:44:36 crc kubenswrapper[4723]: I0309 13:44:36.697842 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860c77c5-ebd9-4c93-8136-6f06ef270d7c-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:44:36 crc kubenswrapper[4723]: I0309 13:44:36.718836 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860c77c5-ebd9-4c93-8136-6f06ef270d7c-kube-api-access-5qlrn" (OuterVolumeSpecName: "kube-api-access-5qlrn") pod "860c77c5-ebd9-4c93-8136-6f06ef270d7c" (UID: "860c77c5-ebd9-4c93-8136-6f06ef270d7c"). InnerVolumeSpecName "kube-api-access-5qlrn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:44:36 crc kubenswrapper[4723]: I0309 13:44:36.765417 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860c77c5-ebd9-4c93-8136-6f06ef270d7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "860c77c5-ebd9-4c93-8136-6f06ef270d7c" (UID: "860c77c5-ebd9-4c93-8136-6f06ef270d7c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:44:36 crc kubenswrapper[4723]: I0309 13:44:36.799849 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860c77c5-ebd9-4c93-8136-6f06ef270d7c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:44:36 crc kubenswrapper[4723]: I0309 13:44:36.799949 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qlrn\" (UniqueName: \"kubernetes.io/projected/860c77c5-ebd9-4c93-8136-6f06ef270d7c-kube-api-access-5qlrn\") on node \"crc\" DevicePath \"\""
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.089049 4723 generic.go:334] "Generic (PLEG): container finished" podID="860c77c5-ebd9-4c93-8136-6f06ef270d7c" containerID="e96402c316e8ece68a76121d806bd1f1e97b6f1240111a3d8a8bc72566904f58" exitCode=0
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.089519 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwkz8" event={"ID":"860c77c5-ebd9-4c93-8136-6f06ef270d7c","Type":"ContainerDied","Data":"e96402c316e8ece68a76121d806bd1f1e97b6f1240111a3d8a8bc72566904f58"}
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.089613 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwkz8"
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.089636 4723 scope.go:117] "RemoveContainer" containerID="e96402c316e8ece68a76121d806bd1f1e97b6f1240111a3d8a8bc72566904f58"
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.089616 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwkz8" event={"ID":"860c77c5-ebd9-4c93-8136-6f06ef270d7c","Type":"ContainerDied","Data":"9c6933f4ef20598edbaba242813dc1857ca6a82f06bac6a289fb8f0b76989724"}
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.124873 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xwkz8"]
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.136722 4723 scope.go:117] "RemoveContainer" containerID="960bbf2b4e902b332fddd14b22c7c6c93dbe8300fdaee6e900b5539f5d0d9bfc"
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.151771 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xwkz8"]
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.184697 4723 scope.go:117] "RemoveContainer" containerID="2f3768ee02403d6d25575608fdb66325198e64f35f0c32ea775dc8d98c4beda4"
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.224167 4723 scope.go:117] "RemoveContainer" containerID="e96402c316e8ece68a76121d806bd1f1e97b6f1240111a3d8a8bc72566904f58"
Mar 09 13:44:37 crc kubenswrapper[4723]: E0309 13:44:37.224689 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e96402c316e8ece68a76121d806bd1f1e97b6f1240111a3d8a8bc72566904f58\": container with ID starting with e96402c316e8ece68a76121d806bd1f1e97b6f1240111a3d8a8bc72566904f58 not found: ID does not exist" containerID="e96402c316e8ece68a76121d806bd1f1e97b6f1240111a3d8a8bc72566904f58"
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.224753 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e96402c316e8ece68a76121d806bd1f1e97b6f1240111a3d8a8bc72566904f58"} err="failed to get container status \"e96402c316e8ece68a76121d806bd1f1e97b6f1240111a3d8a8bc72566904f58\": rpc error: code = NotFound desc = could not find container \"e96402c316e8ece68a76121d806bd1f1e97b6f1240111a3d8a8bc72566904f58\": container with ID starting with e96402c316e8ece68a76121d806bd1f1e97b6f1240111a3d8a8bc72566904f58 not found: ID does not exist"
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.224775 4723 scope.go:117] "RemoveContainer" containerID="960bbf2b4e902b332fddd14b22c7c6c93dbe8300fdaee6e900b5539f5d0d9bfc"
Mar 09 13:44:37 crc kubenswrapper[4723]: E0309 13:44:37.225340 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"960bbf2b4e902b332fddd14b22c7c6c93dbe8300fdaee6e900b5539f5d0d9bfc\": container with ID starting with 960bbf2b4e902b332fddd14b22c7c6c93dbe8300fdaee6e900b5539f5d0d9bfc not found: ID does not exist" containerID="960bbf2b4e902b332fddd14b22c7c6c93dbe8300fdaee6e900b5539f5d0d9bfc"
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.225366 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"960bbf2b4e902b332fddd14b22c7c6c93dbe8300fdaee6e900b5539f5d0d9bfc"} err="failed to get container status \"960bbf2b4e902b332fddd14b22c7c6c93dbe8300fdaee6e900b5539f5d0d9bfc\": rpc error: code = NotFound desc = could not find container \"960bbf2b4e902b332fddd14b22c7c6c93dbe8300fdaee6e900b5539f5d0d9bfc\": container with ID starting with 960bbf2b4e902b332fddd14b22c7c6c93dbe8300fdaee6e900b5539f5d0d9bfc not found: ID does not exist"
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.225444 4723 scope.go:117] "RemoveContainer" containerID="2f3768ee02403d6d25575608fdb66325198e64f35f0c32ea775dc8d98c4beda4"
Mar 09 13:44:37 crc kubenswrapper[4723]: E0309 13:44:37.225713 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f3768ee02403d6d25575608fdb66325198e64f35f0c32ea775dc8d98c4beda4\": container with ID starting with 2f3768ee02403d6d25575608fdb66325198e64f35f0c32ea775dc8d98c4beda4 not found: ID does not exist" containerID="2f3768ee02403d6d25575608fdb66325198e64f35f0c32ea775dc8d98c4beda4"
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.225738 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f3768ee02403d6d25575608fdb66325198e64f35f0c32ea775dc8d98c4beda4"} err="failed to get container status \"2f3768ee02403d6d25575608fdb66325198e64f35f0c32ea775dc8d98c4beda4\": rpc error: code = NotFound desc = could not find container \"2f3768ee02403d6d25575608fdb66325198e64f35f0c32ea775dc8d98c4beda4\": container with ID starting with 2f3768ee02403d6d25575608fdb66325198e64f35f0c32ea775dc8d98c4beda4 not found: ID does not exist"
Mar 09 13:44:37 crc kubenswrapper[4723]: I0309 13:44:37.343130 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j6nk8" podUID="f9a247a2-3e98-473e-ac12-587785fb8f5b" containerName="registry-server" probeResult="failure" output=<
Mar 09 13:44:37 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s
Mar 09 13:44:37 crc kubenswrapper[4723]: >
Mar 09 13:44:38 crc kubenswrapper[4723]: I0309 13:44:38.894064 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="860c77c5-ebd9-4c93-8136-6f06ef270d7c" path="/var/lib/kubelet/pods/860c77c5-ebd9-4c93-8136-6f06ef270d7c/volumes"
Mar 09 13:44:47 crc kubenswrapper[4723]: I0309 13:44:47.330204 4723 prober.go:107]
"Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j6nk8" podUID="f9a247a2-3e98-473e-ac12-587785fb8f5b" containerName="registry-server" probeResult="failure" output=< Mar 09 13:44:47 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 13:44:47 crc kubenswrapper[4723]: > Mar 09 13:44:48 crc kubenswrapper[4723]: I0309 13:44:48.884880 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:44:48 crc kubenswrapper[4723]: E0309 13:44:48.885506 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:44:56 crc kubenswrapper[4723]: I0309 13:44:56.342705 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:56 crc kubenswrapper[4723]: I0309 13:44:56.400612 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:56 crc kubenswrapper[4723]: I0309 13:44:56.580643 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j6nk8"] Mar 09 13:44:58 crc kubenswrapper[4723]: I0309 13:44:58.303032 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j6nk8" podUID="f9a247a2-3e98-473e-ac12-587785fb8f5b" containerName="registry-server" containerID="cri-o://84543e77eb9e7e606c449c75382a7c37f1dddc999cdaa2f2d95c264c14dff388" gracePeriod=2 Mar 09 13:44:58 crc kubenswrapper[4723]: I0309 13:44:58.857953 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:58 crc kubenswrapper[4723]: I0309 13:44:58.948943 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a247a2-3e98-473e-ac12-587785fb8f5b-catalog-content\") pod \"f9a247a2-3e98-473e-ac12-587785fb8f5b\" (UID: \"f9a247a2-3e98-473e-ac12-587785fb8f5b\") " Mar 09 13:44:58 crc kubenswrapper[4723]: I0309 13:44:58.949183 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a247a2-3e98-473e-ac12-587785fb8f5b-utilities\") pod \"f9a247a2-3e98-473e-ac12-587785fb8f5b\" (UID: \"f9a247a2-3e98-473e-ac12-587785fb8f5b\") " Mar 09 13:44:58 crc kubenswrapper[4723]: I0309 13:44:58.949302 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q976l\" (UniqueName: \"kubernetes.io/projected/f9a247a2-3e98-473e-ac12-587785fb8f5b-kube-api-access-q976l\") pod \"f9a247a2-3e98-473e-ac12-587785fb8f5b\" (UID: \"f9a247a2-3e98-473e-ac12-587785fb8f5b\") " Mar 09 13:44:58 crc kubenswrapper[4723]: I0309 13:44:58.950641 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9a247a2-3e98-473e-ac12-587785fb8f5b-utilities" (OuterVolumeSpecName: "utilities") pod "f9a247a2-3e98-473e-ac12-587785fb8f5b" (UID: "f9a247a2-3e98-473e-ac12-587785fb8f5b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:58 crc kubenswrapper[4723]: I0309 13:44:58.956094 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a247a2-3e98-473e-ac12-587785fb8f5b-kube-api-access-q976l" (OuterVolumeSpecName: "kube-api-access-q976l") pod "f9a247a2-3e98-473e-ac12-587785fb8f5b" (UID: "f9a247a2-3e98-473e-ac12-587785fb8f5b"). InnerVolumeSpecName "kube-api-access-q976l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.053452 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q976l\" (UniqueName: \"kubernetes.io/projected/f9a247a2-3e98-473e-ac12-587785fb8f5b-kube-api-access-q976l\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.053486 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a247a2-3e98-473e-ac12-587785fb8f5b-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.087676 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9a247a2-3e98-473e-ac12-587785fb8f5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9a247a2-3e98-473e-ac12-587785fb8f5b" (UID: "f9a247a2-3e98-473e-ac12-587785fb8f5b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.155308 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a247a2-3e98-473e-ac12-587785fb8f5b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.317679 4723 generic.go:334] "Generic (PLEG): container finished" podID="f9a247a2-3e98-473e-ac12-587785fb8f5b" containerID="84543e77eb9e7e606c449c75382a7c37f1dddc999cdaa2f2d95c264c14dff388" exitCode=0 Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.317986 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6nk8" event={"ID":"f9a247a2-3e98-473e-ac12-587785fb8f5b","Type":"ContainerDied","Data":"84543e77eb9e7e606c449c75382a7c37f1dddc999cdaa2f2d95c264c14dff388"} Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.318013 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6nk8" event={"ID":"f9a247a2-3e98-473e-ac12-587785fb8f5b","Type":"ContainerDied","Data":"fb231bd921926e3d6839a94660988e3a8b89d5c057a7b06191573efb71fac090"} Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.318033 4723 scope.go:117] "RemoveContainer" containerID="84543e77eb9e7e606c449c75382a7c37f1dddc999cdaa2f2d95c264c14dff388" Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.318138 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j6nk8" Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.343688 4723 scope.go:117] "RemoveContainer" containerID="fd8ef6987505ce59b652d5bdc30cd1fa278a863f81d4f7e450b73d8780162d78" Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.378959 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j6nk8"] Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.384946 4723 scope.go:117] "RemoveContainer" containerID="26b6f229e6f608dcd2b90e164211c8c3e5c5dbdf56ffb5aed7d81672418c06ff" Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.391972 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j6nk8"] Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.444262 4723 scope.go:117] "RemoveContainer" containerID="84543e77eb9e7e606c449c75382a7c37f1dddc999cdaa2f2d95c264c14dff388" Mar 09 13:44:59 crc kubenswrapper[4723]: E0309 13:44:59.444714 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84543e77eb9e7e606c449c75382a7c37f1dddc999cdaa2f2d95c264c14dff388\": container with ID starting with 84543e77eb9e7e606c449c75382a7c37f1dddc999cdaa2f2d95c264c14dff388 not found: ID does not exist" containerID="84543e77eb9e7e606c449c75382a7c37f1dddc999cdaa2f2d95c264c14dff388" Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.444743 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84543e77eb9e7e606c449c75382a7c37f1dddc999cdaa2f2d95c264c14dff388"} err="failed to get container status \"84543e77eb9e7e606c449c75382a7c37f1dddc999cdaa2f2d95c264c14dff388\": rpc error: code = NotFound desc = could not find container \"84543e77eb9e7e606c449c75382a7c37f1dddc999cdaa2f2d95c264c14dff388\": container with ID starting with 84543e77eb9e7e606c449c75382a7c37f1dddc999cdaa2f2d95c264c14dff388 not found: ID does not exist" Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.444764 4723 scope.go:117] "RemoveContainer" containerID="fd8ef6987505ce59b652d5bdc30cd1fa278a863f81d4f7e450b73d8780162d78" Mar 09 13:44:59 crc kubenswrapper[4723]: E0309 13:44:59.446048 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd8ef6987505ce59b652d5bdc30cd1fa278a863f81d4f7e450b73d8780162d78\": container with ID starting with fd8ef6987505ce59b652d5bdc30cd1fa278a863f81d4f7e450b73d8780162d78 not found: ID does not exist" containerID="fd8ef6987505ce59b652d5bdc30cd1fa278a863f81d4f7e450b73d8780162d78" Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.446094 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd8ef6987505ce59b652d5bdc30cd1fa278a863f81d4f7e450b73d8780162d78"} err="failed to get container status \"fd8ef6987505ce59b652d5bdc30cd1fa278a863f81d4f7e450b73d8780162d78\": rpc error: code = NotFound desc = could not find container \"fd8ef6987505ce59b652d5bdc30cd1fa278a863f81d4f7e450b73d8780162d78\": container with ID starting with fd8ef6987505ce59b652d5bdc30cd1fa278a863f81d4f7e450b73d8780162d78 not found: ID does not exist" Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.446122 4723 scope.go:117] "RemoveContainer" containerID="26b6f229e6f608dcd2b90e164211c8c3e5c5dbdf56ffb5aed7d81672418c06ff" Mar 09 13:44:59 crc kubenswrapper[4723]: E0309 13:44:59.446464 4723 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"26b6f229e6f608dcd2b90e164211c8c3e5c5dbdf56ffb5aed7d81672418c06ff\": container with ID starting with 26b6f229e6f608dcd2b90e164211c8c3e5c5dbdf56ffb5aed7d81672418c06ff not found: ID does not exist" containerID="26b6f229e6f608dcd2b90e164211c8c3e5c5dbdf56ffb5aed7d81672418c06ff" Mar 09 13:44:59 crc kubenswrapper[4723]: I0309 13:44:59.446493 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26b6f229e6f608dcd2b90e164211c8c3e5c5dbdf56ffb5aed7d81672418c06ff"} err="failed to get container status \"26b6f229e6f608dcd2b90e164211c8c3e5c5dbdf56ffb5aed7d81672418c06ff\": rpc error: code = NotFound desc = could not find container \"26b6f229e6f608dcd2b90e164211c8c3e5c5dbdf56ffb5aed7d81672418c06ff\": container with ID starting with 26b6f229e6f608dcd2b90e164211c8c3e5c5dbdf56ffb5aed7d81672418c06ff not found: ID does not exist" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.155517 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792"] Mar 09 13:45:00 crc kubenswrapper[4723]: E0309 13:45:00.156398 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a247a2-3e98-473e-ac12-587785fb8f5b" containerName="extract-content" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.156428 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a247a2-3e98-473e-ac12-587785fb8f5b" containerName="extract-content" Mar 09 13:45:00 crc kubenswrapper[4723]: E0309 13:45:00.156454 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a247a2-3e98-473e-ac12-587785fb8f5b" containerName="registry-server" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.156467 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a247a2-3e98-473e-ac12-587785fb8f5b" containerName="registry-server" Mar 09 13:45:00 crc kubenswrapper[4723]: E0309 13:45:00.156481 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860c77c5-ebd9-4c93-8136-6f06ef270d7c" containerName="extract-content" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.156491 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="860c77c5-ebd9-4c93-8136-6f06ef270d7c" containerName="extract-content" Mar 09 13:45:00 crc kubenswrapper[4723]: E0309 13:45:00.156523 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860c77c5-ebd9-4c93-8136-6f06ef270d7c" containerName="extract-utilities" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.156534 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="860c77c5-ebd9-4c93-8136-6f06ef270d7c" containerName="extract-utilities" Mar 09 13:45:00 crc kubenswrapper[4723]: E0309 13:45:00.156550 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a247a2-3e98-473e-ac12-587785fb8f5b" containerName="extract-utilities" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.156560 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a247a2-3e98-473e-ac12-587785fb8f5b" containerName="extract-utilities" Mar 09 13:45:00 crc kubenswrapper[4723]: E0309 13:45:00.156576 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860c77c5-ebd9-4c93-8136-6f06ef270d7c" containerName="registry-server" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.156585 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="860c77c5-ebd9-4c93-8136-6f06ef270d7c" containerName="registry-server" Mar 09 13:45:00 crc kubenswrapper[4723]: 
I0309 13:45:00.157013 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="860c77c5-ebd9-4c93-8136-6f06ef270d7c" containerName="registry-server" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.157049 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a247a2-3e98-473e-ac12-587785fb8f5b" containerName="registry-server" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.158414 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.164547 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.164751 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.167955 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792"] Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.295658 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9125e69b-8a83-46bc-9f9b-d23390153693-secret-volume\") pod \"collect-profiles-29551065-8m792\" (UID: \"9125e69b-8a83-46bc-9f9b-d23390153693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.296102 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5nm6\" (UniqueName: \"kubernetes.io/projected/9125e69b-8a83-46bc-9f9b-d23390153693-kube-api-access-s5nm6\") pod \"collect-profiles-29551065-8m792\" (UID: \"9125e69b-8a83-46bc-9f9b-d23390153693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.296246 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9125e69b-8a83-46bc-9f9b-d23390153693-config-volume\") pod \"collect-profiles-29551065-8m792\" (UID: \"9125e69b-8a83-46bc-9f9b-d23390153693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.408504 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5nm6\" (UniqueName: \"kubernetes.io/projected/9125e69b-8a83-46bc-9f9b-d23390153693-kube-api-access-s5nm6\") pod \"collect-profiles-29551065-8m792\" (UID: \"9125e69b-8a83-46bc-9f9b-d23390153693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.408700 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9125e69b-8a83-46bc-9f9b-d23390153693-config-volume\") pod \"collect-profiles-29551065-8m792\" (UID: \"9125e69b-8a83-46bc-9f9b-d23390153693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.409168 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/9125e69b-8a83-46bc-9f9b-d23390153693-secret-volume\") pod \"collect-profiles-29551065-8m792\" (UID: \"9125e69b-8a83-46bc-9f9b-d23390153693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.413804 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9125e69b-8a83-46bc-9f9b-d23390153693-config-volume\") pod \"collect-profiles-29551065-8m792\" (UID: \"9125e69b-8a83-46bc-9f9b-d23390153693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.417821 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9125e69b-8a83-46bc-9f9b-d23390153693-secret-volume\") pod \"collect-profiles-29551065-8m792\" (UID: \"9125e69b-8a83-46bc-9f9b-d23390153693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.432473 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5nm6\" (UniqueName: \"kubernetes.io/projected/9125e69b-8a83-46bc-9f9b-d23390153693-kube-api-access-s5nm6\") pod \"collect-profiles-29551065-8m792\" (UID: \"9125e69b-8a83-46bc-9f9b-d23390153693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.490496 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.894751 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9a247a2-3e98-473e-ac12-587785fb8f5b" path="/var/lib/kubelet/pods/f9a247a2-3e98-473e-ac12-587785fb8f5b/volumes" Mar 09 13:45:00 crc kubenswrapper[4723]: I0309 13:45:00.987433 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792"] Mar 09 13:45:01 crc kubenswrapper[4723]: I0309 13:45:01.344734 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" event={"ID":"9125e69b-8a83-46bc-9f9b-d23390153693","Type":"ContainerStarted","Data":"24537ded5078dc0bf8c927aeb0212378e10cd41dcb3eb2d9b31b1c93eb8fce85"} Mar 09 13:45:01 crc kubenswrapper[4723]: I0309 13:45:01.345057 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" event={"ID":"9125e69b-8a83-46bc-9f9b-d23390153693","Type":"ContainerStarted","Data":"f02cc44f7d1bf0cb60137e417e59855246ca74ad6ae997634d46bb870a4d6b43"} Mar 09 13:45:01 crc kubenswrapper[4723]: I0309 13:45:01.360103 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" podStartSLOduration=1.360083399 podStartE2EDuration="1.360083399s" podCreationTimestamp="2026-03-09 13:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:45:01.358854026 +0000 UTC m=+2775.373321566" watchObservedRunningTime="2026-03-09 13:45:01.360083399 +0000 UTC m=+2775.374550929" Mar 09 13:45:01 crc kubenswrapper[4723]: I0309 13:45:01.881918 4723 scope.go:117] "RemoveContainer" 
containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:45:01 crc kubenswrapper[4723]: E0309 13:45:01.882187 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:45:02 crc kubenswrapper[4723]: I0309 13:45:02.360672 4723 generic.go:334] "Generic (PLEG): container finished" podID="9125e69b-8a83-46bc-9f9b-d23390153693" containerID="24537ded5078dc0bf8c927aeb0212378e10cd41dcb3eb2d9b31b1c93eb8fce85" exitCode=0 Mar 09 13:45:02 crc kubenswrapper[4723]: I0309 13:45:02.360761 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" event={"ID":"9125e69b-8a83-46bc-9f9b-d23390153693","Type":"ContainerDied","Data":"24537ded5078dc0bf8c927aeb0212378e10cd41dcb3eb2d9b31b1c93eb8fce85"} Mar 09 13:45:03 crc kubenswrapper[4723]: I0309 13:45:03.861135 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.010851 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9125e69b-8a83-46bc-9f9b-d23390153693-secret-volume\") pod \"9125e69b-8a83-46bc-9f9b-d23390153693\" (UID: \"9125e69b-8a83-46bc-9f9b-d23390153693\") " Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.011145 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5nm6\" (UniqueName: \"kubernetes.io/projected/9125e69b-8a83-46bc-9f9b-d23390153693-kube-api-access-s5nm6\") pod \"9125e69b-8a83-46bc-9f9b-d23390153693\" (UID: \"9125e69b-8a83-46bc-9f9b-d23390153693\") " Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.011295 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9125e69b-8a83-46bc-9f9b-d23390153693-config-volume\") pod \"9125e69b-8a83-46bc-9f9b-d23390153693\" (UID: \"9125e69b-8a83-46bc-9f9b-d23390153693\") " Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.011794 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9125e69b-8a83-46bc-9f9b-d23390153693-config-volume" (OuterVolumeSpecName: "config-volume") pod "9125e69b-8a83-46bc-9f9b-d23390153693" (UID: "9125e69b-8a83-46bc-9f9b-d23390153693"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.012343 4723 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9125e69b-8a83-46bc-9f9b-d23390153693-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.017766 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9125e69b-8a83-46bc-9f9b-d23390153693-kube-api-access-s5nm6" (OuterVolumeSpecName: "kube-api-access-s5nm6") pod "9125e69b-8a83-46bc-9f9b-d23390153693" (UID: "9125e69b-8a83-46bc-9f9b-d23390153693"). 
InnerVolumeSpecName "kube-api-access-s5nm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.018415 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9125e69b-8a83-46bc-9f9b-d23390153693-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9125e69b-8a83-46bc-9f9b-d23390153693" (UID: "9125e69b-8a83-46bc-9f9b-d23390153693"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.115203 4723 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9125e69b-8a83-46bc-9f9b-d23390153693-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.115450 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5nm6\" (UniqueName: \"kubernetes.io/projected/9125e69b-8a83-46bc-9f9b-d23390153693-kube-api-access-s5nm6\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.381802 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" event={"ID":"9125e69b-8a83-46bc-9f9b-d23390153693","Type":"ContainerDied","Data":"f02cc44f7d1bf0cb60137e417e59855246ca74ad6ae997634d46bb870a4d6b43"} Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.381909 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f02cc44f7d1bf0cb60137e417e59855246ca74ad6ae997634d46bb870a4d6b43" Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.381848 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792" Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.443876 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8"] Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.454904 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551020-rlng8"] Mar 09 13:45:04 crc kubenswrapper[4723]: I0309 13:45:04.901608 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c15fd52d-a005-4417-aaca-84839023e2b4" path="/var/lib/kubelet/pods/c15fd52d-a005-4417-aaca-84839023e2b4/volumes" Mar 09 13:45:15 crc kubenswrapper[4723]: I0309 13:45:15.881465 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:45:15 crc kubenswrapper[4723]: E0309 13:45:15.882477 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:45:25 crc kubenswrapper[4723]: I0309 13:45:25.084106 4723 scope.go:117] "RemoveContainer" containerID="071de674372c6d212f8340b6e56e0ccc976bb9a08fd325a215bdfba2286169c9" Mar 09 13:45:27 crc kubenswrapper[4723]: I0309 13:45:27.881620 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:45:27 crc 
kubenswrapper[4723]: E0309 13:45:27.882466 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:45:39 crc kubenswrapper[4723]: I0309 13:45:39.881150 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:45:40 crc kubenswrapper[4723]: I0309 13:45:40.772161 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"93ab1ab8464ceda0c7f74b2147e43d17ddb836385b932c3d4e0c352f241797bc"} Mar 09 13:46:00 crc kubenswrapper[4723]: I0309 13:46:00.146226 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551066-8rghf"] Mar 09 13:46:00 crc kubenswrapper[4723]: E0309 13:46:00.147305 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9125e69b-8a83-46bc-9f9b-d23390153693" containerName="collect-profiles" Mar 09 13:46:00 crc kubenswrapper[4723]: I0309 13:46:00.147321 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9125e69b-8a83-46bc-9f9b-d23390153693" containerName="collect-profiles" Mar 09 13:46:00 crc kubenswrapper[4723]: I0309 13:46:00.147586 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="9125e69b-8a83-46bc-9f9b-d23390153693" containerName="collect-profiles" Mar 09 13:46:00 crc kubenswrapper[4723]: I0309 13:46:00.148381 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-8rghf" Mar 09 13:46:00 crc kubenswrapper[4723]: I0309 13:46:00.152738 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:46:00 crc kubenswrapper[4723]: I0309 13:46:00.152994 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:46:00 crc kubenswrapper[4723]: I0309 13:46:00.153124 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:46:00 crc kubenswrapper[4723]: I0309 13:46:00.163392 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-8rghf"] Mar 09 13:46:00 crc kubenswrapper[4723]: I0309 13:46:00.321663 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mglsl\" (UniqueName: \"kubernetes.io/projected/3e7b78b7-e5cc-4c0f-8323-a8e46614b20f-kube-api-access-mglsl\") pod \"auto-csr-approver-29551066-8rghf\" (UID: \"3e7b78b7-e5cc-4c0f-8323-a8e46614b20f\") " pod="openshift-infra/auto-csr-approver-29551066-8rghf" Mar 09 13:46:00 crc kubenswrapper[4723]: I0309 13:46:00.424814 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mglsl\" (UniqueName: \"kubernetes.io/projected/3e7b78b7-e5cc-4c0f-8323-a8e46614b20f-kube-api-access-mglsl\") pod \"auto-csr-approver-29551066-8rghf\" (UID: \"3e7b78b7-e5cc-4c0f-8323-a8e46614b20f\") " pod="openshift-infra/auto-csr-approver-29551066-8rghf" Mar 09 13:46:00 crc kubenswrapper[4723]: I0309 13:46:00.447852 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mglsl\" (UniqueName: \"kubernetes.io/projected/3e7b78b7-e5cc-4c0f-8323-a8e46614b20f-kube-api-access-mglsl\") pod \"auto-csr-approver-29551066-8rghf\" (UID: \"3e7b78b7-e5cc-4c0f-8323-a8e46614b20f\") " pod="openshift-infra/auto-csr-approver-29551066-8rghf" Mar 09 13:46:00 crc kubenswrapper[4723]: I0309 13:46:00.508377 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-8rghf" Mar 09 13:46:00 crc kubenswrapper[4723]: I0309 13:46:00.985050 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-8rghf"] Mar 09 13:46:00 crc kubenswrapper[4723]: I0309 13:46:00.987046 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:46:02 crc kubenswrapper[4723]: I0309 13:46:02.002790 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-8rghf" event={"ID":"3e7b78b7-e5cc-4c0f-8323-a8e46614b20f","Type":"ContainerStarted","Data":"6972de2595b601cf0e9a9b6e1c9960596ac13bf0412dca7b7ad05ab5f6947dc2"} Mar 09 13:46:03 crc kubenswrapper[4723]: I0309 13:46:03.042897 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-8rghf" event={"ID":"3e7b78b7-e5cc-4c0f-8323-a8e46614b20f","Type":"ContainerStarted","Data":"283b29033938169f34f6e693dda910293bc405c4aaf5386fb690744c3364bdd6"} Mar 09 13:46:03 crc kubenswrapper[4723]: I0309 13:46:03.063297 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551066-8rghf" podStartSLOduration=2.072201088 podStartE2EDuration="3.063263259s" podCreationTimestamp="2026-03-09 13:46:00 +0000 UTC" firstStartedPulling="2026-03-09 13:46:00.986522766 +0000 UTC m=+2835.000990326" lastFinishedPulling="2026-03-09 13:46:01.977584957 +0000 UTC m=+2835.992052497" observedRunningTime="2026-03-09 13:46:03.056637672 +0000 UTC m=+2837.071105212" watchObservedRunningTime="2026-03-09 13:46:03.063263259 +0000 UTC m=+2837.077730799" Mar 09 13:46:04 crc kubenswrapper[4723]: I0309 13:46:04.053830 4723 generic.go:334] "Generic (PLEG): container finished" podID="3e7b78b7-e5cc-4c0f-8323-a8e46614b20f" containerID="283b29033938169f34f6e693dda910293bc405c4aaf5386fb690744c3364bdd6" exitCode=0 Mar 09 13:46:04 crc kubenswrapper[4723]: I0309 13:46:04.053894 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-8rghf" event={"ID":"3e7b78b7-e5cc-4c0f-8323-a8e46614b20f","Type":"ContainerDied","Data":"283b29033938169f34f6e693dda910293bc405c4aaf5386fb690744c3364bdd6"} Mar 09 13:46:05 crc kubenswrapper[4723]: I0309 13:46:05.478114 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-8rghf" Mar 09 13:46:05 crc kubenswrapper[4723]: I0309 13:46:05.664843 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mglsl\" (UniqueName: \"kubernetes.io/projected/3e7b78b7-e5cc-4c0f-8323-a8e46614b20f-kube-api-access-mglsl\") pod \"3e7b78b7-e5cc-4c0f-8323-a8e46614b20f\" (UID: \"3e7b78b7-e5cc-4c0f-8323-a8e46614b20f\") " Mar 09 13:46:05 crc kubenswrapper[4723]: I0309 13:46:05.674140 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e7b78b7-e5cc-4c0f-8323-a8e46614b20f-kube-api-access-mglsl" (OuterVolumeSpecName: "kube-api-access-mglsl") pod "3e7b78b7-e5cc-4c0f-8323-a8e46614b20f" (UID: "3e7b78b7-e5cc-4c0f-8323-a8e46614b20f"). InnerVolumeSpecName "kube-api-access-mglsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:46:05 crc kubenswrapper[4723]: I0309 13:46:05.768782 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mglsl\" (UniqueName: \"kubernetes.io/projected/3e7b78b7-e5cc-4c0f-8323-a8e46614b20f-kube-api-access-mglsl\") on node \"crc\" DevicePath \"\"" Mar 09 13:46:06 crc kubenswrapper[4723]: I0309 13:46:06.080023 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-8rghf" event={"ID":"3e7b78b7-e5cc-4c0f-8323-a8e46614b20f","Type":"ContainerDied","Data":"6972de2595b601cf0e9a9b6e1c9960596ac13bf0412dca7b7ad05ab5f6947dc2"} Mar 09 13:46:06 crc kubenswrapper[4723]: I0309 13:46:06.080324 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6972de2595b601cf0e9a9b6e1c9960596ac13bf0412dca7b7ad05ab5f6947dc2" Mar 09 13:46:06 crc kubenswrapper[4723]: I0309 13:46:06.080112 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-8rghf" Mar 09 13:46:06 crc kubenswrapper[4723]: I0309 13:46:06.139623 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-gppnd"] Mar 09 13:46:06 crc kubenswrapper[4723]: I0309 13:46:06.152760 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-gppnd"] Mar 09 13:46:06 crc kubenswrapper[4723]: I0309 13:46:06.897191 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c87d396-1f7f-4388-a238-686ce16cfc80" path="/var/lib/kubelet/pods/6c87d396-1f7f-4388-a238-686ce16cfc80/volumes" Mar 09 13:46:25 crc kubenswrapper[4723]: I0309 13:46:25.204201 4723 scope.go:117] "RemoveContainer" containerID="526c7407dbe1cea2e487ded638c31a27b44b03b9caa4897314f20f1a6d7ab913" Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.314540 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rh2kc"] Mar 09 13:46:36 crc kubenswrapper[4723]: E0309 13:46:36.316082 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7b78b7-e5cc-4c0f-8323-a8e46614b20f" containerName="oc" Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.316098 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7b78b7-e5cc-4c0f-8323-a8e46614b20f" containerName="oc" Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.316794 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7b78b7-e5cc-4c0f-8323-a8e46614b20f" containerName="oc" Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.318584 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rh2kc" Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.328116 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rh2kc"] Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.435118 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq9w8\" (UniqueName: \"kubernetes.io/projected/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-kube-api-access-qq9w8\") pod \"community-operators-rh2kc\" (UID: \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\") " pod="openshift-marketplace/community-operators-rh2kc" Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.435196 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-utilities\") pod \"community-operators-rh2kc\" (UID: \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\") " pod="openshift-marketplace/community-operators-rh2kc" Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.435254 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-catalog-content\") pod \"community-operators-rh2kc\" (UID: \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\") " pod="openshift-marketplace/community-operators-rh2kc" Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.538321 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq9w8\" (UniqueName: \"kubernetes.io/projected/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-kube-api-access-qq9w8\") pod \"community-operators-rh2kc\" (UID: \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\") " pod="openshift-marketplace/community-operators-rh2kc" Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.538391 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-utilities\") pod \"community-operators-rh2kc\" (UID: \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\") " pod="openshift-marketplace/community-operators-rh2kc" Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.538455 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-catalog-content\") pod \"community-operators-rh2kc\" (UID: \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\") " pod="openshift-marketplace/community-operators-rh2kc" Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.538932 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-utilities\") pod \"community-operators-rh2kc\" (UID: \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\") " pod="openshift-marketplace/community-operators-rh2kc" Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.539081 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-catalog-content\") pod \"community-operators-rh2kc\" (UID: \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\") " pod="openshift-marketplace/community-operators-rh2kc" Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.573244 4723 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qq9w8\" (UniqueName: \"kubernetes.io/projected/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-kube-api-access-qq9w8\") pod \"community-operators-rh2kc\" (UID: \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\") " pod="openshift-marketplace/community-operators-rh2kc" Mar 09 13:46:36 crc kubenswrapper[4723]: I0309 13:46:36.649962 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rh2kc" Mar 09 13:46:37 crc kubenswrapper[4723]: I0309 13:46:37.264497 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rh2kc"] Mar 09 13:46:37 crc kubenswrapper[4723]: I0309 13:46:37.745915 4723 generic.go:334] "Generic (PLEG): container finished" podID="3d267d40-e820-44e5-bbcb-1e7a5785bb1b" containerID="b32807659e04e5593ab1e86aabc64195959e7922d998a6243ef50503e76df474" exitCode=0 Mar 09 13:46:37 crc kubenswrapper[4723]: I0309 13:46:37.746101 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh2kc" event={"ID":"3d267d40-e820-44e5-bbcb-1e7a5785bb1b","Type":"ContainerDied","Data":"b32807659e04e5593ab1e86aabc64195959e7922d998a6243ef50503e76df474"} Mar 09 13:46:37 crc kubenswrapper[4723]: I0309 13:46:37.746209 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh2kc" event={"ID":"3d267d40-e820-44e5-bbcb-1e7a5785bb1b","Type":"ContainerStarted","Data":"cad32ebe175a4365d073fa2b6afac14995271c46535a587a1c20aa9a1db759dc"} Mar 09 13:46:37 crc kubenswrapper[4723]: I0309 13:46:37.750978 4723 generic.go:334] "Generic (PLEG): container finished" podID="76b26f74-a654-4507-a416-617f8fec3d89" containerID="c7c4fcf237bfe36dc971d6eb8bfda2d7d5fa71791e08ad06e10dc704a2c57028" exitCode=0 Mar 09 13:46:37 crc kubenswrapper[4723]: I0309 13:46:37.751050 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb" event={"ID":"76b26f74-a654-4507-a416-617f8fec3d89","Type":"ContainerDied","Data":"c7c4fcf237bfe36dc971d6eb8bfda2d7d5fa71791e08ad06e10dc704a2c57028"} Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.234044 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb" Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.300784 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-telemetry-combined-ca-bundle\") pod \"76b26f74-a654-4507-a416-617f8fec3d89\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.300939 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ssh-key-openstack-edpm-ipam\") pod \"76b26f74-a654-4507-a416-617f8fec3d89\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.301048 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-1\") pod \"76b26f74-a654-4507-a416-617f8fec3d89\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.301082 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-0\") pod \"76b26f74-a654-4507-a416-617f8fec3d89\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.301294 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-inventory\") pod \"76b26f74-a654-4507-a416-617f8fec3d89\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.301325 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-2\") pod \"76b26f74-a654-4507-a416-617f8fec3d89\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.301386 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtg5m\" (UniqueName: \"kubernetes.io/projected/76b26f74-a654-4507-a416-617f8fec3d89-kube-api-access-qtg5m\") pod \"76b26f74-a654-4507-a416-617f8fec3d89\" (UID: \"76b26f74-a654-4507-a416-617f8fec3d89\") " Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.308373 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "76b26f74-a654-4507-a416-617f8fec3d89" (UID: "76b26f74-a654-4507-a416-617f8fec3d89"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.308474 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76b26f74-a654-4507-a416-617f8fec3d89-kube-api-access-qtg5m" (OuterVolumeSpecName: "kube-api-access-qtg5m") pod "76b26f74-a654-4507-a416-617f8fec3d89" (UID: "76b26f74-a654-4507-a416-617f8fec3d89"). InnerVolumeSpecName "kube-api-access-qtg5m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.342129 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "76b26f74-a654-4507-a416-617f8fec3d89" (UID: "76b26f74-a654-4507-a416-617f8fec3d89"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.342582 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "76b26f74-a654-4507-a416-617f8fec3d89" (UID: "76b26f74-a654-4507-a416-617f8fec3d89"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.346095 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-inventory" (OuterVolumeSpecName: "inventory") pod "76b26f74-a654-4507-a416-617f8fec3d89" (UID: "76b26f74-a654-4507-a416-617f8fec3d89"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.352818 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "76b26f74-a654-4507-a416-617f8fec3d89" (UID: "76b26f74-a654-4507-a416-617f8fec3d89"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.353028 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "76b26f74-a654-4507-a416-617f8fec3d89" (UID: "76b26f74-a654-4507-a416-617f8fec3d89"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.403845 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.403888 4723 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.403900 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtg5m\" (UniqueName: \"kubernetes.io/projected/76b26f74-a654-4507-a416-617f8fec3d89-kube-api-access-qtg5m\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.403909 4723 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.403919 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.403928 4723 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.403936 4723 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/76b26f74-a654-4507-a416-617f8fec3d89-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.779440 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh2kc" event={"ID":"3d267d40-e820-44e5-bbcb-1e7a5785bb1b","Type":"ContainerStarted","Data":"4d1598f9cfdc1675fb7eeacfb586dcf2d325e8ec1e4d9b635d243f2ec08d6178"}
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.784018 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb" event={"ID":"76b26f74-a654-4507-a416-617f8fec3d89","Type":"ContainerDied","Data":"6777ffa62689f686c2fbddfb306682ca0d836988649172d5e95ad07e4ea713c8"}
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.784064 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6777ffa62689f686c2fbddfb306682ca0d836988649172d5e95ad07e4ea713c8"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.784132 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.884012 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"]
Mar 09 13:46:39 crc kubenswrapper[4723]: E0309 13:46:39.884512 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b26f74-a654-4507-a416-617f8fec3d89" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.884529 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b26f74-a654-4507-a416-617f8fec3d89" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.884746 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="76b26f74-a654-4507-a416-617f8fec3d89" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.885532 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.888470 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.888773 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.890259 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.901057 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.905257 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.914417 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.914454 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4m2h\" (UniqueName: \"kubernetes.io/projected/2f686f3c-fee2-4853-8ab5-459d64696efc-kube-api-access-p4m2h\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.914491 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.914524 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.914649 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.914668 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.914714 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:39 crc kubenswrapper[4723]: I0309 13:46:39.927077 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"]
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.017277 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.017323 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4m2h\" (UniqueName: \"kubernetes.io/projected/2f686f3c-fee2-4853-8ab5-459d64696efc-kube-api-access-p4m2h\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.017357 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.017387 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.017487 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.017507 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.017548 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.022466 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.037321 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.037716 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.037852 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.038030 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.038384 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.040633 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4m2h\" (UniqueName: \"kubernetes.io/projected/2f686f3c-fee2-4853-8ab5-459d64696efc-kube-api-access-p4m2h\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.206629 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.757829 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"]
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.795652 4723 generic.go:334] "Generic (PLEG): container finished" podID="3d267d40-e820-44e5-bbcb-1e7a5785bb1b" containerID="4d1598f9cfdc1675fb7eeacfb586dcf2d325e8ec1e4d9b635d243f2ec08d6178" exitCode=0
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.795743 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh2kc" event={"ID":"3d267d40-e820-44e5-bbcb-1e7a5785bb1b","Type":"ContainerDied","Data":"4d1598f9cfdc1675fb7eeacfb586dcf2d325e8ec1e4d9b635d243f2ec08d6178"}
Mar 09 13:46:40 crc kubenswrapper[4723]: I0309 13:46:40.801666 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw" event={"ID":"2f686f3c-fee2-4853-8ab5-459d64696efc","Type":"ContainerStarted","Data":"6a39454c105c734c3aaf9cacbd659980efc6d8d97a9098c528f33d4f8beee249"}
Mar 09 13:46:41 crc kubenswrapper[4723]: I0309 13:46:41.820610 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw" event={"ID":"2f686f3c-fee2-4853-8ab5-459d64696efc","Type":"ContainerStarted","Data":"fdc16f5f410c6184eecfcc759ff39adf9c05d8268e2686acdcde7506eff8e112"}
Mar 09 13:46:41 crc kubenswrapper[4723]: I0309 13:46:41.824053 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh2kc" event={"ID":"3d267d40-e820-44e5-bbcb-1e7a5785bb1b","Type":"ContainerStarted","Data":"9d3b0317ee28985b81b5e1b70abf8659398902477f508836897fe5801674d315"}
Mar 09 13:46:41 crc kubenswrapper[4723]: I0309 13:46:41.856643 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw" podStartSLOduration=2.364997426 podStartE2EDuration="2.856623569s" podCreationTimestamp="2026-03-09 13:46:39 +0000 UTC" firstStartedPulling="2026-03-09 13:46:40.760471279 +0000 UTC m=+2874.774938809" lastFinishedPulling="2026-03-09 13:46:41.252097402 +0000 UTC m=+2875.266564952" observedRunningTime="2026-03-09 13:46:41.842593236 +0000 UTC m=+2875.857060776" watchObservedRunningTime="2026-03-09 13:46:41.856623569 +0000 UTC m=+2875.871091109"
Mar 09 13:46:41 crc kubenswrapper[4723]: I0309 13:46:41.869464 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rh2kc" podStartSLOduration=2.39272293 podStartE2EDuration="5.86944796s" podCreationTimestamp="2026-03-09 13:46:36 +0000 UTC" firstStartedPulling="2026-03-09 13:46:37.748801725 +0000 UTC m=+2871.763269265" lastFinishedPulling="2026-03-09 13:46:41.225526755 +0000 UTC m=+2875.239994295" observedRunningTime="2026-03-09 13:46:41.868876165 +0000 UTC m=+2875.883343715" watchObservedRunningTime="2026-03-09 13:46:41.86944796 +0000 UTC m=+2875.883915500"
Mar 09 13:46:46 crc kubenswrapper[4723]: I0309 13:46:46.650750 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rh2kc"
Mar 09 13:46:46 crc kubenswrapper[4723]: I0309 13:46:46.651245 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rh2kc"
Mar 09 13:46:46 crc kubenswrapper[4723]: I0309 13:46:46.698410 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rh2kc"
Mar 09 13:46:46 crc kubenswrapper[4723]: I0309 13:46:46.935011 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rh2kc"
Mar 09 13:46:46 crc kubenswrapper[4723]: I0309 13:46:46.990891 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rh2kc"]
Mar 09 13:46:48 crc kubenswrapper[4723]: I0309 13:46:48.897602 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rh2kc" podUID="3d267d40-e820-44e5-bbcb-1e7a5785bb1b" containerName="registry-server" containerID="cri-o://9d3b0317ee28985b81b5e1b70abf8659398902477f508836897fe5801674d315" gracePeriod=2
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.481373 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rh2kc"
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.559364 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-utilities\") pod \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\" (UID: \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\") "
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.559680 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-catalog-content\") pod \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\" (UID: \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\") "
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.559831 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq9w8\" (UniqueName: \"kubernetes.io/projected/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-kube-api-access-qq9w8\") pod \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\" (UID: \"3d267d40-e820-44e5-bbcb-1e7a5785bb1b\") "
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.560532 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-utilities" (OuterVolumeSpecName: "utilities") pod "3d267d40-e820-44e5-bbcb-1e7a5785bb1b" (UID: "3d267d40-e820-44e5-bbcb-1e7a5785bb1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.568109 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-kube-api-access-qq9w8" (OuterVolumeSpecName: "kube-api-access-qq9w8") pod "3d267d40-e820-44e5-bbcb-1e7a5785bb1b" (UID: "3d267d40-e820-44e5-bbcb-1e7a5785bb1b"). InnerVolumeSpecName "kube-api-access-qq9w8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.612625 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d267d40-e820-44e5-bbcb-1e7a5785bb1b" (UID: "3d267d40-e820-44e5-bbcb-1e7a5785bb1b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.663423 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.663510 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.663550 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq9w8\" (UniqueName: \"kubernetes.io/projected/3d267d40-e820-44e5-bbcb-1e7a5785bb1b-kube-api-access-qq9w8\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.912592 4723 generic.go:334] "Generic (PLEG): container finished" podID="3d267d40-e820-44e5-bbcb-1e7a5785bb1b" containerID="9d3b0317ee28985b81b5e1b70abf8659398902477f508836897fe5801674d315" exitCode=0
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.912658 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh2kc" event={"ID":"3d267d40-e820-44e5-bbcb-1e7a5785bb1b","Type":"ContainerDied","Data":"9d3b0317ee28985b81b5e1b70abf8659398902477f508836897fe5801674d315"}
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.912664 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rh2kc"
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.912688 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh2kc" event={"ID":"3d267d40-e820-44e5-bbcb-1e7a5785bb1b","Type":"ContainerDied","Data":"cad32ebe175a4365d073fa2b6afac14995271c46535a587a1c20aa9a1db759dc"}
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.912706 4723 scope.go:117] "RemoveContainer" containerID="9d3b0317ee28985b81b5e1b70abf8659398902477f508836897fe5801674d315"
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.954490 4723 scope.go:117] "RemoveContainer" containerID="4d1598f9cfdc1675fb7eeacfb586dcf2d325e8ec1e4d9b635d243f2ec08d6178"
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.969700 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rh2kc"]
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.983092 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rh2kc"]
Mar 09 13:46:49 crc kubenswrapper[4723]: I0309 13:46:49.994309 4723 scope.go:117] "RemoveContainer" containerID="b32807659e04e5593ab1e86aabc64195959e7922d998a6243ef50503e76df474"
Mar 09 13:46:50 crc kubenswrapper[4723]: I0309 13:46:50.037066 4723 scope.go:117] "RemoveContainer" containerID="9d3b0317ee28985b81b5e1b70abf8659398902477f508836897fe5801674d315"
Mar 09 13:46:50 crc kubenswrapper[4723]: E0309 13:46:50.037449 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d3b0317ee28985b81b5e1b70abf8659398902477f508836897fe5801674d315\": container with ID starting with 9d3b0317ee28985b81b5e1b70abf8659398902477f508836897fe5801674d315 not found: ID does not exist" containerID="9d3b0317ee28985b81b5e1b70abf8659398902477f508836897fe5801674d315"
Mar 09 13:46:50 crc kubenswrapper[4723]: I0309 13:46:50.037484 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3b0317ee28985b81b5e1b70abf8659398902477f508836897fe5801674d315"} err="failed to get container status \"9d3b0317ee28985b81b5e1b70abf8659398902477f508836897fe5801674d315\": rpc error: code = NotFound desc = could not find container \"9d3b0317ee28985b81b5e1b70abf8659398902477f508836897fe5801674d315\": container with ID starting with 9d3b0317ee28985b81b5e1b70abf8659398902477f508836897fe5801674d315 not found: ID does not exist"
Mar 09 13:46:50 crc kubenswrapper[4723]: I0309 13:46:50.037503 4723 scope.go:117] "RemoveContainer" containerID="4d1598f9cfdc1675fb7eeacfb586dcf2d325e8ec1e4d9b635d243f2ec08d6178"
Mar 09 13:46:50 crc kubenswrapper[4723]: E0309 13:46:50.037731 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1598f9cfdc1675fb7eeacfb586dcf2d325e8ec1e4d9b635d243f2ec08d6178\": container with ID starting with 4d1598f9cfdc1675fb7eeacfb586dcf2d325e8ec1e4d9b635d243f2ec08d6178 not found: ID does not exist" containerID="4d1598f9cfdc1675fb7eeacfb586dcf2d325e8ec1e4d9b635d243f2ec08d6178"
Mar 09 13:46:50 crc kubenswrapper[4723]: I0309 13:46:50.037754 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1598f9cfdc1675fb7eeacfb586dcf2d325e8ec1e4d9b635d243f2ec08d6178"} err="failed to get container status \"4d1598f9cfdc1675fb7eeacfb586dcf2d325e8ec1e4d9b635d243f2ec08d6178\": rpc error: code = NotFound desc = could not find container \"4d1598f9cfdc1675fb7eeacfb586dcf2d325e8ec1e4d9b635d243f2ec08d6178\": container with ID starting with 4d1598f9cfdc1675fb7eeacfb586dcf2d325e8ec1e4d9b635d243f2ec08d6178 not found: ID does not exist"
Mar 09 13:46:50 crc kubenswrapper[4723]: I0309 13:46:50.037772 4723 scope.go:117] "RemoveContainer" containerID="b32807659e04e5593ab1e86aabc64195959e7922d998a6243ef50503e76df474"
Mar 09 13:46:50 crc kubenswrapper[4723]: E0309 13:46:50.037984 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b32807659e04e5593ab1e86aabc64195959e7922d998a6243ef50503e76df474\": container with ID starting with b32807659e04e5593ab1e86aabc64195959e7922d998a6243ef50503e76df474 not found: ID does not exist" containerID="b32807659e04e5593ab1e86aabc64195959e7922d998a6243ef50503e76df474"
Mar 09 13:46:50 crc kubenswrapper[4723]: I0309 13:46:50.038011 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b32807659e04e5593ab1e86aabc64195959e7922d998a6243ef50503e76df474"} err="failed to get container status \"b32807659e04e5593ab1e86aabc64195959e7922d998a6243ef50503e76df474\": rpc error: code = NotFound desc = could not find container \"b32807659e04e5593ab1e86aabc64195959e7922d998a6243ef50503e76df474\": container with ID starting with b32807659e04e5593ab1e86aabc64195959e7922d998a6243ef50503e76df474 not found: ID does not exist"
Mar 09 13:46:50 crc kubenswrapper[4723]: I0309 13:46:50.893149 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d267d40-e820-44e5-bbcb-1e7a5785bb1b" path="/var/lib/kubelet/pods/3d267d40-e820-44e5-bbcb-1e7a5785bb1b/volumes"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.648296 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k5qqr"]
Mar 09 13:47:42 crc kubenswrapper[4723]: E0309 13:47:42.649104 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d267d40-e820-44e5-bbcb-1e7a5785bb1b" containerName="registry-server"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.649117 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d267d40-e820-44e5-bbcb-1e7a5785bb1b" containerName="registry-server"
Mar 09 13:47:42 crc kubenswrapper[4723]: E0309 13:47:42.649160 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d267d40-e820-44e5-bbcb-1e7a5785bb1b" containerName="extract-utilities"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.649166 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d267d40-e820-44e5-bbcb-1e7a5785bb1b" containerName="extract-utilities"
Mar 09 13:47:42 crc kubenswrapper[4723]: E0309 13:47:42.649181 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d267d40-e820-44e5-bbcb-1e7a5785bb1b" containerName="extract-content"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.649187 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d267d40-e820-44e5-bbcb-1e7a5785bb1b" containerName="extract-content"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.649390 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d267d40-e820-44e5-bbcb-1e7a5785bb1b" containerName="registry-server"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.651023 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.658269 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5qqr"]
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.796936 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac45286-a4ab-4203-ab44-f02320088c84-utilities\") pod \"redhat-marketplace-k5qqr\" (UID: \"eac45286-a4ab-4203-ab44-f02320088c84\") " pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.796989 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac45286-a4ab-4203-ab44-f02320088c84-catalog-content\") pod \"redhat-marketplace-k5qqr\" (UID: \"eac45286-a4ab-4203-ab44-f02320088c84\") " pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.797326 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdjgh\" (UniqueName: \"kubernetes.io/projected/eac45286-a4ab-4203-ab44-f02320088c84-kube-api-access-fdjgh\") pod \"redhat-marketplace-k5qqr\" (UID: \"eac45286-a4ab-4203-ab44-f02320088c84\") " pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.900160 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac45286-a4ab-4203-ab44-f02320088c84-utilities\") pod \"redhat-marketplace-k5qqr\" (UID: \"eac45286-a4ab-4203-ab44-f02320088c84\") " pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.900233 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac45286-a4ab-4203-ab44-f02320088c84-catalog-content\") pod \"redhat-marketplace-k5qqr\" (UID: \"eac45286-a4ab-4203-ab44-f02320088c84\") " pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.900517 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdjgh\" (UniqueName: \"kubernetes.io/projected/eac45286-a4ab-4203-ab44-f02320088c84-kube-api-access-fdjgh\") pod \"redhat-marketplace-k5qqr\" (UID: \"eac45286-a4ab-4203-ab44-f02320088c84\") " pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.900786 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac45286-a4ab-4203-ab44-f02320088c84-utilities\") pod \"redhat-marketplace-k5qqr\" (UID: \"eac45286-a4ab-4203-ab44-f02320088c84\") " pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.901249 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac45286-a4ab-4203-ab44-f02320088c84-catalog-content\") pod \"redhat-marketplace-k5qqr\" (UID: \"eac45286-a4ab-4203-ab44-f02320088c84\") " pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.938519 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdjgh\" (UniqueName: \"kubernetes.io/projected/eac45286-a4ab-4203-ab44-f02320088c84-kube-api-access-fdjgh\") pod \"redhat-marketplace-k5qqr\" (UID: \"eac45286-a4ab-4203-ab44-f02320088c84\") " pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:42 crc kubenswrapper[4723]: I0309 13:47:42.973522 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:43 crc kubenswrapper[4723]: I0309 13:47:43.566436 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5qqr"]
Mar 09 13:47:43 crc kubenswrapper[4723]: I0309 13:47:43.584875 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5qqr" event={"ID":"eac45286-a4ab-4203-ab44-f02320088c84","Type":"ContainerStarted","Data":"a5365270ea519f54815138a22740f9478b413488835dc7867c85fb8137ae9aa6"}
Mar 09 13:47:44 crc kubenswrapper[4723]: I0309 13:47:44.595722 4723 generic.go:334] "Generic (PLEG): container finished" podID="eac45286-a4ab-4203-ab44-f02320088c84" containerID="47dd1aa9871ca1c5052e1a4546cd7b7efba8632279e1b0f2dac38d7354aaff4a" exitCode=0
Mar 09 13:47:44 crc kubenswrapper[4723]: I0309 13:47:44.595792 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5qqr" event={"ID":"eac45286-a4ab-4203-ab44-f02320088c84","Type":"ContainerDied","Data":"47dd1aa9871ca1c5052e1a4546cd7b7efba8632279e1b0f2dac38d7354aaff4a"}
Mar 09 13:47:45 crc kubenswrapper[4723]: I0309 13:47:45.611599 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5qqr" event={"ID":"eac45286-a4ab-4203-ab44-f02320088c84","Type":"ContainerStarted","Data":"a4e6ea3cb95f6669c89a1ab425e73b0f138e66a47cb90539d25e93f666e36bef"}
Mar 09 13:47:46 crc kubenswrapper[4723]: I0309 13:47:46.622505 4723 generic.go:334] "Generic (PLEG): container finished" podID="eac45286-a4ab-4203-ab44-f02320088c84" containerID="a4e6ea3cb95f6669c89a1ab425e73b0f138e66a47cb90539d25e93f666e36bef" exitCode=0
Mar 09 13:47:46 crc kubenswrapper[4723]: I0309 13:47:46.622570 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5qqr" event={"ID":"eac45286-a4ab-4203-ab44-f02320088c84","Type":"ContainerDied","Data":"a4e6ea3cb95f6669c89a1ab425e73b0f138e66a47cb90539d25e93f666e36bef"}
Mar 09 13:47:47 crc kubenswrapper[4723]: I0309 13:47:47.636928 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5qqr" event={"ID":"eac45286-a4ab-4203-ab44-f02320088c84","Type":"ContainerStarted","Data":"8f3b9d9eb4b33960c13453315a108df433d7a4c109de9d35aee64320bb7d3057"}
Mar 09 13:47:47 crc kubenswrapper[4723]: I0309 13:47:47.660351 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k5qqr" podStartSLOduration=3.069236571 podStartE2EDuration="5.660332482s" podCreationTimestamp="2026-03-09 13:47:42 +0000 UTC" firstStartedPulling="2026-03-09 13:47:44.59762599 +0000 UTC m=+2938.612093520" lastFinishedPulling="2026-03-09 13:47:47.188721891 +0000 UTC m=+2941.203189431" observedRunningTime="2026-03-09 13:47:47.655923294 +0000 UTC m=+2941.670390844" watchObservedRunningTime="2026-03-09 13:47:47.660332482 +0000 UTC m=+2941.674800032"
Mar 09 13:47:52 crc kubenswrapper[4723]: I0309 13:47:52.973712 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:52 crc kubenswrapper[4723]: I0309 13:47:52.974245 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:53 crc kubenswrapper[4723]: I0309 13:47:53.023796 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:53 crc kubenswrapper[4723]: I0309 13:47:53.747210 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:53 crc kubenswrapper[4723]: I0309 13:47:53.799709 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5qqr"]
Mar 09 13:47:55 crc kubenswrapper[4723]: I0309 13:47:55.720557 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k5qqr" podUID="eac45286-a4ab-4203-ab44-f02320088c84" containerName="registry-server" containerID="cri-o://8f3b9d9eb4b33960c13453315a108df433d7a4c109de9d35aee64320bb7d3057" gracePeriod=2
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.228562 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.343950 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdjgh\" (UniqueName: \"kubernetes.io/projected/eac45286-a4ab-4203-ab44-f02320088c84-kube-api-access-fdjgh\") pod \"eac45286-a4ab-4203-ab44-f02320088c84\" (UID: \"eac45286-a4ab-4203-ab44-f02320088c84\") "
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.344061 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac45286-a4ab-4203-ab44-f02320088c84-utilities\") pod \"eac45286-a4ab-4203-ab44-f02320088c84\" (UID: \"eac45286-a4ab-4203-ab44-f02320088c84\") "
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.344206 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac45286-a4ab-4203-ab44-f02320088c84-catalog-content\") pod \"eac45286-a4ab-4203-ab44-f02320088c84\" (UID: \"eac45286-a4ab-4203-ab44-f02320088c84\") "
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.344854 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eac45286-a4ab-4203-ab44-f02320088c84-utilities" (OuterVolumeSpecName: "utilities") pod "eac45286-a4ab-4203-ab44-f02320088c84" (UID: "eac45286-a4ab-4203-ab44-f02320088c84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.353035 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eac45286-a4ab-4203-ab44-f02320088c84-kube-api-access-fdjgh" (OuterVolumeSpecName: "kube-api-access-fdjgh") pod "eac45286-a4ab-4203-ab44-f02320088c84" (UID: "eac45286-a4ab-4203-ab44-f02320088c84"). InnerVolumeSpecName "kube-api-access-fdjgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.370744 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eac45286-a4ab-4203-ab44-f02320088c84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eac45286-a4ab-4203-ab44-f02320088c84" (UID: "eac45286-a4ab-4203-ab44-f02320088c84"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.447110 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdjgh\" (UniqueName: \"kubernetes.io/projected/eac45286-a4ab-4203-ab44-f02320088c84-kube-api-access-fdjgh\") on node \"crc\" DevicePath \"\""
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.447141 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac45286-a4ab-4203-ab44-f02320088c84-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.447153 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac45286-a4ab-4203-ab44-f02320088c84-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.741541 4723 generic.go:334] "Generic (PLEG): container finished" podID="eac45286-a4ab-4203-ab44-f02320088c84" containerID="8f3b9d9eb4b33960c13453315a108df433d7a4c109de9d35aee64320bb7d3057" exitCode=0
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.741595 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5qqr" event={"ID":"eac45286-a4ab-4203-ab44-f02320088c84","Type":"ContainerDied","Data":"8f3b9d9eb4b33960c13453315a108df433d7a4c109de9d35aee64320bb7d3057"}
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.741639 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5qqr" event={"ID":"eac45286-a4ab-4203-ab44-f02320088c84","Type":"ContainerDied","Data":"a5365270ea519f54815138a22740f9478b413488835dc7867c85fb8137ae9aa6"}
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.741664 4723 scope.go:117] "RemoveContainer" containerID="8f3b9d9eb4b33960c13453315a108df433d7a4c109de9d35aee64320bb7d3057"
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.741673 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5qqr"
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.783777 4723 scope.go:117] "RemoveContainer" containerID="a4e6ea3cb95f6669c89a1ab425e73b0f138e66a47cb90539d25e93f666e36bef"
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.810984 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5qqr"]
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.817939 4723 scope.go:117] "RemoveContainer" containerID="47dd1aa9871ca1c5052e1a4546cd7b7efba8632279e1b0f2dac38d7354aaff4a"
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.832564 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5qqr"]
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.914762 4723 scope.go:117] "RemoveContainer" containerID="8f3b9d9eb4b33960c13453315a108df433d7a4c109de9d35aee64320bb7d3057"
Mar 09 13:47:56 crc kubenswrapper[4723]: E0309 13:47:56.916305 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f3b9d9eb4b33960c13453315a108df433d7a4c109de9d35aee64320bb7d3057\": container with ID starting with 8f3b9d9eb4b33960c13453315a108df433d7a4c109de9d35aee64320bb7d3057 not found: ID does not exist" containerID="8f3b9d9eb4b33960c13453315a108df433d7a4c109de9d35aee64320bb7d3057"
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.916348 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3b9d9eb4b33960c13453315a108df433d7a4c109de9d35aee64320bb7d3057"} err="failed to get container status \"8f3b9d9eb4b33960c13453315a108df433d7a4c109de9d35aee64320bb7d3057\": rpc error: code = NotFound desc = could not find container \"8f3b9d9eb4b33960c13453315a108df433d7a4c109de9d35aee64320bb7d3057\": container with ID starting with 8f3b9d9eb4b33960c13453315a108df433d7a4c109de9d35aee64320bb7d3057 not found: ID does not exist"
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.916374 4723 scope.go:117] "RemoveContainer" containerID="a4e6ea3cb95f6669c89a1ab425e73b0f138e66a47cb90539d25e93f666e36bef"
Mar 09 13:47:56 crc kubenswrapper[4723]: E0309 13:47:56.916718 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4e6ea3cb95f6669c89a1ab425e73b0f138e66a47cb90539d25e93f666e36bef\": container with ID starting with a4e6ea3cb95f6669c89a1ab425e73b0f138e66a47cb90539d25e93f666e36bef not found: ID does not exist" containerID="a4e6ea3cb95f6669c89a1ab425e73b0f138e66a47cb90539d25e93f666e36bef"
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.916875 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4e6ea3cb95f6669c89a1ab425e73b0f138e66a47cb90539d25e93f666e36bef"} err="failed to get container status \"a4e6ea3cb95f6669c89a1ab425e73b0f138e66a47cb90539d25e93f666e36bef\": rpc error: code = NotFound desc = could not find container \"a4e6ea3cb95f6669c89a1ab425e73b0f138e66a47cb90539d25e93f666e36bef\": container with ID starting with a4e6ea3cb95f6669c89a1ab425e73b0f138e66a47cb90539d25e93f666e36bef not found: ID does not exist"
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.916986 4723 scope.go:117] "RemoveContainer" containerID="47dd1aa9871ca1c5052e1a4546cd7b7efba8632279e1b0f2dac38d7354aaff4a"
Mar 09 13:47:56 crc kubenswrapper[4723]: E0309 13:47:56.917994 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47dd1aa9871ca1c5052e1a4546cd7b7efba8632279e1b0f2dac38d7354aaff4a\": container with ID starting with 47dd1aa9871ca1c5052e1a4546cd7b7efba8632279e1b0f2dac38d7354aaff4a not found: ID does not exist" containerID="47dd1aa9871ca1c5052e1a4546cd7b7efba8632279e1b0f2dac38d7354aaff4a"
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.918020 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47dd1aa9871ca1c5052e1a4546cd7b7efba8632279e1b0f2dac38d7354aaff4a"} err="failed to get container status \"47dd1aa9871ca1c5052e1a4546cd7b7efba8632279e1b0f2dac38d7354aaff4a\": rpc error: code = NotFound desc = could not find container \"47dd1aa9871ca1c5052e1a4546cd7b7efba8632279e1b0f2dac38d7354aaff4a\": container with ID starting with 47dd1aa9871ca1c5052e1a4546cd7b7efba8632279e1b0f2dac38d7354aaff4a not found: ID does not exist"
Mar 09 13:47:56 crc kubenswrapper[4723]: I0309 13:47:56.925102 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eac45286-a4ab-4203-ab44-f02320088c84" path="/var/lib/kubelet/pods/eac45286-a4ab-4203-ab44-f02320088c84/volumes"
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.155722 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551068-k78fk"]
Mar 09 13:48:00 crc kubenswrapper[4723]: E0309 13:48:00.156774 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac45286-a4ab-4203-ab44-f02320088c84" containerName="registry-server"
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.156790 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac45286-a4ab-4203-ab44-f02320088c84" containerName="registry-server"
Mar 09 13:48:00 crc kubenswrapper[4723]: E0309 13:48:00.156834 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac45286-a4ab-4203-ab44-f02320088c84" containerName="extract-content"
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.156843 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac45286-a4ab-4203-ab44-f02320088c84" containerName="extract-content"
Mar 09 13:48:00 crc kubenswrapper[4723]: E0309 13:48:00.156879 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac45286-a4ab-4203-ab44-f02320088c84" containerName="extract-utilities"
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.156888 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac45286-a4ab-4203-ab44-f02320088c84" containerName="extract-utilities"
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.157164 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac45286-a4ab-4203-ab44-f02320088c84" containerName="registry-server"
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.158168 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-k78fk"
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.160118 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.160432 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.160655 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.198004 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-k78fk"]
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.259118 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv2fp\" (UniqueName: \"kubernetes.io/projected/efc86c47-59ae-4e46-94c7-138f4cadfd66-kube-api-access-kv2fp\") pod \"auto-csr-approver-29551068-k78fk\" (UID: \"efc86c47-59ae-4e46-94c7-138f4cadfd66\") " pod="openshift-infra/auto-csr-approver-29551068-k78fk"
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.361510 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv2fp\" (UniqueName: \"kubernetes.io/projected/efc86c47-59ae-4e46-94c7-138f4cadfd66-kube-api-access-kv2fp\") pod \"auto-csr-approver-29551068-k78fk\" (UID: \"efc86c47-59ae-4e46-94c7-138f4cadfd66\") " pod="openshift-infra/auto-csr-approver-29551068-k78fk"
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.383479 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv2fp\" (UniqueName: \"kubernetes.io/projected/efc86c47-59ae-4e46-94c7-138f4cadfd66-kube-api-access-kv2fp\") pod \"auto-csr-approver-29551068-k78fk\" (UID: \"efc86c47-59ae-4e46-94c7-138f4cadfd66\") " pod="openshift-infra/auto-csr-approver-29551068-k78fk"
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.481681 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-k78fk"
Mar 09 13:48:00 crc kubenswrapper[4723]: I0309 13:48:00.983038 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-k78fk"]
Mar 09 13:48:01 crc kubenswrapper[4723]: I0309 13:48:01.794655 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551068-k78fk" event={"ID":"efc86c47-59ae-4e46-94c7-138f4cadfd66","Type":"ContainerStarted","Data":"7bb150ee91a9408343968632c7ea8c923e9865128b6fd85d2174d64e7ec1d8e4"}
Mar 09 13:48:02 crc kubenswrapper[4723]: I0309 13:48:02.818279 4723 generic.go:334] "Generic (PLEG): container finished" podID="efc86c47-59ae-4e46-94c7-138f4cadfd66" containerID="295e3428eebc7f6b12046c952c84970ed0331a3c09a0a8fc2a3f7f74222ac18d" exitCode=0
Mar 09 13:48:02 crc kubenswrapper[4723]: I0309 13:48:02.818551 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551068-k78fk" event={"ID":"efc86c47-59ae-4e46-94c7-138f4cadfd66","Type":"ContainerDied","Data":"295e3428eebc7f6b12046c952c84970ed0331a3c09a0a8fc2a3f7f74222ac18d"}
Mar 09 13:48:03 crc kubenswrapper[4723]: I0309 13:48:03.948811 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:48:03 crc kubenswrapper[4723]: I0309 13:48:03.949153 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:48:04 crc kubenswrapper[4723]: I0309 13:48:04.245013 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-k78fk"
Mar 09 13:48:04 crc kubenswrapper[4723]: I0309 13:48:04.350639 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv2fp\" (UniqueName: \"kubernetes.io/projected/efc86c47-59ae-4e46-94c7-138f4cadfd66-kube-api-access-kv2fp\") pod \"efc86c47-59ae-4e46-94c7-138f4cadfd66\" (UID: \"efc86c47-59ae-4e46-94c7-138f4cadfd66\") "
Mar 09 13:48:04 crc kubenswrapper[4723]: I0309 13:48:04.363081 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc86c47-59ae-4e46-94c7-138f4cadfd66-kube-api-access-kv2fp" (OuterVolumeSpecName: "kube-api-access-kv2fp") pod "efc86c47-59ae-4e46-94c7-138f4cadfd66" (UID: "efc86c47-59ae-4e46-94c7-138f4cadfd66"). InnerVolumeSpecName "kube-api-access-kv2fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:48:04 crc kubenswrapper[4723]: I0309 13:48:04.453337 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv2fp\" (UniqueName: \"kubernetes.io/projected/efc86c47-59ae-4e46-94c7-138f4cadfd66-kube-api-access-kv2fp\") on node \"crc\" DevicePath \"\""
Mar 09 13:48:04 crc kubenswrapper[4723]: I0309 13:48:04.845928 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551068-k78fk" event={"ID":"efc86c47-59ae-4e46-94c7-138f4cadfd66","Type":"ContainerDied","Data":"7bb150ee91a9408343968632c7ea8c923e9865128b6fd85d2174d64e7ec1d8e4"}
Mar 09 13:48:04 crc kubenswrapper[4723]: I0309 13:48:04.845997 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bb150ee91a9408343968632c7ea8c923e9865128b6fd85d2174d64e7ec1d8e4"
Mar 09 13:48:04 crc kubenswrapper[4723]: I0309 13:48:04.846076 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-k78fk"
Mar 09 13:48:05 crc kubenswrapper[4723]: I0309 13:48:05.321716 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-6nggw"]
Mar 09 13:48:05 crc kubenswrapper[4723]: I0309 13:48:05.358550 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-6nggw"]
Mar 09 13:48:06 crc kubenswrapper[4723]: I0309 13:48:06.898816 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a9bd7d-dc03-4b2c-9d9d-673da5110b61" path="/var/lib/kubelet/pods/07a9bd7d-dc03-4b2c-9d9d-673da5110b61/volumes"
Mar 09 13:48:25 crc kubenswrapper[4723]: I0309 13:48:25.359323 4723 scope.go:117] "RemoveContainer" containerID="c2eef59af587150b8d74827db9f407add65326c4259d10f929c1f0c8e2e7cc01"
Mar 09 13:48:33 crc kubenswrapper[4723]: I0309 13:48:33.947363 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:48:33 crc kubenswrapper[4723]: I0309 13:48:33.947920 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:48:39 crc kubenswrapper[4723]: I0309 13:48:39.209913 4723 generic.go:334] "Generic (PLEG): container finished" podID="2f686f3c-fee2-4853-8ab5-459d64696efc" containerID="fdc16f5f410c6184eecfcc759ff39adf9c05d8268e2686acdcde7506eff8e112" exitCode=0
Mar 09 13:48:39 crc kubenswrapper[4723]: I0309 13:48:39.210006 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw" event={"ID":"2f686f3c-fee2-4853-8ab5-459d64696efc","Type":"ContainerDied","Data":"fdc16f5f410c6184eecfcc759ff39adf9c05d8268e2686acdcde7506eff8e112"}
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:40.800753 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw"
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:40.973505 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-inventory\") pod \"2f686f3c-fee2-4853-8ab5-459d64696efc\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") "
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:40.973614 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ssh-key-openstack-edpm-ipam\") pod \"2f686f3c-fee2-4853-8ab5-459d64696efc\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") "
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:40.973732 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-1\") pod \"2f686f3c-fee2-4853-8ab5-459d64696efc\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") "
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:40.973888 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-telemetry-power-monitoring-combined-ca-bundle\") pod \"2f686f3c-fee2-4853-8ab5-459d64696efc\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") "
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:40.974072 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-0\") pod \"2f686f3c-fee2-4853-8ab5-459d64696efc\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") "
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:40.974109 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-2\") pod \"2f686f3c-fee2-4853-8ab5-459d64696efc\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") "
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:40.974183 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4m2h\" (UniqueName: \"kubernetes.io/projected/2f686f3c-fee2-4853-8ab5-459d64696efc-kube-api-access-p4m2h\") pod \"2f686f3c-fee2-4853-8ab5-459d64696efc\" (UID: \"2f686f3c-fee2-4853-8ab5-459d64696efc\") "
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:40.980069 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f686f3c-fee2-4853-8ab5-459d64696efc-kube-api-access-p4m2h" (OuterVolumeSpecName: "kube-api-access-p4m2h") pod "2f686f3c-fee2-4853-8ab5-459d64696efc" (UID: "2f686f3c-fee2-4853-8ab5-459d64696efc"). InnerVolumeSpecName "kube-api-access-p4m2h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:40.983266 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "2f686f3c-fee2-4853-8ab5-459d64696efc" (UID: "2f686f3c-fee2-4853-8ab5-459d64696efc"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.009449 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "2f686f3c-fee2-4853-8ab5-459d64696efc" (UID: "2f686f3c-fee2-4853-8ab5-459d64696efc"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.019457 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-inventory" (OuterVolumeSpecName: "inventory") pod "2f686f3c-fee2-4853-8ab5-459d64696efc" (UID: "2f686f3c-fee2-4853-8ab5-459d64696efc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.025695 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2f686f3c-fee2-4853-8ab5-459d64696efc" (UID: "2f686f3c-fee2-4853-8ab5-459d64696efc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.026033 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "2f686f3c-fee2-4853-8ab5-459d64696efc" (UID: "2f686f3c-fee2-4853-8ab5-459d64696efc"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.030638 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "2f686f3c-fee2-4853-8ab5-459d64696efc" (UID: "2f686f3c-fee2-4853-8ab5-459d64696efc"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.077480 4723 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\""
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.077517 4723 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\""
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.077531 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4m2h\" (UniqueName: \"kubernetes.io/projected/2f686f3c-fee2-4853-8ab5-459d64696efc-kube-api-access-p4m2h\") on node \"crc\" DevicePath \"\""
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.077545 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.077559 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.077571 4723 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\""
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.077583 4723 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f686f3c-fee2-4853-8ab5-459d64696efc-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.230498 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw" event={"ID":"2f686f3c-fee2-4853-8ab5-459d64696efc","Type":"ContainerDied","Data":"6a39454c105c734c3aaf9cacbd659980efc6d8d97a9098c528f33d4f8beee249"}
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.230540 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a39454c105c734c3aaf9cacbd659980efc6d8d97a9098c528f33d4f8beee249"
Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.230593 4723 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.385591 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb"] Mar 09 13:48:41 crc kubenswrapper[4723]: E0309 13:48:41.386158 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f686f3c-fee2-4853-8ab5-459d64696efc" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.386189 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f686f3c-fee2-4853-8ab5-459d64696efc" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 09 13:48:41 crc kubenswrapper[4723]: E0309 13:48:41.386228 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc86c47-59ae-4e46-94c7-138f4cadfd66" containerName="oc" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.386266 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc86c47-59ae-4e46-94c7-138f4cadfd66" containerName="oc" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.386792 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc86c47-59ae-4e46-94c7-138f4cadfd66" containerName="oc" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.386822 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f686f3c-fee2-4853-8ab5-459d64696efc" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.387681 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.390908 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gw7vt" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.391556 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.392124 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.392243 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.404285 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.416927 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb"] Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.489284 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.489376 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" 
(UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.489417 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.489458 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r77b\" (UniqueName: \"kubernetes.io/projected/b024d678-4342-4754-86de-89af19e8153a-kube-api-access-8r77b\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.489517 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.591786 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.591957 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.592065 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.592167 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r77b\" (UniqueName: \"kubernetes.io/projected/b024d678-4342-4754-86de-89af19e8153a-kube-api-access-8r77b\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.592277 4723 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.598627 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.598641 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.598680 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.599069 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.614892 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r77b\" (UniqueName: \"kubernetes.io/projected/b024d678-4342-4754-86de-89af19e8153a-kube-api-access-8r77b\") pod \"logging-edpm-deployment-openstack-edpm-ipam-db7kb\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:41 crc kubenswrapper[4723]: I0309 13:48:41.716503 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:42 crc kubenswrapper[4723]: I0309 13:48:42.289498 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb"] Mar 09 13:48:43 crc kubenswrapper[4723]: I0309 13:48:43.257762 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" event={"ID":"b024d678-4342-4754-86de-89af19e8153a","Type":"ContainerStarted","Data":"6d0aa8e0dc82b92d87250efa4d8d3dd46f1e32cd96a0a42f2f39012e4d1718ea"} Mar 09 13:48:43 crc kubenswrapper[4723]: I0309 13:48:43.258183 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" event={"ID":"b024d678-4342-4754-86de-89af19e8153a","Type":"ContainerStarted","Data":"11ebcf18ac3aeefa3e7446d49ed093c2082cbe37acc67d8f084ccaa718bc50a2"} Mar 09 13:48:43 crc kubenswrapper[4723]: I0309 13:48:43.289744 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" podStartSLOduration=1.834777575 podStartE2EDuration="2.289720031s" podCreationTimestamp="2026-03-09 13:48:41 +0000 UTC" firstStartedPulling="2026-03-09 13:48:42.281396719 +0000 UTC m=+2996.295864269" lastFinishedPulling="2026-03-09 13:48:42.736339185 +0000 UTC m=+2996.750806725" observedRunningTime="2026-03-09 13:48:43.274851356 +0000 UTC m=+2997.289318936" watchObservedRunningTime="2026-03-09 13:48:43.289720031 +0000 UTC m=+2997.304187581" Mar 09 13:48:57 crc kubenswrapper[4723]: I0309 13:48:57.451275 4723 generic.go:334] "Generic (PLEG): container finished" podID="b024d678-4342-4754-86de-89af19e8153a" containerID="6d0aa8e0dc82b92d87250efa4d8d3dd46f1e32cd96a0a42f2f39012e4d1718ea" exitCode=0 Mar 09 13:48:57 crc kubenswrapper[4723]: I0309 13:48:57.451332 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" event={"ID":"b024d678-4342-4754-86de-89af19e8153a","Type":"ContainerDied","Data":"6d0aa8e0dc82b92d87250efa4d8d3dd46f1e32cd96a0a42f2f39012e4d1718ea"} Mar 09 13:48:58 crc kubenswrapper[4723]: I0309 13:48:58.943635 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.013900 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-ssh-key-openstack-edpm-ipam\") pod \"b024d678-4342-4754-86de-89af19e8153a\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.013941 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-logging-compute-config-data-0\") pod \"b024d678-4342-4754-86de-89af19e8153a\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.013964 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-logging-compute-config-data-1\") pod \"b024d678-4342-4754-86de-89af19e8153a\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.014116 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r77b\" (UniqueName: \"kubernetes.io/projected/b024d678-4342-4754-86de-89af19e8153a-kube-api-access-8r77b\") pod \"b024d678-4342-4754-86de-89af19e8153a\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.014805 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-inventory\") pod \"b024d678-4342-4754-86de-89af19e8153a\" (UID: \"b024d678-4342-4754-86de-89af19e8153a\") " Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.028236 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b024d678-4342-4754-86de-89af19e8153a-kube-api-access-8r77b" (OuterVolumeSpecName: "kube-api-access-8r77b") pod "b024d678-4342-4754-86de-89af19e8153a" (UID: "b024d678-4342-4754-86de-89af19e8153a"). InnerVolumeSpecName "kube-api-access-8r77b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.057834 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b024d678-4342-4754-86de-89af19e8153a" (UID: "b024d678-4342-4754-86de-89af19e8153a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.064070 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "b024d678-4342-4754-86de-89af19e8153a" (UID: "b024d678-4342-4754-86de-89af19e8153a"). InnerVolumeSpecName "logging-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.066332 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-inventory" (OuterVolumeSpecName: "inventory") pod "b024d678-4342-4754-86de-89af19e8153a" (UID: "b024d678-4342-4754-86de-89af19e8153a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.091517 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "b024d678-4342-4754-86de-89af19e8153a" (UID: "b024d678-4342-4754-86de-89af19e8153a"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.116999 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r77b\" (UniqueName: \"kubernetes.io/projected/b024d678-4342-4754-86de-89af19e8153a-kube-api-access-8r77b\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.117172 4723 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.117232 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.117289 4723 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.117342 4723 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b024d678-4342-4754-86de-89af19e8153a-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.478327 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" event={"ID":"b024d678-4342-4754-86de-89af19e8153a","Type":"ContainerDied","Data":"11ebcf18ac3aeefa3e7446d49ed093c2082cbe37acc67d8f084ccaa718bc50a2"} Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.478624 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11ebcf18ac3aeefa3e7446d49ed093c2082cbe37acc67d8f084ccaa718bc50a2" Mar 09 13:48:59 crc kubenswrapper[4723]: I0309 13:48:59.478684 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-db7kb" Mar 09 13:49:03 crc kubenswrapper[4723]: I0309 13:49:03.946451 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:49:03 crc kubenswrapper[4723]: I0309 13:49:03.946981 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:49:03 crc kubenswrapper[4723]: I0309 13:49:03.947025 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:49:03 crc kubenswrapper[4723]: I0309 13:49:03.947881 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93ab1ab8464ceda0c7f74b2147e43d17ddb836385b932c3d4e0c352f241797bc"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:49:03 crc kubenswrapper[4723]: I0309 13:49:03.947934 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://93ab1ab8464ceda0c7f74b2147e43d17ddb836385b932c3d4e0c352f241797bc" gracePeriod=600 Mar 09 13:49:04 crc kubenswrapper[4723]: I0309 13:49:04.536986 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="93ab1ab8464ceda0c7f74b2147e43d17ddb836385b932c3d4e0c352f241797bc" exitCode=0 Mar 09 13:49:04 crc kubenswrapper[4723]: I0309 13:49:04.537066 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"93ab1ab8464ceda0c7f74b2147e43d17ddb836385b932c3d4e0c352f241797bc"} Mar 09 13:49:04 crc kubenswrapper[4723]: I0309 13:49:04.537776 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"} Mar 09 13:49:04 crc kubenswrapper[4723]: I0309 13:49:04.537844 4723 scope.go:117] "RemoveContainer" containerID="ec7109e644ba6b7891bd72977a28590072eb643f158e140ac3846f72eb7be55a" Mar 09 13:50:00 crc kubenswrapper[4723]: I0309 13:50:00.161156 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551070-zdjk2"] Mar 09 13:50:00 crc kubenswrapper[4723]: E0309 13:50:00.162182 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b024d678-4342-4754-86de-89af19e8153a" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 09 13:50:00 crc kubenswrapper[4723]: I0309 13:50:00.162196 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="b024d678-4342-4754-86de-89af19e8153a" 
containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 09 13:50:00 crc kubenswrapper[4723]: I0309 13:50:00.162398 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="b024d678-4342-4754-86de-89af19e8153a" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 09 13:50:00 crc kubenswrapper[4723]: I0309 13:50:00.163347 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-zdjk2" Mar 09 13:50:00 crc kubenswrapper[4723]: I0309 13:50:00.173250 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-zdjk2"] Mar 09 13:50:00 crc kubenswrapper[4723]: I0309 13:50:00.192893 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:50:00 crc kubenswrapper[4723]: I0309 13:50:00.192997 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:50:00 crc kubenswrapper[4723]: I0309 13:50:00.193719 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:50:00 crc kubenswrapper[4723]: I0309 13:50:00.238363 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g4k6\" (UniqueName: \"kubernetes.io/projected/3183843f-294d-45f6-b7e7-2e931faa3035-kube-api-access-8g4k6\") pod \"auto-csr-approver-29551070-zdjk2\" (UID: \"3183843f-294d-45f6-b7e7-2e931faa3035\") " pod="openshift-infra/auto-csr-approver-29551070-zdjk2" Mar 09 13:50:00 crc kubenswrapper[4723]: I0309 13:50:00.342128 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g4k6\" (UniqueName: \"kubernetes.io/projected/3183843f-294d-45f6-b7e7-2e931faa3035-kube-api-access-8g4k6\") pod \"auto-csr-approver-29551070-zdjk2\" (UID: \"3183843f-294d-45f6-b7e7-2e931faa3035\") " pod="openshift-infra/auto-csr-approver-29551070-zdjk2" Mar 09 13:50:00 crc kubenswrapper[4723]: I0309 13:50:00.363711 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g4k6\" (UniqueName: \"kubernetes.io/projected/3183843f-294d-45f6-b7e7-2e931faa3035-kube-api-access-8g4k6\") pod \"auto-csr-approver-29551070-zdjk2\" (UID: \"3183843f-294d-45f6-b7e7-2e931faa3035\") " pod="openshift-infra/auto-csr-approver-29551070-zdjk2" Mar 09 13:50:00 crc kubenswrapper[4723]: I0309 13:50:00.530515 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-zdjk2" Mar 09 13:50:01 crc kubenswrapper[4723]: I0309 13:50:01.027791 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-zdjk2"] Mar 09 13:50:01 crc kubenswrapper[4723]: I0309 13:50:01.137581 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551070-zdjk2" event={"ID":"3183843f-294d-45f6-b7e7-2e931faa3035","Type":"ContainerStarted","Data":"dc6fa3c47f852611b3b990b3579d1c25c5b0fe575764965feb20eb24265e3b29"} Mar 09 13:50:03 crc kubenswrapper[4723]: I0309 13:50:03.159158 4723 generic.go:334] "Generic (PLEG): container finished" podID="3183843f-294d-45f6-b7e7-2e931faa3035" containerID="788ad5f2279320daafd418e59ceb3bda13bbbaf8b8f5133152c484cc05727c19" exitCode=0 Mar 09 13:50:03 crc kubenswrapper[4723]: I0309 13:50:03.159839 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551070-zdjk2" event={"ID":"3183843f-294d-45f6-b7e7-2e931faa3035","Type":"ContainerDied","Data":"788ad5f2279320daafd418e59ceb3bda13bbbaf8b8f5133152c484cc05727c19"} Mar 09 13:50:04 crc kubenswrapper[4723]: I0309 13:50:04.613273 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-zdjk2" Mar 09 13:50:04 crc kubenswrapper[4723]: I0309 13:50:04.673446 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g4k6\" (UniqueName: \"kubernetes.io/projected/3183843f-294d-45f6-b7e7-2e931faa3035-kube-api-access-8g4k6\") pod \"3183843f-294d-45f6-b7e7-2e931faa3035\" (UID: \"3183843f-294d-45f6-b7e7-2e931faa3035\") " Mar 09 13:50:04 crc kubenswrapper[4723]: I0309 13:50:04.691394 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3183843f-294d-45f6-b7e7-2e931faa3035-kube-api-access-8g4k6" (OuterVolumeSpecName: "kube-api-access-8g4k6") pod "3183843f-294d-45f6-b7e7-2e931faa3035" (UID: "3183843f-294d-45f6-b7e7-2e931faa3035"). InnerVolumeSpecName "kube-api-access-8g4k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:50:04 crc kubenswrapper[4723]: I0309 13:50:04.778004 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g4k6\" (UniqueName: \"kubernetes.io/projected/3183843f-294d-45f6-b7e7-2e931faa3035-kube-api-access-8g4k6\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:05 crc kubenswrapper[4723]: I0309 13:50:05.186688 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551070-zdjk2" event={"ID":"3183843f-294d-45f6-b7e7-2e931faa3035","Type":"ContainerDied","Data":"dc6fa3c47f852611b3b990b3579d1c25c5b0fe575764965feb20eb24265e3b29"} Mar 09 13:50:05 crc kubenswrapper[4723]: I0309 13:50:05.186742 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc6fa3c47f852611b3b990b3579d1c25c5b0fe575764965feb20eb24265e3b29" Mar 09 13:50:05 crc kubenswrapper[4723]: I0309 13:50:05.186845 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-zdjk2" Mar 09 13:50:05 crc kubenswrapper[4723]: I0309 13:50:05.684611 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551064-vff59"] Mar 09 13:50:05 crc kubenswrapper[4723]: I0309 13:50:05.697439 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551064-vff59"] Mar 09 13:50:06 crc kubenswrapper[4723]: I0309 13:50:06.895159 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc9de4ea-678a-4fc1-ad89-09afe0581584" path="/var/lib/kubelet/pods/bc9de4ea-678a-4fc1-ad89-09afe0581584/volumes" Mar 09 13:50:25 crc kubenswrapper[4723]: I0309 13:50:25.486051 4723 scope.go:117] "RemoveContainer" containerID="ea165a7614d5ccfad94bd5ad5343af2df093a51c50c89bbe998b7a79f284d26b" Mar 09 13:50:29 crc kubenswrapper[4723]: E0309 13:50:29.825394 4723 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.129:53174->38.102.83.129:35705: write tcp 38.102.83.129:53174->38.102.83.129:35705: write: connection reset by peer Mar 09 13:51:33 crc kubenswrapper[4723]: I0309 13:51:33.947488 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:51:33 crc kubenswrapper[4723]: I0309 13:51:33.948062 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:52:00 crc kubenswrapper[4723]: I0309 13:52:00.190885 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551072-hvd2s"] Mar 09 13:52:00 crc kubenswrapper[4723]: E0309 13:52:00.191831 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3183843f-294d-45f6-b7e7-2e931faa3035" containerName="oc" Mar 09 13:52:00 crc kubenswrapper[4723]: I0309 13:52:00.191842 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="3183843f-294d-45f6-b7e7-2e931faa3035" containerName="oc" Mar 09 13:52:00 crc kubenswrapper[4723]: I0309 13:52:00.192131 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="3183843f-294d-45f6-b7e7-2e931faa3035" containerName="oc" Mar 09 13:52:00 crc kubenswrapper[4723]: I0309 13:52:00.192961 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-hvd2s" Mar 09 13:52:00 crc kubenswrapper[4723]: I0309 13:52:00.195779 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:52:00 crc kubenswrapper[4723]: I0309 13:52:00.195924 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:52:00 crc kubenswrapper[4723]: I0309 13:52:00.196521 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:52:00 crc kubenswrapper[4723]: I0309 13:52:00.205198 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551072-hvd2s"] Mar 09 13:52:00 crc kubenswrapper[4723]: I0309 13:52:00.303786 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm6c8\" (UniqueName: \"kubernetes.io/projected/02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc-kube-api-access-sm6c8\") pod \"auto-csr-approver-29551072-hvd2s\" (UID: \"02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc\") " pod="openshift-infra/auto-csr-approver-29551072-hvd2s" Mar 09 13:52:00 crc kubenswrapper[4723]: I0309 13:52:00.406224 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm6c8\" (UniqueName: \"kubernetes.io/projected/02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc-kube-api-access-sm6c8\") pod \"auto-csr-approver-29551072-hvd2s\" (UID: \"02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc\") " pod="openshift-infra/auto-csr-approver-29551072-hvd2s" Mar 09 13:52:00 crc kubenswrapper[4723]: I0309 13:52:00.427193 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm6c8\" (UniqueName: \"kubernetes.io/projected/02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc-kube-api-access-sm6c8\") pod \"auto-csr-approver-29551072-hvd2s\" (UID: \"02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc\") " pod="openshift-infra/auto-csr-approver-29551072-hvd2s" Mar 09 13:52:00 crc kubenswrapper[4723]: I0309 13:52:00.529622 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-hvd2s" Mar 09 13:52:01 crc kubenswrapper[4723]: I0309 13:52:01.023375 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551072-hvd2s"] Mar 09 13:52:01 crc kubenswrapper[4723]: I0309 13:52:01.030788 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:52:01 crc kubenswrapper[4723]: I0309 13:52:01.459455 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551072-hvd2s" event={"ID":"02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc","Type":"ContainerStarted","Data":"bafb659c04bc5d9c6aa6dae21a660fcc30768334be136c2050b7aa8affc39db5"} Mar 09 13:52:02 crc kubenswrapper[4723]: I0309 13:52:02.481570 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551072-hvd2s" event={"ID":"02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc","Type":"ContainerStarted","Data":"e023e9169c07f629dd606ba94c869181b888da0df2ab409cf6a2986843d640fe"} Mar 09 13:52:02 crc kubenswrapper[4723]: I0309 13:52:02.538120 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551072-hvd2s" podStartSLOduration=1.58868217 podStartE2EDuration="2.538095204s" podCreationTimestamp="2026-03-09 13:52:00 +0000 UTC" firstStartedPulling="2026-03-09 13:52:01.030579548 +0000 UTC m=+3195.045047088" lastFinishedPulling="2026-03-09 13:52:01.979992582 +0000 UTC m=+3195.994460122" observedRunningTime="2026-03-09 13:52:02.521193444 +0000 UTC m=+3196.535660984" watchObservedRunningTime="2026-03-09 13:52:02.538095204 +0000 UTC m=+3196.552562744" Mar 09 13:52:03 crc kubenswrapper[4723]: I0309 13:52:03.499520 4723 generic.go:334] "Generic (PLEG): container finished" podID="02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc" containerID="e023e9169c07f629dd606ba94c869181b888da0df2ab409cf6a2986843d640fe" exitCode=0 Mar 09 13:52:03 crc kubenswrapper[4723]: I0309 13:52:03.499586 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551072-hvd2s" event={"ID":"02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc","Type":"ContainerDied","Data":"e023e9169c07f629dd606ba94c869181b888da0df2ab409cf6a2986843d640fe"} Mar 09 13:52:03 crc kubenswrapper[4723]: I0309 13:52:03.946784 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:52:03 crc kubenswrapper[4723]: I0309 13:52:03.947176 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:52:04 crc kubenswrapper[4723]: I0309 13:52:04.968520 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-hvd2s" Mar 09 13:52:05 crc kubenswrapper[4723]: I0309 13:52:05.030269 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm6c8\" (UniqueName: \"kubernetes.io/projected/02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc-kube-api-access-sm6c8\") pod \"02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc\" (UID: \"02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc\") " Mar 09 13:52:05 crc kubenswrapper[4723]: I0309 13:52:05.041112 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc-kube-api-access-sm6c8" (OuterVolumeSpecName: "kube-api-access-sm6c8") pod "02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc" (UID: "02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc"). InnerVolumeSpecName "kube-api-access-sm6c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:52:05 crc kubenswrapper[4723]: I0309 13:52:05.133406 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm6c8\" (UniqueName: \"kubernetes.io/projected/02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc-kube-api-access-sm6c8\") on node \"crc\" DevicePath \"\"" Mar 09 13:52:05 crc kubenswrapper[4723]: I0309 13:52:05.530364 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551072-hvd2s" event={"ID":"02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc","Type":"ContainerDied","Data":"bafb659c04bc5d9c6aa6dae21a660fcc30768334be136c2050b7aa8affc39db5"} Mar 09 13:52:05 crc kubenswrapper[4723]: I0309 13:52:05.530415 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bafb659c04bc5d9c6aa6dae21a660fcc30768334be136c2050b7aa8affc39db5" Mar 09 13:52:05 crc kubenswrapper[4723]: I0309 13:52:05.530473 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-hvd2s" Mar 09 13:52:06 crc kubenswrapper[4723]: I0309 13:52:06.064681 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-8rghf"] Mar 09 13:52:06 crc kubenswrapper[4723]: I0309 13:52:06.082702 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-8rghf"] Mar 09 13:52:06 crc kubenswrapper[4723]: I0309 13:52:06.897634 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e7b78b7-e5cc-4c0f-8323-a8e46614b20f" path="/var/lib/kubelet/pods/3e7b78b7-e5cc-4c0f-8323-a8e46614b20f/volumes" Mar 09 13:52:25 crc kubenswrapper[4723]: I0309 13:52:25.582446 4723 scope.go:117] "RemoveContainer" containerID="283b29033938169f34f6e693dda910293bc405c4aaf5386fb690744c3364bdd6" Mar 09 13:52:33 crc kubenswrapper[4723]: I0309 13:52:33.946678 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:52:33 crc kubenswrapper[4723]: I0309 13:52:33.947282 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:52:33 crc kubenswrapper[4723]: I0309 13:52:33.947328 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 13:52:33 crc kubenswrapper[4723]: I0309 13:52:33.948150 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:52:33 crc kubenswrapper[4723]: I0309 13:52:33.948209 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658" gracePeriod=600 Mar 09 13:52:34 crc kubenswrapper[4723]: E0309 13:52:34.098600 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:52:34 crc kubenswrapper[4723]: I0309 13:52:34.827058 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658" exitCode=0 Mar 09 13:52:34 crc kubenswrapper[4723]: I0309 13:52:34.827150 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"} Mar 09 13:52:34 crc kubenswrapper[4723]: I0309 13:52:34.827336 4723 scope.go:117] "RemoveContainer" containerID="93ab1ab8464ceda0c7f74b2147e43d17ddb836385b932c3d4e0c352f241797bc" Mar 09 13:52:34 crc kubenswrapper[4723]: I0309 13:52:34.829139 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658" Mar 09 13:52:34 crc kubenswrapper[4723]: E0309 13:52:34.829740 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:52:45 crc kubenswrapper[4723]: I0309 13:52:45.881336 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658" Mar 09 13:52:45 crc kubenswrapper[4723]: E0309 13:52:45.883366 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:52:59 crc kubenswrapper[4723]: I0309 13:52:59.880810 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658" Mar 09 13:52:59 crc kubenswrapper[4723]: E0309 13:52:59.881507 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:53:14 crc kubenswrapper[4723]: I0309 13:53:14.881722 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658" Mar 09 13:53:14 crc kubenswrapper[4723]: E0309 13:53:14.882454 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:53:28 crc kubenswrapper[4723]: I0309 13:53:28.882548 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658" Mar 09 13:53:28 crc kubenswrapper[4723]: E0309 13:53:28.883263 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:53:42 crc kubenswrapper[4723]: I0309 13:53:42.880955 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658" Mar 09 13:53:42 crc kubenswrapper[4723]: E0309 13:53:42.881846 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:53:56 crc kubenswrapper[4723]: I0309 13:53:56.888437 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658" Mar 09 13:53:56 crc kubenswrapper[4723]: E0309 13:53:56.889794 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 13:54:00 crc kubenswrapper[4723]: I0309 13:54:00.146153 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551074-cxwsk"] Mar 09 13:54:00 crc kubenswrapper[4723]: E0309 13:54:00.147591 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc" containerName="oc" Mar 09 13:54:00 crc kubenswrapper[4723]: I0309 13:54:00.147608 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc" containerName="oc" Mar 09 13:54:00 crc kubenswrapper[4723]: I0309 13:54:00.147964 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc" containerName="oc" Mar 09 13:54:00 crc kubenswrapper[4723]: I0309 13:54:00.149005 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-cxwsk"
Mar 09 13:54:00 crc kubenswrapper[4723]: I0309 13:54:00.152270 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:54:00 crc kubenswrapper[4723]: I0309 13:54:00.152498 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:54:00 crc kubenswrapper[4723]: I0309 13:54:00.152575 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 13:54:00 crc kubenswrapper[4723]: I0309 13:54:00.158138 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551074-cxwsk"]
Mar 09 13:54:00 crc kubenswrapper[4723]: I0309 13:54:00.264237 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t8r8\" (UniqueName: \"kubernetes.io/projected/111c3f03-b05b-4dc8-9449-8453cafd181d-kube-api-access-2t8r8\") pod \"auto-csr-approver-29551074-cxwsk\" (UID: \"111c3f03-b05b-4dc8-9449-8453cafd181d\") " pod="openshift-infra/auto-csr-approver-29551074-cxwsk"
Mar 09 13:54:00 crc kubenswrapper[4723]: I0309 13:54:00.366732 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t8r8\" (UniqueName: \"kubernetes.io/projected/111c3f03-b05b-4dc8-9449-8453cafd181d-kube-api-access-2t8r8\") pod \"auto-csr-approver-29551074-cxwsk\" (UID: \"111c3f03-b05b-4dc8-9449-8453cafd181d\") " pod="openshift-infra/auto-csr-approver-29551074-cxwsk"
Mar 09 13:54:00 crc kubenswrapper[4723]: I0309 13:54:00.389681 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t8r8\" (UniqueName: \"kubernetes.io/projected/111c3f03-b05b-4dc8-9449-8453cafd181d-kube-api-access-2t8r8\") pod \"auto-csr-approver-29551074-cxwsk\" (UID: \"111c3f03-b05b-4dc8-9449-8453cafd181d\") " pod="openshift-infra/auto-csr-approver-29551074-cxwsk"
Mar 09 13:54:00 crc kubenswrapper[4723]: I0309 13:54:00.474851 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-cxwsk"
Mar 09 13:54:00 crc kubenswrapper[4723]: I0309 13:54:00.989643 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551074-cxwsk"]
Mar 09 13:54:01 crc kubenswrapper[4723]: I0309 13:54:01.746523 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551074-cxwsk" event={"ID":"111c3f03-b05b-4dc8-9449-8453cafd181d","Type":"ContainerStarted","Data":"3acf44d9e9cbb637f3a817c87347fc8b690a58fa61758a71d96042dfd2f1756b"}
Mar 09 13:54:02 crc kubenswrapper[4723]: I0309 13:54:02.794776 4723 generic.go:334] "Generic (PLEG): container finished" podID="111c3f03-b05b-4dc8-9449-8453cafd181d" containerID="e4757d8409c7b613154a2d525330e6f6606954a7811656a386cba12fdf833c44" exitCode=0
Mar 09 13:54:02 crc kubenswrapper[4723]: I0309 13:54:02.794906 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551074-cxwsk" event={"ID":"111c3f03-b05b-4dc8-9449-8453cafd181d","Type":"ContainerDied","Data":"e4757d8409c7b613154a2d525330e6f6606954a7811656a386cba12fdf833c44"}
Mar 09 13:54:04 crc kubenswrapper[4723]: I0309 13:54:04.242458 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-cxwsk"
Mar 09 13:54:04 crc kubenswrapper[4723]: I0309 13:54:04.381811 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t8r8\" (UniqueName: \"kubernetes.io/projected/111c3f03-b05b-4dc8-9449-8453cafd181d-kube-api-access-2t8r8\") pod \"111c3f03-b05b-4dc8-9449-8453cafd181d\" (UID: \"111c3f03-b05b-4dc8-9449-8453cafd181d\") "
Mar 09 13:54:04 crc kubenswrapper[4723]: I0309 13:54:04.387436 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/111c3f03-b05b-4dc8-9449-8453cafd181d-kube-api-access-2t8r8" (OuterVolumeSpecName: "kube-api-access-2t8r8") pod "111c3f03-b05b-4dc8-9449-8453cafd181d" (UID: "111c3f03-b05b-4dc8-9449-8453cafd181d"). InnerVolumeSpecName "kube-api-access-2t8r8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:54:04 crc kubenswrapper[4723]: I0309 13:54:04.484774 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t8r8\" (UniqueName: \"kubernetes.io/projected/111c3f03-b05b-4dc8-9449-8453cafd181d-kube-api-access-2t8r8\") on node \"crc\" DevicePath \"\""
Mar 09 13:54:04 crc kubenswrapper[4723]: I0309 13:54:04.837023 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551074-cxwsk" event={"ID":"111c3f03-b05b-4dc8-9449-8453cafd181d","Type":"ContainerDied","Data":"3acf44d9e9cbb637f3a817c87347fc8b690a58fa61758a71d96042dfd2f1756b"}
Mar 09 13:54:04 crc kubenswrapper[4723]: I0309 13:54:04.837080 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3acf44d9e9cbb637f3a817c87347fc8b690a58fa61758a71d96042dfd2f1756b"
Mar 09 13:54:04 crc kubenswrapper[4723]: I0309 13:54:04.837157 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-cxwsk"
Mar 09 13:54:05 crc kubenswrapper[4723]: I0309 13:54:05.318434 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-k78fk"]
Mar 09 13:54:05 crc kubenswrapper[4723]: I0309 13:54:05.332767 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-k78fk"]
Mar 09 13:54:06 crc kubenswrapper[4723]: I0309 13:54:06.901114 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc86c47-59ae-4e46-94c7-138f4cadfd66" path="/var/lib/kubelet/pods/efc86c47-59ae-4e46-94c7-138f4cadfd66/volumes"
Mar 09 13:54:10 crc kubenswrapper[4723]: I0309 13:54:10.880884 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:54:10 crc kubenswrapper[4723]: E0309 13:54:10.881697 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:54:16 crc kubenswrapper[4723]: I0309 13:54:16.743042 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r7s7q"]
Mar 09 13:54:16 crc kubenswrapper[4723]: E0309 13:54:16.744219 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111c3f03-b05b-4dc8-9449-8453cafd181d" containerName="oc"
Mar 09 13:54:16 crc kubenswrapper[4723]: I0309 13:54:16.744237 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="111c3f03-b05b-4dc8-9449-8453cafd181d" containerName="oc"
Mar 09 13:54:16 crc kubenswrapper[4723]: I0309 13:54:16.744570 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="111c3f03-b05b-4dc8-9449-8453cafd181d" containerName="oc"
Mar 09 13:54:16 crc kubenswrapper[4723]: I0309 13:54:16.746936 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:54:16 crc kubenswrapper[4723]: I0309 13:54:16.773153 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r7s7q"]
Mar 09 13:54:16 crc kubenswrapper[4723]: I0309 13:54:16.818529 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-catalog-content\") pod \"redhat-operators-r7s7q\" (UID: \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\") " pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:54:16 crc kubenswrapper[4723]: I0309 13:54:16.818627 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-utilities\") pod \"redhat-operators-r7s7q\" (UID: \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\") " pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:54:16 crc kubenswrapper[4723]: I0309 13:54:16.818734 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zrj8\" (UniqueName: \"kubernetes.io/projected/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-kube-api-access-9zrj8\") pod \"redhat-operators-r7s7q\" (UID: \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\") " pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:54:16 crc kubenswrapper[4723]: I0309 13:54:16.923622 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zrj8\" (UniqueName: \"kubernetes.io/projected/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-kube-api-access-9zrj8\") pod \"redhat-operators-r7s7q\" (UID: \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\") " pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:54:16 crc kubenswrapper[4723]: I0309 13:54:16.923884 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-catalog-content\") pod \"redhat-operators-r7s7q\" (UID: \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\") " pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:54:16 crc kubenswrapper[4723]: I0309 13:54:16.923959 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-utilities\") pod \"redhat-operators-r7s7q\" (UID: \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\") " pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:54:16 crc kubenswrapper[4723]: I0309 13:54:16.924504 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-utilities\") pod \"redhat-operators-r7s7q\" (UID: \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\") " pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:54:16 crc kubenswrapper[4723]: I0309 13:54:16.926377 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-catalog-content\") pod \"redhat-operators-r7s7q\" (UID: \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\") " pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:54:16 crc kubenswrapper[4723]: I0309 13:54:16.960082 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zrj8\" (UniqueName: \"kubernetes.io/projected/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-kube-api-access-9zrj8\") pod \"redhat-operators-r7s7q\" (UID: \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\") " pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:54:17 crc kubenswrapper[4723]: I0309 13:54:17.088637 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:54:17 crc kubenswrapper[4723]: I0309 13:54:17.585946 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r7s7q"]
Mar 09 13:54:18 crc kubenswrapper[4723]: I0309 13:54:18.011658 4723 generic.go:334] "Generic (PLEG): container finished" podID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerID="68f1809cd84feac4bf98c95bdb257485bcc3a940f7b4e542499d47aa694a62cb" exitCode=0
Mar 09 13:54:18 crc kubenswrapper[4723]: I0309 13:54:18.011753 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7s7q" event={"ID":"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4","Type":"ContainerDied","Data":"68f1809cd84feac4bf98c95bdb257485bcc3a940f7b4e542499d47aa694a62cb"}
Mar 09 13:54:18 crc kubenswrapper[4723]: I0309 13:54:18.011992 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7s7q" event={"ID":"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4","Type":"ContainerStarted","Data":"8c4565e7591a35ee17c9866486d5784c9f5d0a6cf5ff4c618211bd1f0b544c23"}
Mar 09 13:54:19 crc kubenswrapper[4723]: I0309 13:54:19.025991 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7s7q" event={"ID":"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4","Type":"ContainerStarted","Data":"73921ff151abbd801faa14c70c57fa8164e19efc7389862640e6b55146e4bea3"}
Mar 09 13:54:25 crc kubenswrapper[4723]: I0309 13:54:25.098781 4723 generic.go:334] "Generic (PLEG): container finished" podID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerID="73921ff151abbd801faa14c70c57fa8164e19efc7389862640e6b55146e4bea3" exitCode=0
Mar 09 13:54:25 crc kubenswrapper[4723]: I0309 13:54:25.098968 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7s7q" event={"ID":"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4","Type":"ContainerDied","Data":"73921ff151abbd801faa14c70c57fa8164e19efc7389862640e6b55146e4bea3"}
Mar 09 13:54:25 crc kubenswrapper[4723]: I0309 13:54:25.693577 4723 scope.go:117] "RemoveContainer" containerID="295e3428eebc7f6b12046c952c84970ed0331a3c09a0a8fc2a3f7f74222ac18d"
Mar 09 13:54:25 crc kubenswrapper[4723]: I0309 13:54:25.880897 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:54:25 crc kubenswrapper[4723]: E0309 13:54:25.881585 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:54:26 crc kubenswrapper[4723]: I0309 13:54:26.113416 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7s7q" event={"ID":"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4","Type":"ContainerStarted","Data":"dd2dc82b1976f5d3bd8a692adcdb99cf66fe39f59f43e806ea62df5125c102f7"}
Mar 09 13:54:26 crc kubenswrapper[4723]: I0309 13:54:26.172997 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r7s7q" podStartSLOduration=2.623505627 podStartE2EDuration="10.172966986s" podCreationTimestamp="2026-03-09 13:54:16 +0000 UTC" firstStartedPulling="2026-03-09 13:54:18.013499234 +0000 UTC m=+3332.027966774" lastFinishedPulling="2026-03-09 13:54:25.562960583 +0000 UTC m=+3339.577428133" observedRunningTime="2026-03-09 13:54:26.133850735 +0000 UTC m=+3340.148318275" watchObservedRunningTime="2026-03-09 13:54:26.172966986 +0000 UTC m=+3340.187434566"
Mar 09 13:54:27 crc kubenswrapper[4723]: I0309 13:54:27.089533 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:54:27 crc kubenswrapper[4723]: I0309 13:54:27.090099 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:54:28 crc kubenswrapper[4723]: I0309 13:54:28.152721 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r7s7q" podUID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerName="registry-server" probeResult="failure" output=<
Mar 09 13:54:28 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s
Mar 09 13:54:28 crc kubenswrapper[4723]: >
Mar 09 13:54:38 crc kubenswrapper[4723]: I0309 13:54:38.137271 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r7s7q" podUID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerName="registry-server" probeResult="failure" output=<
Mar 09 13:54:38 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s
Mar 09 13:54:38 crc kubenswrapper[4723]: >
Mar 09 13:54:38 crc kubenswrapper[4723]: I0309 13:54:38.880973 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:54:38 crc kubenswrapper[4723]: E0309 13:54:38.881373 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:54:48 crc kubenswrapper[4723]: I0309 13:54:48.146911 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r7s7q" podUID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerName="registry-server" probeResult="failure" output=<
Mar 09 13:54:48 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s
Mar 09 13:54:48 crc kubenswrapper[4723]: >
Mar 09 13:54:52 crc kubenswrapper[4723]: I0309 13:54:52.881377 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:54:52 crc kubenswrapper[4723]: E0309 13:54:52.882319 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:54:58 crc kubenswrapper[4723]: I0309 13:54:58.138491 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r7s7q" podUID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerName="registry-server" probeResult="failure" output=<
Mar 09 13:54:58 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s
Mar 09 13:54:58 crc kubenswrapper[4723]: >
Mar 09 13:55:04 crc kubenswrapper[4723]: I0309 13:55:04.882353 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:55:04 crc kubenswrapper[4723]: E0309 13:55:04.883219 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:55:07 crc kubenswrapper[4723]: I0309 13:55:07.155684 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:55:07 crc kubenswrapper[4723]: I0309 13:55:07.215284 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:55:07 crc kubenswrapper[4723]: I0309 13:55:07.402466 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r7s7q"]
Mar 09 13:55:08 crc kubenswrapper[4723]: I0309 13:55:08.567231 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r7s7q" podUID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerName="registry-server" containerID="cri-o://dd2dc82b1976f5d3bd8a692adcdb99cf66fe39f59f43e806ea62df5125c102f7" gracePeriod=2
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.331628 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.417790 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zrj8\" (UniqueName: \"kubernetes.io/projected/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-kube-api-access-9zrj8\") pod \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\" (UID: \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\") "
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.417900 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-utilities\") pod \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\" (UID: \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\") "
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.418246 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-catalog-content\") pod \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\" (UID: \"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4\") "
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.419353 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-utilities" (OuterVolumeSpecName: "utilities") pod "a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" (UID: "a6a78aa6-eeea-4193-a33e-f5a91b6a52e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.438246 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-kube-api-access-9zrj8" (OuterVolumeSpecName: "kube-api-access-9zrj8") pod "a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" (UID: "a6a78aa6-eeea-4193-a33e-f5a91b6a52e4"). InnerVolumeSpecName "kube-api-access-9zrj8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.521772 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.521819 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zrj8\" (UniqueName: \"kubernetes.io/projected/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-kube-api-access-9zrj8\") on node \"crc\" DevicePath \"\""
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.541761 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" (UID: "a6a78aa6-eeea-4193-a33e-f5a91b6a52e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.596416 4723 generic.go:334] "Generic (PLEG): container finished" podID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerID="dd2dc82b1976f5d3bd8a692adcdb99cf66fe39f59f43e806ea62df5125c102f7" exitCode=0
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.596474 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7s7q" event={"ID":"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4","Type":"ContainerDied","Data":"dd2dc82b1976f5d3bd8a692adcdb99cf66fe39f59f43e806ea62df5125c102f7"}
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.596506 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7s7q" event={"ID":"a6a78aa6-eeea-4193-a33e-f5a91b6a52e4","Type":"ContainerDied","Data":"8c4565e7591a35ee17c9866486d5784c9f5d0a6cf5ff4c618211bd1f0b544c23"}
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.596526 4723 scope.go:117] "RemoveContainer" containerID="dd2dc82b1976f5d3bd8a692adcdb99cf66fe39f59f43e806ea62df5125c102f7"
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.596710 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r7s7q"
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.623971 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.633644 4723 scope.go:117] "RemoveContainer" containerID="73921ff151abbd801faa14c70c57fa8164e19efc7389862640e6b55146e4bea3"
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.655017 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r7s7q"]
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.672983 4723 scope.go:117] "RemoveContainer" containerID="68f1809cd84feac4bf98c95bdb257485bcc3a940f7b4e542499d47aa694a62cb"
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.674048 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r7s7q"]
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.734121 4723 scope.go:117] "RemoveContainer" containerID="dd2dc82b1976f5d3bd8a692adcdb99cf66fe39f59f43e806ea62df5125c102f7"
Mar 09 13:55:09 crc kubenswrapper[4723]: E0309 13:55:09.734698 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd2dc82b1976f5d3bd8a692adcdb99cf66fe39f59f43e806ea62df5125c102f7\": container with ID starting with dd2dc82b1976f5d3bd8a692adcdb99cf66fe39f59f43e806ea62df5125c102f7 not found: ID does not exist" containerID="dd2dc82b1976f5d3bd8a692adcdb99cf66fe39f59f43e806ea62df5125c102f7"
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.734761 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd2dc82b1976f5d3bd8a692adcdb99cf66fe39f59f43e806ea62df5125c102f7"} err="failed to get container status \"dd2dc82b1976f5d3bd8a692adcdb99cf66fe39f59f43e806ea62df5125c102f7\": rpc error: code = NotFound desc = could not find container \"dd2dc82b1976f5d3bd8a692adcdb99cf66fe39f59f43e806ea62df5125c102f7\": container with ID starting with dd2dc82b1976f5d3bd8a692adcdb99cf66fe39f59f43e806ea62df5125c102f7 not found: ID does not exist"
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.734801 4723 scope.go:117] "RemoveContainer" containerID="73921ff151abbd801faa14c70c57fa8164e19efc7389862640e6b55146e4bea3"
Mar 09 13:55:09 crc kubenswrapper[4723]: E0309 13:55:09.735232 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73921ff151abbd801faa14c70c57fa8164e19efc7389862640e6b55146e4bea3\": container with ID starting with 73921ff151abbd801faa14c70c57fa8164e19efc7389862640e6b55146e4bea3 not found: ID does not exist" containerID="73921ff151abbd801faa14c70c57fa8164e19efc7389862640e6b55146e4bea3"
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.735270 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73921ff151abbd801faa14c70c57fa8164e19efc7389862640e6b55146e4bea3"} err="failed to get container status \"73921ff151abbd801faa14c70c57fa8164e19efc7389862640e6b55146e4bea3\": rpc error: code = NotFound desc = could not find container \"73921ff151abbd801faa14c70c57fa8164e19efc7389862640e6b55146e4bea3\": container with ID starting with 73921ff151abbd801faa14c70c57fa8164e19efc7389862640e6b55146e4bea3 not found: ID does not exist"
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.735298 4723 scope.go:117] "RemoveContainer" containerID="68f1809cd84feac4bf98c95bdb257485bcc3a940f7b4e542499d47aa694a62cb"
Mar 09 13:55:09 crc kubenswrapper[4723]: E0309 13:55:09.735558 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68f1809cd84feac4bf98c95bdb257485bcc3a940f7b4e542499d47aa694a62cb\": container with ID starting with 68f1809cd84feac4bf98c95bdb257485bcc3a940f7b4e542499d47aa694a62cb not found: ID does not exist" containerID="68f1809cd84feac4bf98c95bdb257485bcc3a940f7b4e542499d47aa694a62cb"
Mar 09 13:55:09 crc kubenswrapper[4723]: I0309 13:55:09.735606 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f1809cd84feac4bf98c95bdb257485bcc3a940f7b4e542499d47aa694a62cb"} err="failed to get container status \"68f1809cd84feac4bf98c95bdb257485bcc3a940f7b4e542499d47aa694a62cb\": rpc error: code = NotFound desc = could not find container \"68f1809cd84feac4bf98c95bdb257485bcc3a940f7b4e542499d47aa694a62cb\": container with ID starting with 68f1809cd84feac4bf98c95bdb257485bcc3a940f7b4e542499d47aa694a62cb not found: ID does not exist"
Mar 09 13:55:10 crc kubenswrapper[4723]: I0309 13:55:10.893071 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" path="/var/lib/kubelet/pods/a6a78aa6-eeea-4193-a33e-f5a91b6a52e4/volumes"
Mar 09 13:55:15 crc kubenswrapper[4723]: I0309 13:55:15.881457 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:55:15 crc kubenswrapper[4723]: E0309 13:55:15.882412 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:55:28 crc kubenswrapper[4723]: I0309 13:55:28.881452 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
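[Annotation] The "RemoveContainer" / "Error syncing pod, skipping ... CrashLoopBackOff: back-off 5m0s" pair for machine-config-daemon-cfjq2 recurs roughly every 10-15 seconds throughout this window: each sync-loop pass re-evaluates the pod, finds the restart back-off window for the failed container still open, and logs the same error without starting anything. Per the upstream Kubernetes documentation, the restart delay starts at 10s and doubles after each crash until it is capped at five minutes, which is the "back-off 5m0s" seen here; the container is finally restarted later in this window (the ContainerStarted record at 13:57:41). The Go sketch below illustrates only that capped-doubling pattern; it is not kubelet's actual implementation, and all identifiers in it are invented for illustration.

    // backoff_sketch.go - illustrative only; not kubelet source code.
    // Models the capped exponential restart back-off behind the repeated
    // "CrashLoopBackOff: back-off 5m0s" records in this log.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const (
            initialDelay = 10 * time.Second // documented initial restart delay
            maxDelay     = 5 * time.Minute  // the "back-off 5m0s" cap in the log
        )
        delay := initialDelay
        for crash := 1; crash <= 8; crash++ {
            fmt.Printf("crash %d: next restart attempt in %v\n", crash, delay)
            delay *= 2 // delay doubles after each crash...
            if delay > maxDelay {
                delay = maxDelay // ...until every retry waits the full 5m0s
            }
        }
    }

After a handful of crashes the delay saturates at 5m0s, so while any sync-loop pass lands inside that window only the back-off error is logged; once the window expires the container is started again, as the 13:57:41 ContainerStarted record shows.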
Mar 09 13:55:28 crc kubenswrapper[4723]: E0309 13:55:28.882764 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:55:39 crc kubenswrapper[4723]: I0309 13:55:39.881027 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:55:39 crc kubenswrapper[4723]: E0309 13:55:39.882839 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:55:50 crc kubenswrapper[4723]: I0309 13:55:50.881982 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:55:50 crc kubenswrapper[4723]: E0309 13:55:50.882914 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:56:00 crc kubenswrapper[4723]: I0309 13:56:00.156179 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551076-2q99b"]
Mar 09 13:56:00 crc kubenswrapper[4723]: E0309 13:56:00.157498 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerName="extract-utilities"
Mar 09 13:56:00 crc kubenswrapper[4723]: I0309 13:56:00.157514 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerName="extract-utilities"
Mar 09 13:56:00 crc kubenswrapper[4723]: E0309 13:56:00.157524 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerName="registry-server"
Mar 09 13:56:00 crc kubenswrapper[4723]: I0309 13:56:00.157530 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerName="registry-server"
Mar 09 13:56:00 crc kubenswrapper[4723]: E0309 13:56:00.157544 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerName="extract-content"
Mar 09 13:56:00 crc kubenswrapper[4723]: I0309 13:56:00.157550 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerName="extract-content"
Mar 09 13:56:00 crc kubenswrapper[4723]: I0309 13:56:00.157820 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6a78aa6-eeea-4193-a33e-f5a91b6a52e4" containerName="registry-server"
Mar 09 13:56:00 crc kubenswrapper[4723]: I0309 13:56:00.158630 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-2q99b"
Mar 09 13:56:00 crc kubenswrapper[4723]: I0309 13:56:00.162203 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:56:00 crc kubenswrapper[4723]: I0309 13:56:00.162548 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 13:56:00 crc kubenswrapper[4723]: I0309 13:56:00.162709 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:56:00 crc kubenswrapper[4723]: I0309 13:56:00.173367 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551076-2q99b"]
Mar 09 13:56:00 crc kubenswrapper[4723]: I0309 13:56:00.222262 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68zf6\" (UniqueName: \"kubernetes.io/projected/b1d59bae-bf1e-43a5-9151-81096f514920-kube-api-access-68zf6\") pod \"auto-csr-approver-29551076-2q99b\" (UID: \"b1d59bae-bf1e-43a5-9151-81096f514920\") " pod="openshift-infra/auto-csr-approver-29551076-2q99b"
Mar 09 13:56:00 crc kubenswrapper[4723]: I0309 13:56:00.324769 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68zf6\" (UniqueName: \"kubernetes.io/projected/b1d59bae-bf1e-43a5-9151-81096f514920-kube-api-access-68zf6\") pod \"auto-csr-approver-29551076-2q99b\" (UID: \"b1d59bae-bf1e-43a5-9151-81096f514920\") " pod="openshift-infra/auto-csr-approver-29551076-2q99b"
Mar 09 13:56:00 crc kubenswrapper[4723]: I0309 13:56:00.348257 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68zf6\" (UniqueName: \"kubernetes.io/projected/b1d59bae-bf1e-43a5-9151-81096f514920-kube-api-access-68zf6\") pod \"auto-csr-approver-29551076-2q99b\" (UID: \"b1d59bae-bf1e-43a5-9151-81096f514920\") " pod="openshift-infra/auto-csr-approver-29551076-2q99b"
Mar 09 13:56:00 crc kubenswrapper[4723]: I0309 13:56:00.478718 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-2q99b"
Mar 09 13:56:01 crc kubenswrapper[4723]: I0309 13:56:01.015467 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551076-2q99b"]
Mar 09 13:56:01 crc kubenswrapper[4723]: I0309 13:56:01.216254 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551076-2q99b" event={"ID":"b1d59bae-bf1e-43a5-9151-81096f514920","Type":"ContainerStarted","Data":"9ff23b53bca03c1b5c7ff5161991fd72b648c49fb6bbe5786324c77f02362ff1"}
Mar 09 13:56:03 crc kubenswrapper[4723]: I0309 13:56:03.241800 4723 generic.go:334] "Generic (PLEG): container finished" podID="b1d59bae-bf1e-43a5-9151-81096f514920" containerID="b73ef56ce5c41209c125422c96370bf6e61e90cc60a0e05679f6f031934142f6" exitCode=0
Mar 09 13:56:03 crc kubenswrapper[4723]: I0309 13:56:03.242429 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551076-2q99b" event={"ID":"b1d59bae-bf1e-43a5-9151-81096f514920","Type":"ContainerDied","Data":"b73ef56ce5c41209c125422c96370bf6e61e90cc60a0e05679f6f031934142f6"}
Mar 09 13:56:03 crc kubenswrapper[4723]: I0309 13:56:03.881548 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:56:03 crc kubenswrapper[4723]: E0309 13:56:03.881926 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:56:04 crc kubenswrapper[4723]: I0309 13:56:04.677128 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-2q99b"
Mar 09 13:56:04 crc kubenswrapper[4723]: I0309 13:56:04.774475 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68zf6\" (UniqueName: \"kubernetes.io/projected/b1d59bae-bf1e-43a5-9151-81096f514920-kube-api-access-68zf6\") pod \"b1d59bae-bf1e-43a5-9151-81096f514920\" (UID: \"b1d59bae-bf1e-43a5-9151-81096f514920\") "
Mar 09 13:56:04 crc kubenswrapper[4723]: I0309 13:56:04.780656 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d59bae-bf1e-43a5-9151-81096f514920-kube-api-access-68zf6" (OuterVolumeSpecName: "kube-api-access-68zf6") pod "b1d59bae-bf1e-43a5-9151-81096f514920" (UID: "b1d59bae-bf1e-43a5-9151-81096f514920"). InnerVolumeSpecName "kube-api-access-68zf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:56:04 crc kubenswrapper[4723]: I0309 13:56:04.877766 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68zf6\" (UniqueName: \"kubernetes.io/projected/b1d59bae-bf1e-43a5-9151-81096f514920-kube-api-access-68zf6\") on node \"crc\" DevicePath \"\""
Mar 09 13:56:05 crc kubenswrapper[4723]: I0309 13:56:05.275476 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551076-2q99b" event={"ID":"b1d59bae-bf1e-43a5-9151-81096f514920","Type":"ContainerDied","Data":"9ff23b53bca03c1b5c7ff5161991fd72b648c49fb6bbe5786324c77f02362ff1"}
Mar 09 13:56:05 crc kubenswrapper[4723]: I0309 13:56:05.275516 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ff23b53bca03c1b5c7ff5161991fd72b648c49fb6bbe5786324c77f02362ff1"
Mar 09 13:56:05 crc kubenswrapper[4723]: I0309 13:56:05.275553 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-2q99b"
Mar 09 13:56:05 crc kubenswrapper[4723]: I0309 13:56:05.753391 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-zdjk2"]
Mar 09 13:56:05 crc kubenswrapper[4723]: I0309 13:56:05.767922 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-zdjk2"]
Mar 09 13:56:06 crc kubenswrapper[4723]: I0309 13:56:06.897434 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3183843f-294d-45f6-b7e7-2e931faa3035" path="/var/lib/kubelet/pods/3183843f-294d-45f6-b7e7-2e931faa3035/volumes"
Mar 09 13:56:15 crc kubenswrapper[4723]: I0309 13:56:15.881113 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:56:15 crc kubenswrapper[4723]: E0309 13:56:15.881915 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:56:25 crc kubenswrapper[4723]: I0309 13:56:25.855714 4723 scope.go:117] "RemoveContainer" containerID="788ad5f2279320daafd418e59ceb3bda13bbbaf8b8f5133152c484cc05727c19"
Mar 09 13:56:30 crc kubenswrapper[4723]: I0309 13:56:30.883556 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:56:30 crc kubenswrapper[4723]: E0309 13:56:30.884388 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:56:45 crc kubenswrapper[4723]: I0309 13:56:45.881300 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:56:45 crc kubenswrapper[4723]: E0309 13:56:45.882174 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:56:59 crc kubenswrapper[4723]: I0309 13:56:59.882583 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:56:59 crc kubenswrapper[4723]: E0309 13:56:59.884429 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:57:14 crc kubenswrapper[4723]: I0309 13:57:14.881188 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:57:14 crc kubenswrapper[4723]: E0309 13:57:14.882121 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:57:27 crc kubenswrapper[4723]: I0309 13:57:27.881924 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:57:27 crc kubenswrapper[4723]: E0309 13:57:27.882826 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.017509 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p58dv"]
Mar 09 13:57:30 crc kubenswrapper[4723]: E0309 13:57:30.018576 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d59bae-bf1e-43a5-9151-81096f514920" containerName="oc"
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.018592 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d59bae-bf1e-43a5-9151-81096f514920" containerName="oc"
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.018941 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d59bae-bf1e-43a5-9151-81096f514920" containerName="oc"
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.021032 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.041551 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p58dv"]
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.136148 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f157d49d-b2d1-4579-aa73-41508a9c6ede-catalog-content\") pod \"community-operators-p58dv\" (UID: \"f157d49d-b2d1-4579-aa73-41508a9c6ede\") " pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.136229 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8pfx\" (UniqueName: \"kubernetes.io/projected/f157d49d-b2d1-4579-aa73-41508a9c6ede-kube-api-access-c8pfx\") pod \"community-operators-p58dv\" (UID: \"f157d49d-b2d1-4579-aa73-41508a9c6ede\") " pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.136293 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f157d49d-b2d1-4579-aa73-41508a9c6ede-utilities\") pod \"community-operators-p58dv\" (UID: \"f157d49d-b2d1-4579-aa73-41508a9c6ede\") " pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.238969 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8pfx\" (UniqueName: \"kubernetes.io/projected/f157d49d-b2d1-4579-aa73-41508a9c6ede-kube-api-access-c8pfx\") pod \"community-operators-p58dv\" (UID: \"f157d49d-b2d1-4579-aa73-41508a9c6ede\") " pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.239115 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f157d49d-b2d1-4579-aa73-41508a9c6ede-utilities\") pod \"community-operators-p58dv\" (UID: \"f157d49d-b2d1-4579-aa73-41508a9c6ede\") " pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.239643 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f157d49d-b2d1-4579-aa73-41508a9c6ede-utilities\") pod \"community-operators-p58dv\" (UID: \"f157d49d-b2d1-4579-aa73-41508a9c6ede\") " pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.240034 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f157d49d-b2d1-4579-aa73-41508a9c6ede-catalog-content\") pod \"community-operators-p58dv\" (UID: \"f157d49d-b2d1-4579-aa73-41508a9c6ede\") " pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.240344 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f157d49d-b2d1-4579-aa73-41508a9c6ede-catalog-content\") pod \"community-operators-p58dv\" (UID: \"f157d49d-b2d1-4579-aa73-41508a9c6ede\") " pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.262532 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8pfx\" (UniqueName: \"kubernetes.io/projected/f157d49d-b2d1-4579-aa73-41508a9c6ede-kube-api-access-c8pfx\") pod \"community-operators-p58dv\" (UID: \"f157d49d-b2d1-4579-aa73-41508a9c6ede\") " pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:30 crc kubenswrapper[4723]: I0309 13:57:30.357860 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:31 crc kubenswrapper[4723]: I0309 13:57:31.016975 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p58dv"]
Mar 09 13:57:31 crc kubenswrapper[4723]: I0309 13:57:31.244343 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p58dv" event={"ID":"f157d49d-b2d1-4579-aa73-41508a9c6ede","Type":"ContainerStarted","Data":"56c5cac8352e30c44081e232987f0193b11df360b8f05e4969acfdc420f858af"}
Mar 09 13:57:32 crc kubenswrapper[4723]: I0309 13:57:32.258495 4723 generic.go:334] "Generic (PLEG): container finished" podID="f157d49d-b2d1-4579-aa73-41508a9c6ede" containerID="39aab85badaf0ac977a1f4b37c5bd275698bf4c0ce940f5f859964e7e8b1073a" exitCode=0
Mar 09 13:57:32 crc kubenswrapper[4723]: I0309 13:57:32.258829 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p58dv" event={"ID":"f157d49d-b2d1-4579-aa73-41508a9c6ede","Type":"ContainerDied","Data":"39aab85badaf0ac977a1f4b37c5bd275698bf4c0ce940f5f859964e7e8b1073a"}
Mar 09 13:57:32 crc kubenswrapper[4723]: I0309 13:57:32.267678 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 13:57:33 crc kubenswrapper[4723]: I0309 13:57:33.270699 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p58dv" event={"ID":"f157d49d-b2d1-4579-aa73-41508a9c6ede","Type":"ContainerStarted","Data":"8be024f78ff4c5064d1625b732d3b26cf03adfe30f004423f0b0792745ba70fa"}
Mar 09 13:57:34 crc kubenswrapper[4723]: I0309 13:57:34.282966 4723 generic.go:334] "Generic (PLEG): container finished" podID="f157d49d-b2d1-4579-aa73-41508a9c6ede" containerID="8be024f78ff4c5064d1625b732d3b26cf03adfe30f004423f0b0792745ba70fa" exitCode=0
Mar 09 13:57:34 crc kubenswrapper[4723]: I0309 13:57:34.283057 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p58dv" event={"ID":"f157d49d-b2d1-4579-aa73-41508a9c6ede","Type":"ContainerDied","Data":"8be024f78ff4c5064d1625b732d3b26cf03adfe30f004423f0b0792745ba70fa"}
Mar 09 13:57:35 crc kubenswrapper[4723]: I0309 13:57:35.295844 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p58dv" event={"ID":"f157d49d-b2d1-4579-aa73-41508a9c6ede","Type":"ContainerStarted","Data":"00cce0e906f5197ef752c33f2bb1fec1c922c503e2a7aeaacfdf262073e8c5c4"}
Mar 09 13:57:35 crc kubenswrapper[4723]: I0309 13:57:35.332371 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p58dv" podStartSLOduration=3.862198474 podStartE2EDuration="6.332349298s" podCreationTimestamp="2026-03-09 13:57:29 +0000 UTC" firstStartedPulling="2026-03-09 13:57:32.265223421 +0000 UTC m=+3526.279690981" lastFinishedPulling="2026-03-09 13:57:34.735374265 +0000 UTC m=+3528.749841805" observedRunningTime="2026-03-09 13:57:35.317839312 +0000 UTC m=+3529.332306872" watchObservedRunningTime="2026-03-09 13:57:35.332349298 +0000 UTC m=+3529.346816848"
Mar 09 13:57:39 crc kubenswrapper[4723]: I0309 13:57:39.163240 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qqvzp"]
Mar 09 13:57:39 crc kubenswrapper[4723]: I0309 13:57:39.166589 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qqvzp"
Mar 09 13:57:39 crc kubenswrapper[4723]: I0309 13:57:39.180242 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qqvzp"]
Mar 09 13:57:39 crc kubenswrapper[4723]: I0309 13:57:39.356688 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c488b08c-e0d5-4112-9521-afbfa19affb6-catalog-content\") pod \"certified-operators-qqvzp\" (UID: \"c488b08c-e0d5-4112-9521-afbfa19affb6\") " pod="openshift-marketplace/certified-operators-qqvzp"
Mar 09 13:57:39 crc kubenswrapper[4723]: I0309 13:57:39.356801 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c488b08c-e0d5-4112-9521-afbfa19affb6-utilities\") pod \"certified-operators-qqvzp\" (UID: \"c488b08c-e0d5-4112-9521-afbfa19affb6\") " pod="openshift-marketplace/certified-operators-qqvzp"
Mar 09 13:57:39 crc kubenswrapper[4723]: I0309 13:57:39.357012 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6ttp\" (UniqueName: \"kubernetes.io/projected/c488b08c-e0d5-4112-9521-afbfa19affb6-kube-api-access-x6ttp\") pod \"certified-operators-qqvzp\" (UID: \"c488b08c-e0d5-4112-9521-afbfa19affb6\") " pod="openshift-marketplace/certified-operators-qqvzp"
Mar 09 13:57:39 crc kubenswrapper[4723]: I0309 13:57:39.459051 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c488b08c-e0d5-4112-9521-afbfa19affb6-catalog-content\") pod \"certified-operators-qqvzp\" (UID: \"c488b08c-e0d5-4112-9521-afbfa19affb6\") " pod="openshift-marketplace/certified-operators-qqvzp"
Mar 09 13:57:39 crc kubenswrapper[4723]: I0309 13:57:39.459130 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c488b08c-e0d5-4112-9521-afbfa19affb6-utilities\") pod \"certified-operators-qqvzp\" (UID: \"c488b08c-e0d5-4112-9521-afbfa19affb6\") " pod="openshift-marketplace/certified-operators-qqvzp"
Mar 09 13:57:39 crc kubenswrapper[4723]: I0309 13:57:39.459256 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6ttp\" (UniqueName: \"kubernetes.io/projected/c488b08c-e0d5-4112-9521-afbfa19affb6-kube-api-access-x6ttp\") pod \"certified-operators-qqvzp\" (UID: \"c488b08c-e0d5-4112-9521-afbfa19affb6\") " pod="openshift-marketplace/certified-operators-qqvzp"
Mar 09 13:57:39 crc kubenswrapper[4723]: I0309 13:57:39.459533 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c488b08c-e0d5-4112-9521-afbfa19affb6-catalog-content\") pod \"certified-operators-qqvzp\" (UID: \"c488b08c-e0d5-4112-9521-afbfa19affb6\") " pod="openshift-marketplace/certified-operators-qqvzp"
Mar 09 13:57:39 crc kubenswrapper[4723]: I0309 13:57:39.459665 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c488b08c-e0d5-4112-9521-afbfa19affb6-utilities\") pod \"certified-operators-qqvzp\" (UID: \"c488b08c-e0d5-4112-9521-afbfa19affb6\") " pod="openshift-marketplace/certified-operators-qqvzp"
Mar 09 13:57:39 crc kubenswrapper[4723]: I0309 13:57:39.480621 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6ttp\" (UniqueName: \"kubernetes.io/projected/c488b08c-e0d5-4112-9521-afbfa19affb6-kube-api-access-x6ttp\") pod \"certified-operators-qqvzp\" (UID: \"c488b08c-e0d5-4112-9521-afbfa19affb6\") " pod="openshift-marketplace/certified-operators-qqvzp"
Mar 09 13:57:39 crc kubenswrapper[4723]: I0309 13:57:39.493571 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qqvzp"
Mar 09 13:57:40 crc kubenswrapper[4723]: I0309 13:57:40.063788 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qqvzp"]
Mar 09 13:57:40 crc kubenswrapper[4723]: I0309 13:57:40.345063 4723 generic.go:334] "Generic (PLEG): container finished" podID="c488b08c-e0d5-4112-9521-afbfa19affb6" containerID="ac628d4e25d5d2065fe703b7e3113e74aa46c9e22bc9ceefb5ce268b521c5d83" exitCode=0
Mar 09 13:57:40 crc kubenswrapper[4723]: I0309 13:57:40.345235 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqvzp" event={"ID":"c488b08c-e0d5-4112-9521-afbfa19affb6","Type":"ContainerDied","Data":"ac628d4e25d5d2065fe703b7e3113e74aa46c9e22bc9ceefb5ce268b521c5d83"}
Mar 09 13:57:40 crc kubenswrapper[4723]: I0309 13:57:40.345389 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqvzp" event={"ID":"c488b08c-e0d5-4112-9521-afbfa19affb6","Type":"ContainerStarted","Data":"196e090a5a3b633141a4322b487ea363da2796a493feef795a995c7d44c5e278"}
Mar 09 13:57:40 crc kubenswrapper[4723]: I0309 13:57:40.359082 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:40 crc kubenswrapper[4723]: I0309 13:57:40.359212 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:40 crc kubenswrapper[4723]: I0309 13:57:40.428810 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:40 crc kubenswrapper[4723]: I0309 13:57:40.881046 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 13:57:41 crc kubenswrapper[4723]: I0309 13:57:41.358495 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqvzp" event={"ID":"c488b08c-e0d5-4112-9521-afbfa19affb6","Type":"ContainerStarted","Data":"0ae63eb4d966c492861a36a9bcaa2db998ae172cd7152c648ab4a5f5be35dbbb"}
Mar 09 13:57:41 crc kubenswrapper[4723]: I0309 13:57:41.361787 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"b9917d1efbbbf05f4cd1ae94e4d9bdfa5b2f0c6eaedeaba13758e7498a18fdc2"}
Mar 09 13:57:41 crc kubenswrapper[4723]: I0309 13:57:41.418129 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:43 crc kubenswrapper[4723]: I0309 13:57:43.754972 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p58dv"]
Mar 09 13:57:43 crc kubenswrapper[4723]: I0309 13:57:43.755635 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p58dv" podUID="f157d49d-b2d1-4579-aa73-41508a9c6ede" containerName="registry-server" containerID="cri-o://00cce0e906f5197ef752c33f2bb1fec1c922c503e2a7aeaacfdf262073e8c5c4" gracePeriod=2
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.309708 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.395176 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f157d49d-b2d1-4579-aa73-41508a9c6ede-catalog-content\") pod \"f157d49d-b2d1-4579-aa73-41508a9c6ede\" (UID: \"f157d49d-b2d1-4579-aa73-41508a9c6ede\") "
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.395406 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f157d49d-b2d1-4579-aa73-41508a9c6ede-utilities\") pod \"f157d49d-b2d1-4579-aa73-41508a9c6ede\" (UID: \"f157d49d-b2d1-4579-aa73-41508a9c6ede\") "
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.395454 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8pfx\" (UniqueName: \"kubernetes.io/projected/f157d49d-b2d1-4579-aa73-41508a9c6ede-kube-api-access-c8pfx\") pod \"f157d49d-b2d1-4579-aa73-41508a9c6ede\" (UID: \"f157d49d-b2d1-4579-aa73-41508a9c6ede\") "
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.395757 4723 generic.go:334] "Generic (PLEG): container finished" podID="f157d49d-b2d1-4579-aa73-41508a9c6ede" containerID="00cce0e906f5197ef752c33f2bb1fec1c922c503e2a7aeaacfdf262073e8c5c4" exitCode=0
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.395953 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p58dv" event={"ID":"f157d49d-b2d1-4579-aa73-41508a9c6ede","Type":"ContainerDied","Data":"00cce0e906f5197ef752c33f2bb1fec1c922c503e2a7aeaacfdf262073e8c5c4"}
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.396008 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p58dv" event={"ID":"f157d49d-b2d1-4579-aa73-41508a9c6ede","Type":"ContainerDied","Data":"56c5cac8352e30c44081e232987f0193b11df360b8f05e4969acfdc420f858af"}
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.396007 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p58dv"
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.396022 4723 scope.go:117] "RemoveContainer" containerID="00cce0e906f5197ef752c33f2bb1fec1c922c503e2a7aeaacfdf262073e8c5c4"
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.397643 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f157d49d-b2d1-4579-aa73-41508a9c6ede-utilities" (OuterVolumeSpecName: "utilities") pod "f157d49d-b2d1-4579-aa73-41508a9c6ede" (UID: "f157d49d-b2d1-4579-aa73-41508a9c6ede"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.402083 4723 generic.go:334] "Generic (PLEG): container finished" podID="c488b08c-e0d5-4112-9521-afbfa19affb6" containerID="0ae63eb4d966c492861a36a9bcaa2db998ae172cd7152c648ab4a5f5be35dbbb" exitCode=0
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.402123 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqvzp" event={"ID":"c488b08c-e0d5-4112-9521-afbfa19affb6","Type":"ContainerDied","Data":"0ae63eb4d966c492861a36a9bcaa2db998ae172cd7152c648ab4a5f5be35dbbb"}
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.412000 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f157d49d-b2d1-4579-aa73-41508a9c6ede-kube-api-access-c8pfx" (OuterVolumeSpecName: "kube-api-access-c8pfx") pod "f157d49d-b2d1-4579-aa73-41508a9c6ede" (UID: "f157d49d-b2d1-4579-aa73-41508a9c6ede"). InnerVolumeSpecName "kube-api-access-c8pfx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.482745 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f157d49d-b2d1-4579-aa73-41508a9c6ede-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f157d49d-b2d1-4579-aa73-41508a9c6ede" (UID: "f157d49d-b2d1-4579-aa73-41508a9c6ede"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.493420 4723 scope.go:117] "RemoveContainer" containerID="8be024f78ff4c5064d1625b732d3b26cf03adfe30f004423f0b0792745ba70fa"
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.500597 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f157d49d-b2d1-4579-aa73-41508a9c6ede-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.500629 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8pfx\" (UniqueName: \"kubernetes.io/projected/f157d49d-b2d1-4579-aa73-41508a9c6ede-kube-api-access-c8pfx\") on node \"crc\" DevicePath \"\""
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.500638 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f157d49d-b2d1-4579-aa73-41508a9c6ede-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.519043 4723 scope.go:117] "RemoveContainer" containerID="39aab85badaf0ac977a1f4b37c5bd275698bf4c0ce940f5f859964e7e8b1073a"
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.572336 4723 scope.go:117] "RemoveContainer" containerID="00cce0e906f5197ef752c33f2bb1fec1c922c503e2a7aeaacfdf262073e8c5c4"
Mar 09 13:57:44 crc kubenswrapper[4723]: E0309 13:57:44.572780 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00cce0e906f5197ef752c33f2bb1fec1c922c503e2a7aeaacfdf262073e8c5c4\": container with ID starting with 00cce0e906f5197ef752c33f2bb1fec1c922c503e2a7aeaacfdf262073e8c5c4 not found: ID does not exist" containerID="00cce0e906f5197ef752c33f2bb1fec1c922c503e2a7aeaacfdf262073e8c5c4"
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.572810 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00cce0e906f5197ef752c33f2bb1fec1c922c503e2a7aeaacfdf262073e8c5c4"} err="failed to get container status \"00cce0e906f5197ef752c33f2bb1fec1c922c503e2a7aeaacfdf262073e8c5c4\": rpc error: code = NotFound desc = could not find container \"00cce0e906f5197ef752c33f2bb1fec1c922c503e2a7aeaacfdf262073e8c5c4\": container with ID starting with 00cce0e906f5197ef752c33f2bb1fec1c922c503e2a7aeaacfdf262073e8c5c4 not found: ID does not exist"
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.572830 4723 scope.go:117] "RemoveContainer" containerID="8be024f78ff4c5064d1625b732d3b26cf03adfe30f004423f0b0792745ba70fa"
Mar 09 13:57:44 crc kubenswrapper[4723]: E0309 13:57:44.573272 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be024f78ff4c5064d1625b732d3b26cf03adfe30f004423f0b0792745ba70fa\": container with ID starting with 8be024f78ff4c5064d1625b732d3b26cf03adfe30f004423f0b0792745ba70fa not found: ID does not exist" containerID="8be024f78ff4c5064d1625b732d3b26cf03adfe30f004423f0b0792745ba70fa"
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.573300 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be024f78ff4c5064d1625b732d3b26cf03adfe30f004423f0b0792745ba70fa"} err="failed to get container status \"8be024f78ff4c5064d1625b732d3b26cf03adfe30f004423f0b0792745ba70fa\": rpc error: code = NotFound desc = could not find container \"8be024f78ff4c5064d1625b732d3b26cf03adfe30f004423f0b0792745ba70fa\": container with ID starting with 8be024f78ff4c5064d1625b732d3b26cf03adfe30f004423f0b0792745ba70fa not found: ID does not exist"
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.573316 4723 scope.go:117] "RemoveContainer" containerID="39aab85badaf0ac977a1f4b37c5bd275698bf4c0ce940f5f859964e7e8b1073a"
Mar 09 13:57:44 crc kubenswrapper[4723]: E0309 13:57:44.573595 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39aab85badaf0ac977a1f4b37c5bd275698bf4c0ce940f5f859964e7e8b1073a\": container with ID starting with 39aab85badaf0ac977a1f4b37c5bd275698bf4c0ce940f5f859964e7e8b1073a not found: ID does not exist" containerID="39aab85badaf0ac977a1f4b37c5bd275698bf4c0ce940f5f859964e7e8b1073a"
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.573633 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39aab85badaf0ac977a1f4b37c5bd275698bf4c0ce940f5f859964e7e8b1073a"} err="failed to get container status \"39aab85badaf0ac977a1f4b37c5bd275698bf4c0ce940f5f859964e7e8b1073a\": rpc error: code = NotFound desc = could not find container \"39aab85badaf0ac977a1f4b37c5bd275698bf4c0ce940f5f859964e7e8b1073a\": container with ID starting with 39aab85badaf0ac977a1f4b37c5bd275698bf4c0ce940f5f859964e7e8b1073a not found: ID does not exist"
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.740826 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p58dv"]
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.758756 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p58dv"]
Mar 09 13:57:44 crc kubenswrapper[4723]: I0309 13:57:44.894234 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f157d49d-b2d1-4579-aa73-41508a9c6ede" path="/var/lib/kubelet/pods/f157d49d-b2d1-4579-aa73-41508a9c6ede/volumes"
Mar 09 13:57:45 crc
kubenswrapper[4723]: I0309 13:57:45.416336 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqvzp" event={"ID":"c488b08c-e0d5-4112-9521-afbfa19affb6","Type":"ContainerStarted","Data":"c637ee3e25a5382010ee486b995a81099c97f9096955af328ffa2ed672eef487"} Mar 09 13:57:45 crc kubenswrapper[4723]: I0309 13:57:45.468538 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qqvzp" podStartSLOduration=1.958833584 podStartE2EDuration="6.468522207s" podCreationTimestamp="2026-03-09 13:57:39 +0000 UTC" firstStartedPulling="2026-03-09 13:57:40.34681947 +0000 UTC m=+3534.361287010" lastFinishedPulling="2026-03-09 13:57:44.856508093 +0000 UTC m=+3538.870975633" observedRunningTime="2026-03-09 13:57:45.46336575 +0000 UTC m=+3539.477833290" watchObservedRunningTime="2026-03-09 13:57:45.468522207 +0000 UTC m=+3539.482989747" Mar 09 13:57:49 crc kubenswrapper[4723]: I0309 13:57:49.494053 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qqvzp" Mar 09 13:57:49 crc kubenswrapper[4723]: I0309 13:57:49.494658 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qqvzp" Mar 09 13:57:49 crc kubenswrapper[4723]: I0309 13:57:49.583665 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qqvzp" Mar 09 13:57:50 crc kubenswrapper[4723]: I0309 13:57:50.529801 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qqvzp" Mar 09 13:57:50 crc kubenswrapper[4723]: I0309 13:57:50.755458 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qqvzp"] Mar 09 13:57:52 crc kubenswrapper[4723]: I0309 13:57:52.485981 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qqvzp" podUID="c488b08c-e0d5-4112-9521-afbfa19affb6" containerName="registry-server" containerID="cri-o://c637ee3e25a5382010ee486b995a81099c97f9096955af328ffa2ed672eef487" gracePeriod=2 Mar 09 13:57:52 crc kubenswrapper[4723]: I0309 13:57:52.972464 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qqvzp" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.099325 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6ttp\" (UniqueName: \"kubernetes.io/projected/c488b08c-e0d5-4112-9521-afbfa19affb6-kube-api-access-x6ttp\") pod \"c488b08c-e0d5-4112-9521-afbfa19affb6\" (UID: \"c488b08c-e0d5-4112-9521-afbfa19affb6\") " Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.099583 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c488b08c-e0d5-4112-9521-afbfa19affb6-catalog-content\") pod \"c488b08c-e0d5-4112-9521-afbfa19affb6\" (UID: \"c488b08c-e0d5-4112-9521-afbfa19affb6\") " Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.099755 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c488b08c-e0d5-4112-9521-afbfa19affb6-utilities\") pod \"c488b08c-e0d5-4112-9521-afbfa19affb6\" (UID: \"c488b08c-e0d5-4112-9521-afbfa19affb6\") " Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.100546 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c488b08c-e0d5-4112-9521-afbfa19affb6-utilities" (OuterVolumeSpecName: "utilities") pod "c488b08c-e0d5-4112-9521-afbfa19affb6" (UID: "c488b08c-e0d5-4112-9521-afbfa19affb6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.101349 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c488b08c-e0d5-4112-9521-afbfa19affb6-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.104906 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c488b08c-e0d5-4112-9521-afbfa19affb6-kube-api-access-x6ttp" (OuterVolumeSpecName: "kube-api-access-x6ttp") pod "c488b08c-e0d5-4112-9521-afbfa19affb6" (UID: "c488b08c-e0d5-4112-9521-afbfa19affb6"). InnerVolumeSpecName "kube-api-access-x6ttp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.157560 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c488b08c-e0d5-4112-9521-afbfa19affb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c488b08c-e0d5-4112-9521-afbfa19affb6" (UID: "c488b08c-e0d5-4112-9521-afbfa19affb6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.204035 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6ttp\" (UniqueName: \"kubernetes.io/projected/c488b08c-e0d5-4112-9521-afbfa19affb6-kube-api-access-x6ttp\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.204070 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c488b08c-e0d5-4112-9521-afbfa19affb6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.502024 4723 generic.go:334] "Generic (PLEG): container finished" podID="c488b08c-e0d5-4112-9521-afbfa19affb6" containerID="c637ee3e25a5382010ee486b995a81099c97f9096955af328ffa2ed672eef487" exitCode=0 Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.502084 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqvzp" event={"ID":"c488b08c-e0d5-4112-9521-afbfa19affb6","Type":"ContainerDied","Data":"c637ee3e25a5382010ee486b995a81099c97f9096955af328ffa2ed672eef487"} Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.502106 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qqvzp" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.502133 4723 scope.go:117] "RemoveContainer" containerID="c637ee3e25a5382010ee486b995a81099c97f9096955af328ffa2ed672eef487" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.502119 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qqvzp" event={"ID":"c488b08c-e0d5-4112-9521-afbfa19affb6","Type":"ContainerDied","Data":"196e090a5a3b633141a4322b487ea363da2796a493feef795a995c7d44c5e278"} Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.548408 4723 scope.go:117] "RemoveContainer" containerID="0ae63eb4d966c492861a36a9bcaa2db998ae172cd7152c648ab4a5f5be35dbbb" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.561241 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qqvzp"] Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.577949 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qqvzp"] Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.605553 4723 scope.go:117] "RemoveContainer" containerID="ac628d4e25d5d2065fe703b7e3113e74aa46c9e22bc9ceefb5ce268b521c5d83" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.650190 4723 scope.go:117] "RemoveContainer" containerID="c637ee3e25a5382010ee486b995a81099c97f9096955af328ffa2ed672eef487" Mar 09 13:57:53 crc kubenswrapper[4723]: E0309 13:57:53.650619 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c637ee3e25a5382010ee486b995a81099c97f9096955af328ffa2ed672eef487\": container with ID starting with c637ee3e25a5382010ee486b995a81099c97f9096955af328ffa2ed672eef487 not found: ID does not exist" containerID="c637ee3e25a5382010ee486b995a81099c97f9096955af328ffa2ed672eef487" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.650645 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c637ee3e25a5382010ee486b995a81099c97f9096955af328ffa2ed672eef487"} err="failed to get container status 
\"c637ee3e25a5382010ee486b995a81099c97f9096955af328ffa2ed672eef487\": rpc error: code = NotFound desc = could not find container \"c637ee3e25a5382010ee486b995a81099c97f9096955af328ffa2ed672eef487\": container with ID starting with c637ee3e25a5382010ee486b995a81099c97f9096955af328ffa2ed672eef487 not found: ID does not exist" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.650664 4723 scope.go:117] "RemoveContainer" containerID="0ae63eb4d966c492861a36a9bcaa2db998ae172cd7152c648ab4a5f5be35dbbb" Mar 09 13:57:53 crc kubenswrapper[4723]: E0309 13:57:53.651121 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae63eb4d966c492861a36a9bcaa2db998ae172cd7152c648ab4a5f5be35dbbb\": container with ID starting with 0ae63eb4d966c492861a36a9bcaa2db998ae172cd7152c648ab4a5f5be35dbbb not found: ID does not exist" containerID="0ae63eb4d966c492861a36a9bcaa2db998ae172cd7152c648ab4a5f5be35dbbb" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.651176 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae63eb4d966c492861a36a9bcaa2db998ae172cd7152c648ab4a5f5be35dbbb"} err="failed to get container status \"0ae63eb4d966c492861a36a9bcaa2db998ae172cd7152c648ab4a5f5be35dbbb\": rpc error: code = NotFound desc = could not find container \"0ae63eb4d966c492861a36a9bcaa2db998ae172cd7152c648ab4a5f5be35dbbb\": container with ID starting with 0ae63eb4d966c492861a36a9bcaa2db998ae172cd7152c648ab4a5f5be35dbbb not found: ID does not exist" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.651219 4723 scope.go:117] "RemoveContainer" containerID="ac628d4e25d5d2065fe703b7e3113e74aa46c9e22bc9ceefb5ce268b521c5d83" Mar 09 13:57:53 crc kubenswrapper[4723]: E0309 13:57:53.651634 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac628d4e25d5d2065fe703b7e3113e74aa46c9e22bc9ceefb5ce268b521c5d83\": container with ID starting with ac628d4e25d5d2065fe703b7e3113e74aa46c9e22bc9ceefb5ce268b521c5d83 not found: ID does not exist" containerID="ac628d4e25d5d2065fe703b7e3113e74aa46c9e22bc9ceefb5ce268b521c5d83" Mar 09 13:57:53 crc kubenswrapper[4723]: I0309 13:57:53.651661 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac628d4e25d5d2065fe703b7e3113e74aa46c9e22bc9ceefb5ce268b521c5d83"} err="failed to get container status \"ac628d4e25d5d2065fe703b7e3113e74aa46c9e22bc9ceefb5ce268b521c5d83\": rpc error: code = NotFound desc = could not find container \"ac628d4e25d5d2065fe703b7e3113e74aa46c9e22bc9ceefb5ce268b521c5d83\": container with ID starting with ac628d4e25d5d2065fe703b7e3113e74aa46c9e22bc9ceefb5ce268b521c5d83 not found: ID does not exist" Mar 09 13:57:54 crc kubenswrapper[4723]: I0309 13:57:54.893337 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c488b08c-e0d5-4112-9521-afbfa19affb6" path="/var/lib/kubelet/pods/c488b08c-e0d5-4112-9521-afbfa19affb6/volumes" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.170319 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551078-2gl7q"] Mar 09 13:58:00 crc kubenswrapper[4723]: E0309 13:58:00.171968 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f157d49d-b2d1-4579-aa73-41508a9c6ede" containerName="extract-utilities" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.171988 4723 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f157d49d-b2d1-4579-aa73-41508a9c6ede" containerName="extract-utilities" Mar 09 13:58:00 crc kubenswrapper[4723]: E0309 13:58:00.172050 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f157d49d-b2d1-4579-aa73-41508a9c6ede" containerName="extract-content" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.172060 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f157d49d-b2d1-4579-aa73-41508a9c6ede" containerName="extract-content" Mar 09 13:58:00 crc kubenswrapper[4723]: E0309 13:58:00.172075 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f157d49d-b2d1-4579-aa73-41508a9c6ede" containerName="registry-server" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.172084 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f157d49d-b2d1-4579-aa73-41508a9c6ede" containerName="registry-server" Mar 09 13:58:00 crc kubenswrapper[4723]: E0309 13:58:00.172105 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c488b08c-e0d5-4112-9521-afbfa19affb6" containerName="extract-content" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.172113 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="c488b08c-e0d5-4112-9521-afbfa19affb6" containerName="extract-content" Mar 09 13:58:00 crc kubenswrapper[4723]: E0309 13:58:00.172160 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c488b08c-e0d5-4112-9521-afbfa19affb6" containerName="extract-utilities" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.172169 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="c488b08c-e0d5-4112-9521-afbfa19affb6" containerName="extract-utilities" Mar 09 13:58:00 crc kubenswrapper[4723]: E0309 13:58:00.172208 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c488b08c-e0d5-4112-9521-afbfa19affb6" containerName="registry-server" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.172217 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="c488b08c-e0d5-4112-9521-afbfa19affb6" containerName="registry-server" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.173132 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="f157d49d-b2d1-4579-aa73-41508a9c6ede" containerName="registry-server" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.173195 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="c488b08c-e0d5-4112-9521-afbfa19affb6" containerName="registry-server" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.174769 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551078-2gl7q" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.178077 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.178151 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.178462 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.198289 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551078-2gl7q"] Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.280107 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnvsx\" (UniqueName: \"kubernetes.io/projected/c346e089-31ed-4a23-9cf6-deea92044f18-kube-api-access-mnvsx\") pod \"auto-csr-approver-29551078-2gl7q\" (UID: \"c346e089-31ed-4a23-9cf6-deea92044f18\") " pod="openshift-infra/auto-csr-approver-29551078-2gl7q" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.382576 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnvsx\" (UniqueName: \"kubernetes.io/projected/c346e089-31ed-4a23-9cf6-deea92044f18-kube-api-access-mnvsx\") pod \"auto-csr-approver-29551078-2gl7q\" (UID: \"c346e089-31ed-4a23-9cf6-deea92044f18\") " pod="openshift-infra/auto-csr-approver-29551078-2gl7q" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.401715 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnvsx\" (UniqueName: \"kubernetes.io/projected/c346e089-31ed-4a23-9cf6-deea92044f18-kube-api-access-mnvsx\") pod \"auto-csr-approver-29551078-2gl7q\" (UID: \"c346e089-31ed-4a23-9cf6-deea92044f18\") " pod="openshift-infra/auto-csr-approver-29551078-2gl7q" Mar 09 13:58:00 crc kubenswrapper[4723]: I0309 13:58:00.527591 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551078-2gl7q" Mar 09 13:58:01 crc kubenswrapper[4723]: I0309 13:58:01.043199 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551078-2gl7q"] Mar 09 13:58:01 crc kubenswrapper[4723]: W0309 13:58:01.047270 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc346e089_31ed_4a23_9cf6_deea92044f18.slice/crio-824eed612732dbf87a7e3ebb05a13f1f217434dbd7e8d6b6ff41df06d21b74f6 WatchSource:0}: Error finding container 824eed612732dbf87a7e3ebb05a13f1f217434dbd7e8d6b6ff41df06d21b74f6: Status 404 returned error can't find the container with id 824eed612732dbf87a7e3ebb05a13f1f217434dbd7e8d6b6ff41df06d21b74f6 Mar 09 13:58:01 crc kubenswrapper[4723]: I0309 13:58:01.605569 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551078-2gl7q" event={"ID":"c346e089-31ed-4a23-9cf6-deea92044f18","Type":"ContainerStarted","Data":"824eed612732dbf87a7e3ebb05a13f1f217434dbd7e8d6b6ff41df06d21b74f6"} Mar 09 13:58:02 crc kubenswrapper[4723]: I0309 13:58:02.617342 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551078-2gl7q" event={"ID":"c346e089-31ed-4a23-9cf6-deea92044f18","Type":"ContainerStarted","Data":"dfa03197eccb7721e3110377970a5e517f752eb14ecabdbaff7f5e0dcc8ea991"} Mar 09 13:58:02 crc kubenswrapper[4723]: I0309 13:58:02.631248 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551078-2gl7q" podStartSLOduration=1.693209312 podStartE2EDuration="2.631231146s" podCreationTimestamp="2026-03-09 13:58:00 +0000 UTC" firstStartedPulling="2026-03-09 13:58:01.050169212 +0000 UTC m=+3555.064636762" lastFinishedPulling="2026-03-09 13:58:01.988191056 +0000 UTC m=+3556.002658596" observedRunningTime="2026-03-09 13:58:02.629263464 +0000 UTC m=+3556.643731004" watchObservedRunningTime="2026-03-09 13:58:02.631231146 +0000 UTC m=+3556.645698686" Mar 09 13:58:03 crc kubenswrapper[4723]: I0309 13:58:03.633216 4723 generic.go:334] "Generic (PLEG): container finished" podID="c346e089-31ed-4a23-9cf6-deea92044f18" containerID="dfa03197eccb7721e3110377970a5e517f752eb14ecabdbaff7f5e0dcc8ea991" exitCode=0 Mar 09 13:58:03 crc kubenswrapper[4723]: I0309 13:58:03.633515 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551078-2gl7q" event={"ID":"c346e089-31ed-4a23-9cf6-deea92044f18","Type":"ContainerDied","Data":"dfa03197eccb7721e3110377970a5e517f752eb14ecabdbaff7f5e0dcc8ea991"} Mar 09 13:58:05 crc kubenswrapper[4723]: I0309 13:58:05.155574 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551078-2gl7q" Mar 09 13:58:05 crc kubenswrapper[4723]: I0309 13:58:05.224822 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnvsx\" (UniqueName: \"kubernetes.io/projected/c346e089-31ed-4a23-9cf6-deea92044f18-kube-api-access-mnvsx\") pod \"c346e089-31ed-4a23-9cf6-deea92044f18\" (UID: \"c346e089-31ed-4a23-9cf6-deea92044f18\") " Mar 09 13:58:05 crc kubenswrapper[4723]: I0309 13:58:05.230966 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c346e089-31ed-4a23-9cf6-deea92044f18-kube-api-access-mnvsx" (OuterVolumeSpecName: "kube-api-access-mnvsx") pod "c346e089-31ed-4a23-9cf6-deea92044f18" (UID: "c346e089-31ed-4a23-9cf6-deea92044f18"). InnerVolumeSpecName "kube-api-access-mnvsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:58:05 crc kubenswrapper[4723]: I0309 13:58:05.329187 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnvsx\" (UniqueName: \"kubernetes.io/projected/c346e089-31ed-4a23-9cf6-deea92044f18-kube-api-access-mnvsx\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:05 crc kubenswrapper[4723]: I0309 13:58:05.664517 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551078-2gl7q" event={"ID":"c346e089-31ed-4a23-9cf6-deea92044f18","Type":"ContainerDied","Data":"824eed612732dbf87a7e3ebb05a13f1f217434dbd7e8d6b6ff41df06d21b74f6"} Mar 09 13:58:05 crc kubenswrapper[4723]: I0309 13:58:05.664792 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="824eed612732dbf87a7e3ebb05a13f1f217434dbd7e8d6b6ff41df06d21b74f6" Mar 09 13:58:05 crc kubenswrapper[4723]: I0309 13:58:05.664686 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551078-2gl7q" Mar 09 13:58:05 crc kubenswrapper[4723]: I0309 13:58:05.721651 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551072-hvd2s"] Mar 09 13:58:05 crc kubenswrapper[4723]: I0309 13:58:05.733069 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551072-hvd2s"] Mar 09 13:58:06 crc kubenswrapper[4723]: I0309 13:58:06.895014 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc" path="/var/lib/kubelet/pods/02fe7f4c-1014-47ad-8c3e-4a000dc2b7bc/volumes" Mar 09 13:58:25 crc kubenswrapper[4723]: I0309 13:58:25.978465 4723 scope.go:117] "RemoveContainer" containerID="e023e9169c07f629dd606ba94c869181b888da0df2ab409cf6a2986843d640fe" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.147046 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ghsmn"] Mar 09 13:58:38 crc kubenswrapper[4723]: E0309 13:58:38.149233 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c346e089-31ed-4a23-9cf6-deea92044f18" containerName="oc" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.149351 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="c346e089-31ed-4a23-9cf6-deea92044f18" containerName="oc" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.149655 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="c346e089-31ed-4a23-9cf6-deea92044f18" containerName="oc" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.151354 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.166688 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghsmn"] Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.231894 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55c97d4-91f8-4235-800a-afa463f4eac7-catalog-content\") pod \"redhat-marketplace-ghsmn\" (UID: \"a55c97d4-91f8-4235-800a-afa463f4eac7\") " pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.232003 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8dp7\" (UniqueName: \"kubernetes.io/projected/a55c97d4-91f8-4235-800a-afa463f4eac7-kube-api-access-l8dp7\") pod \"redhat-marketplace-ghsmn\" (UID: \"a55c97d4-91f8-4235-800a-afa463f4eac7\") " pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.232124 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55c97d4-91f8-4235-800a-afa463f4eac7-utilities\") pod \"redhat-marketplace-ghsmn\" (UID: \"a55c97d4-91f8-4235-800a-afa463f4eac7\") " pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.334379 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8dp7\" (UniqueName: \"kubernetes.io/projected/a55c97d4-91f8-4235-800a-afa463f4eac7-kube-api-access-l8dp7\") pod \"redhat-marketplace-ghsmn\" (UID: \"a55c97d4-91f8-4235-800a-afa463f4eac7\") " pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.334465 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55c97d4-91f8-4235-800a-afa463f4eac7-utilities\") pod \"redhat-marketplace-ghsmn\" (UID: \"a55c97d4-91f8-4235-800a-afa463f4eac7\") " pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.334658 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55c97d4-91f8-4235-800a-afa463f4eac7-catalog-content\") pod \"redhat-marketplace-ghsmn\" (UID: \"a55c97d4-91f8-4235-800a-afa463f4eac7\") " pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.335046 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55c97d4-91f8-4235-800a-afa463f4eac7-utilities\") pod \"redhat-marketplace-ghsmn\" (UID: \"a55c97d4-91f8-4235-800a-afa463f4eac7\") " pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.335088 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55c97d4-91f8-4235-800a-afa463f4eac7-catalog-content\") pod \"redhat-marketplace-ghsmn\" (UID: \"a55c97d4-91f8-4235-800a-afa463f4eac7\") " pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.354779 4723 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-l8dp7\" (UniqueName: \"kubernetes.io/projected/a55c97d4-91f8-4235-800a-afa463f4eac7-kube-api-access-l8dp7\") pod \"redhat-marketplace-ghsmn\" (UID: \"a55c97d4-91f8-4235-800a-afa463f4eac7\") " pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.479668 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:38 crc kubenswrapper[4723]: I0309 13:58:38.981221 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghsmn"] Mar 09 13:58:39 crc kubenswrapper[4723]: I0309 13:58:39.031647 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghsmn" event={"ID":"a55c97d4-91f8-4235-800a-afa463f4eac7","Type":"ContainerStarted","Data":"412177f9bb74ccea17ec86b7fadce8bbcdaaba0d9adaf9af631522e37157d370"} Mar 09 13:58:40 crc kubenswrapper[4723]: I0309 13:58:40.046183 4723 generic.go:334] "Generic (PLEG): container finished" podID="a55c97d4-91f8-4235-800a-afa463f4eac7" containerID="5ae18bd034995f7c73139f6cfdc33caea17a423c0cbb74f1d34b826dd8cf0817" exitCode=0 Mar 09 13:58:40 crc kubenswrapper[4723]: I0309 13:58:40.046293 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghsmn" event={"ID":"a55c97d4-91f8-4235-800a-afa463f4eac7","Type":"ContainerDied","Data":"5ae18bd034995f7c73139f6cfdc33caea17a423c0cbb74f1d34b826dd8cf0817"} Mar 09 13:58:41 crc kubenswrapper[4723]: I0309 13:58:41.059438 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghsmn" event={"ID":"a55c97d4-91f8-4235-800a-afa463f4eac7","Type":"ContainerStarted","Data":"febf3471587c43855d75d22f23b118184844975ab5da5ee083dccbe862d44276"} Mar 09 13:58:42 crc kubenswrapper[4723]: I0309 13:58:42.070679 4723 generic.go:334] "Generic (PLEG): container finished" podID="a55c97d4-91f8-4235-800a-afa463f4eac7" containerID="febf3471587c43855d75d22f23b118184844975ab5da5ee083dccbe862d44276" exitCode=0 Mar 09 13:58:42 crc kubenswrapper[4723]: I0309 13:58:42.070728 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghsmn" event={"ID":"a55c97d4-91f8-4235-800a-afa463f4eac7","Type":"ContainerDied","Data":"febf3471587c43855d75d22f23b118184844975ab5da5ee083dccbe862d44276"} Mar 09 13:58:43 crc kubenswrapper[4723]: I0309 13:58:43.081285 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghsmn" event={"ID":"a55c97d4-91f8-4235-800a-afa463f4eac7","Type":"ContainerStarted","Data":"8807dbe69540f2c69a4de691738249c4904f35938a98eee4a07da00abd87cc5b"} Mar 09 13:58:43 crc kubenswrapper[4723]: I0309 13:58:43.105830 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ghsmn" podStartSLOduration=2.552418074 podStartE2EDuration="5.105807833s" podCreationTimestamp="2026-03-09 13:58:38 +0000 UTC" firstStartedPulling="2026-03-09 13:58:40.050694416 +0000 UTC m=+3594.065161946" lastFinishedPulling="2026-03-09 13:58:42.604084165 +0000 UTC m=+3596.618551705" observedRunningTime="2026-03-09 13:58:43.09816778 +0000 UTC m=+3597.112635340" watchObservedRunningTime="2026-03-09 13:58:43.105807833 +0000 UTC m=+3597.120275373" Mar 09 13:58:48 crc kubenswrapper[4723]: I0309 13:58:48.480463 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:48 crc kubenswrapper[4723]: I0309 13:58:48.481038 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:48 crc kubenswrapper[4723]: I0309 13:58:48.557082 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:49 crc kubenswrapper[4723]: I0309 13:58:49.199023 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:49 crc kubenswrapper[4723]: I0309 13:58:49.254160 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghsmn"] Mar 09 13:58:51 crc kubenswrapper[4723]: I0309 13:58:51.194851 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ghsmn" podUID="a55c97d4-91f8-4235-800a-afa463f4eac7" containerName="registry-server" containerID="cri-o://8807dbe69540f2c69a4de691738249c4904f35938a98eee4a07da00abd87cc5b" gracePeriod=2 Mar 09 13:58:51 crc kubenswrapper[4723]: I0309 13:58:51.762253 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:51 crc kubenswrapper[4723]: I0309 13:58:51.867215 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55c97d4-91f8-4235-800a-afa463f4eac7-catalog-content\") pod \"a55c97d4-91f8-4235-800a-afa463f4eac7\" (UID: \"a55c97d4-91f8-4235-800a-afa463f4eac7\") " Mar 09 13:58:51 crc kubenswrapper[4723]: I0309 13:58:51.867377 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8dp7\" (UniqueName: \"kubernetes.io/projected/a55c97d4-91f8-4235-800a-afa463f4eac7-kube-api-access-l8dp7\") pod \"a55c97d4-91f8-4235-800a-afa463f4eac7\" (UID: \"a55c97d4-91f8-4235-800a-afa463f4eac7\") " Mar 09 13:58:51 crc kubenswrapper[4723]: I0309 13:58:51.867409 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55c97d4-91f8-4235-800a-afa463f4eac7-utilities\") pod \"a55c97d4-91f8-4235-800a-afa463f4eac7\" (UID: \"a55c97d4-91f8-4235-800a-afa463f4eac7\") " Mar 09 13:58:51 crc kubenswrapper[4723]: I0309 13:58:51.868809 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a55c97d4-91f8-4235-800a-afa463f4eac7-utilities" (OuterVolumeSpecName: "utilities") pod "a55c97d4-91f8-4235-800a-afa463f4eac7" (UID: "a55c97d4-91f8-4235-800a-afa463f4eac7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:58:51 crc kubenswrapper[4723]: I0309 13:58:51.875181 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a55c97d4-91f8-4235-800a-afa463f4eac7-kube-api-access-l8dp7" (OuterVolumeSpecName: "kube-api-access-l8dp7") pod "a55c97d4-91f8-4235-800a-afa463f4eac7" (UID: "a55c97d4-91f8-4235-800a-afa463f4eac7"). InnerVolumeSpecName "kube-api-access-l8dp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:58:51 crc kubenswrapper[4723]: I0309 13:58:51.900271 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a55c97d4-91f8-4235-800a-afa463f4eac7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a55c97d4-91f8-4235-800a-afa463f4eac7" (UID: "a55c97d4-91f8-4235-800a-afa463f4eac7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:58:51 crc kubenswrapper[4723]: I0309 13:58:51.969882 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55c97d4-91f8-4235-800a-afa463f4eac7-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:51 crc kubenswrapper[4723]: I0309 13:58:51.969925 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55c97d4-91f8-4235-800a-afa463f4eac7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:51 crc kubenswrapper[4723]: I0309 13:58:51.969942 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8dp7\" (UniqueName: \"kubernetes.io/projected/a55c97d4-91f8-4235-800a-afa463f4eac7-kube-api-access-l8dp7\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.207323 4723 generic.go:334] "Generic (PLEG): container finished" podID="a55c97d4-91f8-4235-800a-afa463f4eac7" containerID="8807dbe69540f2c69a4de691738249c4904f35938a98eee4a07da00abd87cc5b" exitCode=0 Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.207373 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghsmn" event={"ID":"a55c97d4-91f8-4235-800a-afa463f4eac7","Type":"ContainerDied","Data":"8807dbe69540f2c69a4de691738249c4904f35938a98eee4a07da00abd87cc5b"} Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.207377 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghsmn" Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.207403 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghsmn" event={"ID":"a55c97d4-91f8-4235-800a-afa463f4eac7","Type":"ContainerDied","Data":"412177f9bb74ccea17ec86b7fadce8bbcdaaba0d9adaf9af631522e37157d370"} Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.207423 4723 scope.go:117] "RemoveContainer" containerID="8807dbe69540f2c69a4de691738249c4904f35938a98eee4a07da00abd87cc5b" Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.234504 4723 scope.go:117] "RemoveContainer" containerID="febf3471587c43855d75d22f23b118184844975ab5da5ee083dccbe862d44276" Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.242618 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghsmn"] Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.253645 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghsmn"] Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.277406 4723 scope.go:117] "RemoveContainer" containerID="5ae18bd034995f7c73139f6cfdc33caea17a423c0cbb74f1d34b826dd8cf0817" Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.321522 4723 scope.go:117] "RemoveContainer" containerID="8807dbe69540f2c69a4de691738249c4904f35938a98eee4a07da00abd87cc5b" Mar 09 13:58:52 crc kubenswrapper[4723]: E0309 13:58:52.321923 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8807dbe69540f2c69a4de691738249c4904f35938a98eee4a07da00abd87cc5b\": container with ID starting with 8807dbe69540f2c69a4de691738249c4904f35938a98eee4a07da00abd87cc5b not found: ID does not exist" containerID="8807dbe69540f2c69a4de691738249c4904f35938a98eee4a07da00abd87cc5b" Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.321959 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8807dbe69540f2c69a4de691738249c4904f35938a98eee4a07da00abd87cc5b"} err="failed to get container status \"8807dbe69540f2c69a4de691738249c4904f35938a98eee4a07da00abd87cc5b\": rpc error: code = NotFound desc = could not find container \"8807dbe69540f2c69a4de691738249c4904f35938a98eee4a07da00abd87cc5b\": container with ID starting with 8807dbe69540f2c69a4de691738249c4904f35938a98eee4a07da00abd87cc5b not found: ID does not exist" Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.322060 4723 scope.go:117] "RemoveContainer" containerID="febf3471587c43855d75d22f23b118184844975ab5da5ee083dccbe862d44276" Mar 09 13:58:52 crc kubenswrapper[4723]: E0309 13:58:52.322472 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"febf3471587c43855d75d22f23b118184844975ab5da5ee083dccbe862d44276\": container with ID starting with febf3471587c43855d75d22f23b118184844975ab5da5ee083dccbe862d44276 not found: ID does not exist" containerID="febf3471587c43855d75d22f23b118184844975ab5da5ee083dccbe862d44276" Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.322504 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"febf3471587c43855d75d22f23b118184844975ab5da5ee083dccbe862d44276"} err="failed to get container status \"febf3471587c43855d75d22f23b118184844975ab5da5ee083dccbe862d44276\": rpc error: code = NotFound desc = could not find 
container \"febf3471587c43855d75d22f23b118184844975ab5da5ee083dccbe862d44276\": container with ID starting with febf3471587c43855d75d22f23b118184844975ab5da5ee083dccbe862d44276 not found: ID does not exist" Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.322522 4723 scope.go:117] "RemoveContainer" containerID="5ae18bd034995f7c73139f6cfdc33caea17a423c0cbb74f1d34b826dd8cf0817" Mar 09 13:58:52 crc kubenswrapper[4723]: E0309 13:58:52.322774 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ae18bd034995f7c73139f6cfdc33caea17a423c0cbb74f1d34b826dd8cf0817\": container with ID starting with 5ae18bd034995f7c73139f6cfdc33caea17a423c0cbb74f1d34b826dd8cf0817 not found: ID does not exist" containerID="5ae18bd034995f7c73139f6cfdc33caea17a423c0cbb74f1d34b826dd8cf0817" Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.322802 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ae18bd034995f7c73139f6cfdc33caea17a423c0cbb74f1d34b826dd8cf0817"} err="failed to get container status \"5ae18bd034995f7c73139f6cfdc33caea17a423c0cbb74f1d34b826dd8cf0817\": rpc error: code = NotFound desc = could not find container \"5ae18bd034995f7c73139f6cfdc33caea17a423c0cbb74f1d34b826dd8cf0817\": container with ID starting with 5ae18bd034995f7c73139f6cfdc33caea17a423c0cbb74f1d34b826dd8cf0817 not found: ID does not exist" Mar 09 13:58:52 crc kubenswrapper[4723]: I0309 13:58:52.894200 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a55c97d4-91f8-4235-800a-afa463f4eac7" path="/var/lib/kubelet/pods/a55c97d4-91f8-4235-800a-afa463f4eac7/volumes" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.144940 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551080-9r5cj"] Mar 09 14:00:00 crc kubenswrapper[4723]: E0309 14:00:00.146064 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a55c97d4-91f8-4235-800a-afa463f4eac7" containerName="extract-utilities" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.146082 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a55c97d4-91f8-4235-800a-afa463f4eac7" containerName="extract-utilities" Mar 09 14:00:00 crc kubenswrapper[4723]: E0309 14:00:00.146110 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a55c97d4-91f8-4235-800a-afa463f4eac7" containerName="registry-server" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.146118 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a55c97d4-91f8-4235-800a-afa463f4eac7" containerName="registry-server" Mar 09 14:00:00 crc kubenswrapper[4723]: E0309 14:00:00.146130 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a55c97d4-91f8-4235-800a-afa463f4eac7" containerName="extract-content" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.146138 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="a55c97d4-91f8-4235-800a-afa463f4eac7" containerName="extract-content" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.146425 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="a55c97d4-91f8-4235-800a-afa463f4eac7" containerName="registry-server" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.147360 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551080-9r5cj" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.154934 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.155222 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.155505 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.159977 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm"] Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.162649 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.164698 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.164925 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.184352 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551080-9r5cj"] Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.195928 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm"] Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.285986 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-secret-volume\") pod \"collect-profiles-29551080-2b5rm\" (UID: \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.286137 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhwgr\" (UniqueName: \"kubernetes.io/projected/707e76d7-5321-48ed-afb3-782f7a953315-kube-api-access-bhwgr\") pod \"auto-csr-approver-29551080-9r5cj\" (UID: \"707e76d7-5321-48ed-afb3-782f7a953315\") " pod="openshift-infra/auto-csr-approver-29551080-9r5cj" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.286199 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-config-volume\") pod \"collect-profiles-29551080-2b5rm\" (UID: \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm" Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.286274 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr8hq\" (UniqueName: \"kubernetes.io/projected/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-kube-api-access-gr8hq\") pod \"collect-profiles-29551080-2b5rm\" (UID: \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm" Mar 
09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.388506 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-secret-volume\") pod \"collect-profiles-29551080-2b5rm\" (UID: \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm"
Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.388647 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhwgr\" (UniqueName: \"kubernetes.io/projected/707e76d7-5321-48ed-afb3-782f7a953315-kube-api-access-bhwgr\") pod \"auto-csr-approver-29551080-9r5cj\" (UID: \"707e76d7-5321-48ed-afb3-782f7a953315\") " pod="openshift-infra/auto-csr-approver-29551080-9r5cj"
Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.388768 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-config-volume\") pod \"collect-profiles-29551080-2b5rm\" (UID: \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm"
Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.388821 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr8hq\" (UniqueName: \"kubernetes.io/projected/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-kube-api-access-gr8hq\") pod \"collect-profiles-29551080-2b5rm\" (UID: \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm"
Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.390268 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-config-volume\") pod \"collect-profiles-29551080-2b5rm\" (UID: \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm"
Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.397918 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-secret-volume\") pod \"collect-profiles-29551080-2b5rm\" (UID: \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm"
Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.409682 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr8hq\" (UniqueName: \"kubernetes.io/projected/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-kube-api-access-gr8hq\") pod \"collect-profiles-29551080-2b5rm\" (UID: \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm"
Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.412058 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhwgr\" (UniqueName: \"kubernetes.io/projected/707e76d7-5321-48ed-afb3-782f7a953315-kube-api-access-bhwgr\") pod \"auto-csr-approver-29551080-9r5cj\" (UID: \"707e76d7-5321-48ed-afb3-782f7a953315\") " pod="openshift-infra/auto-csr-approver-29551080-9r5cj"
Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.478953 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551080-9r5cj"
Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.498302 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm"
Mar 09 14:00:00 crc kubenswrapper[4723]: I0309 14:00:00.990375 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm"]
Mar 09 14:00:01 crc kubenswrapper[4723]: I0309 14:00:01.105304 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551080-9r5cj"]
Mar 09 14:00:01 crc kubenswrapper[4723]: W0309 14:00:01.109439 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod707e76d7_5321_48ed_afb3_782f7a953315.slice/crio-da2a3a6c8c7e4cb740615a7212c96bb5ab35ddce4b78b169422a9f60613bf647 WatchSource:0}: Error finding container da2a3a6c8c7e4cb740615a7212c96bb5ab35ddce4b78b169422a9f60613bf647: Status 404 returned error can't find the container with id da2a3a6c8c7e4cb740615a7212c96bb5ab35ddce4b78b169422a9f60613bf647
Mar 09 14:00:01 crc kubenswrapper[4723]: I0309 14:00:01.976194 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551080-9r5cj" event={"ID":"707e76d7-5321-48ed-afb3-782f7a953315","Type":"ContainerStarted","Data":"da2a3a6c8c7e4cb740615a7212c96bb5ab35ddce4b78b169422a9f60613bf647"}
Mar 09 14:00:01 crc kubenswrapper[4723]: I0309 14:00:01.993673 4723 generic.go:334] "Generic (PLEG): container finished" podID="7708895a-a124-4f3e-b4a9-01bbe6a7a5e9" containerID="8d45a5aa1b4dcbc81f92692d59b3af519b73902b5584aa6331a248ec0bbacd6e" exitCode=0
Mar 09 14:00:01 crc kubenswrapper[4723]: I0309 14:00:01.993725 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm" event={"ID":"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9","Type":"ContainerDied","Data":"8d45a5aa1b4dcbc81f92692d59b3af519b73902b5584aa6331a248ec0bbacd6e"}
Mar 09 14:00:01 crc kubenswrapper[4723]: I0309 14:00:01.993754 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm" event={"ID":"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9","Type":"ContainerStarted","Data":"0203909fa44d412c10b6aaa2e77879e591f2d4358e7965ecca734d3ae2a81a2c"}
Mar 09 14:00:03 crc kubenswrapper[4723]: I0309 14:00:03.479669 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm"
Mar 09 14:00:03 crc kubenswrapper[4723]: I0309 14:00:03.488243 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-config-volume\") pod \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\" (UID: \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\") "
Mar 09 14:00:03 crc kubenswrapper[4723]: I0309 14:00:03.488514 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr8hq\" (UniqueName: \"kubernetes.io/projected/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-kube-api-access-gr8hq\") pod \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\" (UID: \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\") "
Mar 09 14:00:03 crc kubenswrapper[4723]: I0309 14:00:03.488655 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-secret-volume\") pod \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\" (UID: \"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9\") "
Mar 09 14:00:03 crc kubenswrapper[4723]: I0309 14:00:03.489644 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-config-volume" (OuterVolumeSpecName: "config-volume") pod "7708895a-a124-4f3e-b4a9-01bbe6a7a5e9" (UID: "7708895a-a124-4f3e-b4a9-01bbe6a7a5e9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:00:03 crc kubenswrapper[4723]: I0309 14:00:03.495023 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-kube-api-access-gr8hq" (OuterVolumeSpecName: "kube-api-access-gr8hq") pod "7708895a-a124-4f3e-b4a9-01bbe6a7a5e9" (UID: "7708895a-a124-4f3e-b4a9-01bbe6a7a5e9"). InnerVolumeSpecName "kube-api-access-gr8hq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:00:03 crc kubenswrapper[4723]: I0309 14:00:03.495367 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7708895a-a124-4f3e-b4a9-01bbe6a7a5e9" (UID: "7708895a-a124-4f3e-b4a9-01bbe6a7a5e9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:00:03 crc kubenswrapper[4723]: I0309 14:00:03.590662 4723 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 14:00:03 crc kubenswrapper[4723]: I0309 14:00:03.590682 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr8hq\" (UniqueName: \"kubernetes.io/projected/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-kube-api-access-gr8hq\") on node \"crc\" DevicePath \"\""
Mar 09 14:00:03 crc kubenswrapper[4723]: I0309 14:00:03.590691 4723 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7708895a-a124-4f3e-b4a9-01bbe6a7a5e9-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 09 14:00:03 crc kubenswrapper[4723]: I0309 14:00:03.946644 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:00:03 crc kubenswrapper[4723]: I0309 14:00:03.947000 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:00:04 crc kubenswrapper[4723]: I0309 14:00:04.023568 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm" event={"ID":"7708895a-a124-4f3e-b4a9-01bbe6a7a5e9","Type":"ContainerDied","Data":"0203909fa44d412c10b6aaa2e77879e591f2d4358e7965ecca734d3ae2a81a2c"}
Mar 09 14:00:04 crc kubenswrapper[4723]: I0309 14:00:04.023605 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0203909fa44d412c10b6aaa2e77879e591f2d4358e7965ecca734d3ae2a81a2c"
Mar 09 14:00:04 crc kubenswrapper[4723]: I0309 14:00:04.023632 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-2b5rm"
Mar 09 14:00:04 crc kubenswrapper[4723]: I0309 14:00:04.568806 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj"]
Mar 09 14:00:04 crc kubenswrapper[4723]: I0309 14:00:04.582214 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-7fwvj"]
Mar 09 14:00:04 crc kubenswrapper[4723]: I0309 14:00:04.895473 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f5f9e3b-034c-47c8-810b-2bd21bd1c54d" path="/var/lib/kubelet/pods/1f5f9e3b-034c-47c8-810b-2bd21bd1c54d/volumes"
Mar 09 14:00:06 crc kubenswrapper[4723]: I0309 14:00:06.054259 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551080-9r5cj" event={"ID":"707e76d7-5321-48ed-afb3-782f7a953315","Type":"ContainerStarted","Data":"13a75d979ca1ebd3f9c644bd7d6cabac562d49406ae6721992033a7b09ddb1a1"}
Mar 09 14:00:06 crc kubenswrapper[4723]: I0309 14:00:06.076854 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551080-9r5cj" podStartSLOduration=1.739522556 podStartE2EDuration="6.07683606s" podCreationTimestamp="2026-03-09 14:00:00 +0000 UTC" firstStartedPulling="2026-03-09 14:00:01.116087658 +0000 UTC m=+3675.130555208" lastFinishedPulling="2026-03-09 14:00:05.453401172 +0000 UTC m=+3679.467868712" observedRunningTime="2026-03-09 14:00:06.069136755 +0000 UTC m=+3680.083604295" watchObservedRunningTime="2026-03-09 14:00:06.07683606 +0000 UTC m=+3680.091303600"
Mar 09 14:00:07 crc kubenswrapper[4723]: I0309 14:00:07.063992 4723 generic.go:334] "Generic (PLEG): container finished" podID="707e76d7-5321-48ed-afb3-782f7a953315" containerID="13a75d979ca1ebd3f9c644bd7d6cabac562d49406ae6721992033a7b09ddb1a1" exitCode=0
Mar 09 14:00:07 crc kubenswrapper[4723]: I0309 14:00:07.064074 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551080-9r5cj" event={"ID":"707e76d7-5321-48ed-afb3-782f7a953315","Type":"ContainerDied","Data":"13a75d979ca1ebd3f9c644bd7d6cabac562d49406ae6721992033a7b09ddb1a1"}
Mar 09 14:00:08 crc kubenswrapper[4723]: I0309 14:00:08.561918 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551080-9r5cj"
Mar 09 14:00:08 crc kubenswrapper[4723]: I0309 14:00:08.604606 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhwgr\" (UniqueName: \"kubernetes.io/projected/707e76d7-5321-48ed-afb3-782f7a953315-kube-api-access-bhwgr\") pod \"707e76d7-5321-48ed-afb3-782f7a953315\" (UID: \"707e76d7-5321-48ed-afb3-782f7a953315\") "
Mar 09 14:00:08 crc kubenswrapper[4723]: I0309 14:00:08.611083 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707e76d7-5321-48ed-afb3-782f7a953315-kube-api-access-bhwgr" (OuterVolumeSpecName: "kube-api-access-bhwgr") pod "707e76d7-5321-48ed-afb3-782f7a953315" (UID: "707e76d7-5321-48ed-afb3-782f7a953315"). InnerVolumeSpecName "kube-api-access-bhwgr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:00:08 crc kubenswrapper[4723]: I0309 14:00:08.707468 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhwgr\" (UniqueName: \"kubernetes.io/projected/707e76d7-5321-48ed-afb3-782f7a953315-kube-api-access-bhwgr\") on node \"crc\" DevicePath \"\""
Mar 09 14:00:09 crc kubenswrapper[4723]: I0309 14:00:09.086406 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551080-9r5cj" event={"ID":"707e76d7-5321-48ed-afb3-782f7a953315","Type":"ContainerDied","Data":"da2a3a6c8c7e4cb740615a7212c96bb5ab35ddce4b78b169422a9f60613bf647"}
Mar 09 14:00:09 crc kubenswrapper[4723]: I0309 14:00:09.086468 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da2a3a6c8c7e4cb740615a7212c96bb5ab35ddce4b78b169422a9f60613bf647"
Mar 09 14:00:09 crc kubenswrapper[4723]: I0309 14:00:09.086448 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551080-9r5cj"
Mar 09 14:00:09 crc kubenswrapper[4723]: I0309 14:00:09.140932 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551074-cxwsk"]
Mar 09 14:00:09 crc kubenswrapper[4723]: I0309 14:00:09.159018 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551074-cxwsk"]
Mar 09 14:00:10 crc kubenswrapper[4723]: I0309 14:00:10.894035 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="111c3f03-b05b-4dc8-9449-8453cafd181d" path="/var/lib/kubelet/pods/111c3f03-b05b-4dc8-9449-8453cafd181d/volumes"
Mar 09 14:00:26 crc kubenswrapper[4723]: I0309 14:00:26.138299 4723 scope.go:117] "RemoveContainer" containerID="d3ea6bd376bc3cacc69cb96a929ff5e610086f4f24a0a2264dc3ebdc2f2624ed"
Mar 09 14:00:26 crc kubenswrapper[4723]: I0309 14:00:26.181537 4723 scope.go:117] "RemoveContainer" containerID="e4757d8409c7b613154a2d525330e6f6606954a7811656a386cba12fdf833c44"
Mar 09 14:00:33 crc kubenswrapper[4723]: I0309 14:00:33.946778 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:00:33 crc kubenswrapper[4723]: I0309 14:00:33.947301 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.157604 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29551081-tq9wf"]
Mar 09 14:01:00 crc kubenswrapper[4723]: E0309 14:01:00.159030 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707e76d7-5321-48ed-afb3-782f7a953315" containerName="oc"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.159067 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="707e76d7-5321-48ed-afb3-782f7a953315" containerName="oc"
Mar 09 14:01:00 crc kubenswrapper[4723]: E0309 14:01:00.159092 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7708895a-a124-4f3e-b4a9-01bbe6a7a5e9" containerName="collect-profiles"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.159102 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="7708895a-a124-4f3e-b4a9-01bbe6a7a5e9" containerName="collect-profiles"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.159509 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="707e76d7-5321-48ed-afb3-782f7a953315" containerName="oc"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.159541 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="7708895a-a124-4f3e-b4a9-01bbe6a7a5e9" containerName="collect-profiles"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.160802 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.185941 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29551081-tq9wf"]
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.242626 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxklz\" (UniqueName: \"kubernetes.io/projected/db975a46-d705-4df6-a5d6-1a598088dfd7-kube-api-access-jxklz\") pod \"keystone-cron-29551081-tq9wf\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") " pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.242675 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-fernet-keys\") pod \"keystone-cron-29551081-tq9wf\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") " pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.243298 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-config-data\") pod \"keystone-cron-29551081-tq9wf\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") " pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.243409 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-combined-ca-bundle\") pod \"keystone-cron-29551081-tq9wf\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") " pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.346360 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxklz\" (UniqueName: \"kubernetes.io/projected/db975a46-d705-4df6-a5d6-1a598088dfd7-kube-api-access-jxklz\") pod \"keystone-cron-29551081-tq9wf\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") " pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.346411 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-fernet-keys\") pod \"keystone-cron-29551081-tq9wf\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") " pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.346514 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-config-data\") pod \"keystone-cron-29551081-tq9wf\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") " pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.346542 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-combined-ca-bundle\") pod \"keystone-cron-29551081-tq9wf\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") " pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.352706 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-combined-ca-bundle\") pod \"keystone-cron-29551081-tq9wf\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") " pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.352843 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-fernet-keys\") pod \"keystone-cron-29551081-tq9wf\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") " pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.353472 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-config-data\") pod \"keystone-cron-29551081-tq9wf\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") " pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.362529 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxklz\" (UniqueName: \"kubernetes.io/projected/db975a46-d705-4df6-a5d6-1a598088dfd7-kube-api-access-jxklz\") pod \"keystone-cron-29551081-tq9wf\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") " pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.487010 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:00 crc kubenswrapper[4723]: I0309 14:01:00.986888 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29551081-tq9wf"]
Mar 09 14:01:01 crc kubenswrapper[4723]: I0309 14:01:01.686875 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551081-tq9wf" event={"ID":"db975a46-d705-4df6-a5d6-1a598088dfd7","Type":"ContainerStarted","Data":"cdc8b3fb8b2ffd9e2c318f4a7301834e67b731e3b14461c7cd4a76708478d91c"}
Mar 09 14:01:01 crc kubenswrapper[4723]: I0309 14:01:01.687219 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551081-tq9wf" event={"ID":"db975a46-d705-4df6-a5d6-1a598088dfd7","Type":"ContainerStarted","Data":"5d79c797c26cddae6bfa7a1e2d0f250fb3c182e57150e7b09a7a26f5d73c2906"}
Mar 09 14:01:01 crc kubenswrapper[4723]: I0309 14:01:01.715070 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29551081-tq9wf" podStartSLOduration=1.7150522860000001 podStartE2EDuration="1.715052286s" podCreationTimestamp="2026-03-09 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:01:01.704283869 +0000 UTC m=+3735.718751429" watchObservedRunningTime="2026-03-09 14:01:01.715052286 +0000 UTC m=+3735.729519826"
Mar 09 14:01:03 crc kubenswrapper[4723]: I0309 14:01:03.947359 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:01:03 crc kubenswrapper[4723]: I0309 14:01:03.947751 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:01:03 crc kubenswrapper[4723]: I0309 14:01:03.947810 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2"
Mar 09 14:01:03 crc kubenswrapper[4723]: I0309 14:01:03.949079 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b9917d1efbbbf05f4cd1ae94e4d9bdfa5b2f0c6eaedeaba13758e7498a18fdc2"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 14:01:03 crc kubenswrapper[4723]: I0309 14:01:03.949169 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://b9917d1efbbbf05f4cd1ae94e4d9bdfa5b2f0c6eaedeaba13758e7498a18fdc2" gracePeriod=600
Mar 09 14:01:04 crc kubenswrapper[4723]: I0309 14:01:04.735080 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="b9917d1efbbbf05f4cd1ae94e4d9bdfa5b2f0c6eaedeaba13758e7498a18fdc2" exitCode=0
Mar 09 14:01:04 crc kubenswrapper[4723]: I0309 14:01:04.735176 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"b9917d1efbbbf05f4cd1ae94e4d9bdfa5b2f0c6eaedeaba13758e7498a18fdc2"}
Mar 09 14:01:04 crc kubenswrapper[4723]: I0309 14:01:04.735667 4723 scope.go:117] "RemoveContainer" containerID="4904fbc2a93aa096dedb54bfdc0020c144a85acbe4849899c125620f31985658"
Mar 09 14:01:04 crc kubenswrapper[4723]: I0309 14:01:04.740337 4723 generic.go:334] "Generic (PLEG): container finished" podID="db975a46-d705-4df6-a5d6-1a598088dfd7" containerID="cdc8b3fb8b2ffd9e2c318f4a7301834e67b731e3b14461c7cd4a76708478d91c" exitCode=0
Mar 09 14:01:04 crc kubenswrapper[4723]: I0309 14:01:04.740408 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551081-tq9wf" event={"ID":"db975a46-d705-4df6-a5d6-1a598088dfd7","Type":"ContainerDied","Data":"cdc8b3fb8b2ffd9e2c318f4a7301834e67b731e3b14461c7cd4a76708478d91c"}
Mar 09 14:01:05 crc kubenswrapper[4723]: I0309 14:01:05.754426 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7"}
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.177020 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.202901 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-combined-ca-bundle\") pod \"db975a46-d705-4df6-a5d6-1a598088dfd7\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") "
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.203085 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-fernet-keys\") pod \"db975a46-d705-4df6-a5d6-1a598088dfd7\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") "
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.203180 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-config-data\") pod \"db975a46-d705-4df6-a5d6-1a598088dfd7\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") "
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.203311 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxklz\" (UniqueName: \"kubernetes.io/projected/db975a46-d705-4df6-a5d6-1a598088dfd7-kube-api-access-jxklz\") pod \"db975a46-d705-4df6-a5d6-1a598088dfd7\" (UID: \"db975a46-d705-4df6-a5d6-1a598088dfd7\") "
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.210946 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "db975a46-d705-4df6-a5d6-1a598088dfd7" (UID: "db975a46-d705-4df6-a5d6-1a598088dfd7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.213149 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db975a46-d705-4df6-a5d6-1a598088dfd7-kube-api-access-jxklz" (OuterVolumeSpecName: "kube-api-access-jxklz") pod "db975a46-d705-4df6-a5d6-1a598088dfd7" (UID: "db975a46-d705-4df6-a5d6-1a598088dfd7"). InnerVolumeSpecName "kube-api-access-jxklz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.255491 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db975a46-d705-4df6-a5d6-1a598088dfd7" (UID: "db975a46-d705-4df6-a5d6-1a598088dfd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.294852 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-config-data" (OuterVolumeSpecName: "config-data") pod "db975a46-d705-4df6-a5d6-1a598088dfd7" (UID: "db975a46-d705-4df6-a5d6-1a598088dfd7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.306382 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxklz\" (UniqueName: \"kubernetes.io/projected/db975a46-d705-4df6-a5d6-1a598088dfd7-kube-api-access-jxklz\") on node \"crc\" DevicePath \"\""
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.306416 4723 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.306426 4723 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.306435 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db975a46-d705-4df6-a5d6-1a598088dfd7-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.769208 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29551081-tq9wf"
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.774088 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551081-tq9wf" event={"ID":"db975a46-d705-4df6-a5d6-1a598088dfd7","Type":"ContainerDied","Data":"5d79c797c26cddae6bfa7a1e2d0f250fb3c182e57150e7b09a7a26f5d73c2906"}
Mar 09 14:01:06 crc kubenswrapper[4723]: I0309 14:01:06.774150 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d79c797c26cddae6bfa7a1e2d0f250fb3c182e57150e7b09a7a26f5d73c2906"
Mar 09 14:02:00 crc kubenswrapper[4723]: I0309 14:02:00.156546 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551082-x77jw"]
Mar 09 14:02:00 crc kubenswrapper[4723]: E0309 14:02:00.158518 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db975a46-d705-4df6-a5d6-1a598088dfd7" containerName="keystone-cron"
Mar 09 14:02:00 crc kubenswrapper[4723]: I0309 14:02:00.158546 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="db975a46-d705-4df6-a5d6-1a598088dfd7" containerName="keystone-cron"
Mar 09 14:02:00 crc kubenswrapper[4723]: I0309 14:02:00.159075 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="db975a46-d705-4df6-a5d6-1a598088dfd7" containerName="keystone-cron"
Mar 09 14:02:00 crc kubenswrapper[4723]: I0309 14:02:00.160746 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551082-x77jw"
Mar 09 14:02:00 crc kubenswrapper[4723]: I0309 14:02:00.164735 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 14:02:00 crc kubenswrapper[4723]: I0309 14:02:00.166330 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:02:00 crc kubenswrapper[4723]: I0309 14:02:00.166602 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:02:00 crc kubenswrapper[4723]: I0309 14:02:00.172729 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551082-x77jw"]
Mar 09 14:02:00 crc kubenswrapper[4723]: I0309 14:02:00.182417 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb8gt\" (UniqueName: \"kubernetes.io/projected/03dc001b-84c1-40ab-8357-92909562c177-kube-api-access-cb8gt\") pod \"auto-csr-approver-29551082-x77jw\" (UID: \"03dc001b-84c1-40ab-8357-92909562c177\") " pod="openshift-infra/auto-csr-approver-29551082-x77jw"
Mar 09 14:02:00 crc kubenswrapper[4723]: I0309 14:02:00.285010 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb8gt\" (UniqueName: \"kubernetes.io/projected/03dc001b-84c1-40ab-8357-92909562c177-kube-api-access-cb8gt\") pod \"auto-csr-approver-29551082-x77jw\" (UID: \"03dc001b-84c1-40ab-8357-92909562c177\") " pod="openshift-infra/auto-csr-approver-29551082-x77jw"
Mar 09 14:02:00 crc kubenswrapper[4723]: I0309 14:02:00.309908 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb8gt\" (UniqueName: \"kubernetes.io/projected/03dc001b-84c1-40ab-8357-92909562c177-kube-api-access-cb8gt\") pod \"auto-csr-approver-29551082-x77jw\" (UID: \"03dc001b-84c1-40ab-8357-92909562c177\") " pod="openshift-infra/auto-csr-approver-29551082-x77jw"
Mar 09 14:02:00 crc kubenswrapper[4723]: I0309 14:02:00.500343 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551082-x77jw"
Mar 09 14:02:01 crc kubenswrapper[4723]: I0309 14:02:01.046630 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551082-x77jw"]
Mar 09 14:02:01 crc kubenswrapper[4723]: I0309 14:02:01.356380 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551082-x77jw" event={"ID":"03dc001b-84c1-40ab-8357-92909562c177","Type":"ContainerStarted","Data":"afa24ef133327d172e4c53c5df80d53371a0b648637fd9738cdc246415d12a40"}
Mar 09 14:02:03 crc kubenswrapper[4723]: I0309 14:02:03.388701 4723 generic.go:334] "Generic (PLEG): container finished" podID="03dc001b-84c1-40ab-8357-92909562c177" containerID="bd1e5f8b4e5a2309315267c54bd1a7e2b75f35c70a175866a745500ca499cec9" exitCode=0
Mar 09 14:02:03 crc kubenswrapper[4723]: I0309 14:02:03.389167 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551082-x77jw" event={"ID":"03dc001b-84c1-40ab-8357-92909562c177","Type":"ContainerDied","Data":"bd1e5f8b4e5a2309315267c54bd1a7e2b75f35c70a175866a745500ca499cec9"}
Mar 09 14:02:04 crc kubenswrapper[4723]: I0309 14:02:04.896404 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551082-x77jw"
Mar 09 14:02:05 crc kubenswrapper[4723]: I0309 14:02:05.006898 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb8gt\" (UniqueName: \"kubernetes.io/projected/03dc001b-84c1-40ab-8357-92909562c177-kube-api-access-cb8gt\") pod \"03dc001b-84c1-40ab-8357-92909562c177\" (UID: \"03dc001b-84c1-40ab-8357-92909562c177\") "
Mar 09 14:02:05 crc kubenswrapper[4723]: I0309 14:02:05.012560 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03dc001b-84c1-40ab-8357-92909562c177-kube-api-access-cb8gt" (OuterVolumeSpecName: "kube-api-access-cb8gt") pod "03dc001b-84c1-40ab-8357-92909562c177" (UID: "03dc001b-84c1-40ab-8357-92909562c177"). InnerVolumeSpecName "kube-api-access-cb8gt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:02:05 crc kubenswrapper[4723]: I0309 14:02:05.110350 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb8gt\" (UniqueName: \"kubernetes.io/projected/03dc001b-84c1-40ab-8357-92909562c177-kube-api-access-cb8gt\") on node \"crc\" DevicePath \"\""
Mar 09 14:02:05 crc kubenswrapper[4723]: I0309 14:02:05.412658 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551082-x77jw" event={"ID":"03dc001b-84c1-40ab-8357-92909562c177","Type":"ContainerDied","Data":"afa24ef133327d172e4c53c5df80d53371a0b648637fd9738cdc246415d12a40"}
Mar 09 14:02:05 crc kubenswrapper[4723]: I0309 14:02:05.412696 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afa24ef133327d172e4c53c5df80d53371a0b648637fd9738cdc246415d12a40"
Mar 09 14:02:05 crc kubenswrapper[4723]: I0309 14:02:05.412746 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551082-x77jw"
Mar 09 14:02:05 crc kubenswrapper[4723]: I0309 14:02:05.964297 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551076-2q99b"]
Mar 09 14:02:05 crc kubenswrapper[4723]: I0309 14:02:05.979712 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551076-2q99b"]
Mar 09 14:02:06 crc kubenswrapper[4723]: I0309 14:02:06.900962 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d59bae-bf1e-43a5-9151-81096f514920" path="/var/lib/kubelet/pods/b1d59bae-bf1e-43a5-9151-81096f514920/volumes"
Mar 09 14:02:26 crc kubenswrapper[4723]: I0309 14:02:26.341255 4723 scope.go:117] "RemoveContainer" containerID="b73ef56ce5c41209c125422c96370bf6e61e90cc60a0e05679f6f031934142f6"
Mar 09 14:03:33 crc kubenswrapper[4723]: I0309 14:03:33.946620 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:03:33 crc kubenswrapper[4723]: I0309 14:03:33.947190 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:04:00 crc kubenswrapper[4723]: I0309 14:04:00.142708 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551084-dvqp6"]
Mar 09 14:04:00 crc kubenswrapper[4723]: E0309 14:04:00.144274 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03dc001b-84c1-40ab-8357-92909562c177" containerName="oc"
Mar 09 14:04:00 crc kubenswrapper[4723]: I0309 14:04:00.144289 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="03dc001b-84c1-40ab-8357-92909562c177" containerName="oc"
Mar 09 14:04:00 crc kubenswrapper[4723]: I0309 14:04:00.144511 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="03dc001b-84c1-40ab-8357-92909562c177" containerName="oc"
Mar 09 14:04:00 crc kubenswrapper[4723]: I0309 14:04:00.145293 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551084-dvqp6"
Mar 09 14:04:00 crc kubenswrapper[4723]: I0309 14:04:00.147822 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:04:00 crc kubenswrapper[4723]: I0309 14:04:00.149796 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:04:00 crc kubenswrapper[4723]: I0309 14:04:00.150949 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 14:04:00 crc kubenswrapper[4723]: I0309 14:04:00.173757 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551084-dvqp6"]
Mar 09 14:04:00 crc kubenswrapper[4723]: I0309 14:04:00.282227 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnwft\" (UniqueName: \"kubernetes.io/projected/fac835c9-8cbe-4691-90d9-c5eecebecb33-kube-api-access-gnwft\") pod \"auto-csr-approver-29551084-dvqp6\" (UID: \"fac835c9-8cbe-4691-90d9-c5eecebecb33\") " pod="openshift-infra/auto-csr-approver-29551084-dvqp6"
Mar 09 14:04:00 crc kubenswrapper[4723]: I0309 14:04:00.384488 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnwft\" (UniqueName: \"kubernetes.io/projected/fac835c9-8cbe-4691-90d9-c5eecebecb33-kube-api-access-gnwft\") pod \"auto-csr-approver-29551084-dvqp6\" (UID: \"fac835c9-8cbe-4691-90d9-c5eecebecb33\") " pod="openshift-infra/auto-csr-approver-29551084-dvqp6"
Mar 09 14:04:00 crc kubenswrapper[4723]: I0309 14:04:00.402056 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnwft\" (UniqueName: \"kubernetes.io/projected/fac835c9-8cbe-4691-90d9-c5eecebecb33-kube-api-access-gnwft\") pod \"auto-csr-approver-29551084-dvqp6\" (UID: \"fac835c9-8cbe-4691-90d9-c5eecebecb33\") " pod="openshift-infra/auto-csr-approver-29551084-dvqp6"
Mar 09 14:04:00 crc kubenswrapper[4723]: I0309 14:04:00.473532 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551084-dvqp6"
Mar 09 14:04:00 crc kubenswrapper[4723]: I0309 14:04:00.987047 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 14:04:00 crc kubenswrapper[4723]: I0309 14:04:00.990112 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551084-dvqp6"]
Mar 09 14:04:01 crc kubenswrapper[4723]: I0309 14:04:01.778405 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551084-dvqp6" event={"ID":"fac835c9-8cbe-4691-90d9-c5eecebecb33","Type":"ContainerStarted","Data":"3d748e16453306f2ea1bf8d707c2b857236d61004d7e9ec74d49b2a485da7482"}
Mar 09 14:04:02 crc kubenswrapper[4723]: I0309 14:04:02.794825 4723 generic.go:334] "Generic (PLEG): container finished" podID="fac835c9-8cbe-4691-90d9-c5eecebecb33" containerID="81ef19abb07c261bd33d6faaf3ed6d1dea7822396db4acf88df7e37a529bf598" exitCode=0
Mar 09 14:04:02 crc kubenswrapper[4723]: I0309 14:04:02.795113 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551084-dvqp6" event={"ID":"fac835c9-8cbe-4691-90d9-c5eecebecb33","Type":"ContainerDied","Data":"81ef19abb07c261bd33d6faaf3ed6d1dea7822396db4acf88df7e37a529bf598"}
Mar 09 14:04:03 crc kubenswrapper[4723]: I0309 14:04:03.947670 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:04:03 crc kubenswrapper[4723]: I0309 14:04:03.952564 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:04:04 crc kubenswrapper[4723]: I0309 14:04:04.281759 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551084-dvqp6"
Mar 09 14:04:04 crc kubenswrapper[4723]: I0309 14:04:04.377427 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnwft\" (UniqueName: \"kubernetes.io/projected/fac835c9-8cbe-4691-90d9-c5eecebecb33-kube-api-access-gnwft\") pod \"fac835c9-8cbe-4691-90d9-c5eecebecb33\" (UID: \"fac835c9-8cbe-4691-90d9-c5eecebecb33\") "
Mar 09 14:04:04 crc kubenswrapper[4723]: I0309 14:04:04.383161 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac835c9-8cbe-4691-90d9-c5eecebecb33-kube-api-access-gnwft" (OuterVolumeSpecName: "kube-api-access-gnwft") pod "fac835c9-8cbe-4691-90d9-c5eecebecb33" (UID: "fac835c9-8cbe-4691-90d9-c5eecebecb33"). InnerVolumeSpecName "kube-api-access-gnwft". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:04:04 crc kubenswrapper[4723]: I0309 14:04:04.480807 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnwft\" (UniqueName: \"kubernetes.io/projected/fac835c9-8cbe-4691-90d9-c5eecebecb33-kube-api-access-gnwft\") on node \"crc\" DevicePath \"\""
Mar 09 14:04:04 crc kubenswrapper[4723]: I0309 14:04:04.818307 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551084-dvqp6" event={"ID":"fac835c9-8cbe-4691-90d9-c5eecebecb33","Type":"ContainerDied","Data":"3d748e16453306f2ea1bf8d707c2b857236d61004d7e9ec74d49b2a485da7482"}
Mar 09 14:04:04 crc kubenswrapper[4723]: I0309 14:04:04.818364 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d748e16453306f2ea1bf8d707c2b857236d61004d7e9ec74d49b2a485da7482"
Mar 09 14:04:04 crc kubenswrapper[4723]: I0309 14:04:04.818397 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551084-dvqp6"
Mar 09 14:04:05 crc kubenswrapper[4723]: I0309 14:04:05.368564 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551078-2gl7q"]
Mar 09 14:04:05 crc kubenswrapper[4723]: I0309 14:04:05.380948 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551078-2gl7q"]
Mar 09 14:04:06 crc kubenswrapper[4723]: I0309 14:04:06.893845 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c346e089-31ed-4a23-9cf6-deea92044f18" path="/var/lib/kubelet/pods/c346e089-31ed-4a23-9cf6-deea92044f18/volumes"
Mar 09 14:04:26 crc kubenswrapper[4723]: I0309 14:04:26.436028 4723 scope.go:117] "RemoveContainer" containerID="dfa03197eccb7721e3110377970a5e517f752eb14ecabdbaff7f5e0dcc8ea991"
Mar 09 14:04:33 crc kubenswrapper[4723]: I0309 14:04:33.946513 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:04:33 crc kubenswrapper[4723]: I0309 14:04:33.947056 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:04:33 crc kubenswrapper[4723]: I0309 14:04:33.947106 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2"
Mar 09 14:04:33 crc kubenswrapper[4723]: I0309 14:04:33.947701 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 14:04:33 crc kubenswrapper[4723]: I0309 14:04:33.947772 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" gracePeriod=600
Mar 09 14:04:34 crc kubenswrapper[4723]: E0309 14:04:34.069221 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 14:04:34 crc kubenswrapper[4723]: I0309 14:04:34.134576 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" exitCode=0
Mar 09 14:04:34 crc kubenswrapper[4723]: I0309 14:04:34.134628 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7"}
Mar 09 14:04:34 crc kubenswrapper[4723]: I0309 14:04:34.134667 4723 scope.go:117] "RemoveContainer" containerID="b9917d1efbbbf05f4cd1ae94e4d9bdfa5b2f0c6eaedeaba13758e7498a18fdc2"
Mar 09 14:04:34 crc kubenswrapper[4723]: I0309 14:04:34.135580 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7"
Mar 09 14:04:34 crc kubenswrapper[4723]: E0309 14:04:34.135978 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 14:04:46 crc kubenswrapper[4723]: I0309 14:04:46.895493 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7"
Mar 09 14:04:46 crc kubenswrapper[4723]: E0309 14:04:46.898974 4723 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.129:46292->38.102.83.129:35705: write tcp 38.102.83.129:46292->38.102.83.129:35705: write: broken pipe
Mar 09 14:04:46 crc kubenswrapper[4723]: E0309 14:04:46.900493 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 14:05:00 crc kubenswrapper[4723]: I0309 14:05:00.882093 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7"
Mar 09 14:05:00 crc kubenswrapper[4723]: E0309 14:05:00.883355 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 14:05:12 crc kubenswrapper[4723]: I0309 14:05:12.880689 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7"
Mar 09 14:05:12 crc kubenswrapper[4723]: E0309 14:05:12.881647 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 14:05:27 crc kubenswrapper[4723]: I0309 14:05:27.881431 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7"
Mar 09 14:05:27 crc kubenswrapper[4723]: E0309 14:05:27.882228 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 14:05:38 crc kubenswrapper[4723]: I0309 14:05:38.881970 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7"
Mar 09 14:05:38 crc kubenswrapper[4723]: E0309 14:05:38.882818 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.163869 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t6bvx"]
Mar 09 14:05:52 crc kubenswrapper[4723]: E0309 14:05:52.164924 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac835c9-8cbe-4691-90d9-c5eecebecb33" containerName="oc"
Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.164942 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac835c9-8cbe-4691-90d9-c5eecebecb33" containerName="oc"
Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.165220 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac835c9-8cbe-4691-90d9-c5eecebecb33" containerName="oc"
Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.167401 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6bvx"
Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.176376 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t6bvx"]
Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.222612 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ddb1a4-1a34-4a06-af19-769b546d0079-utilities\") pod \"redhat-operators-t6bvx\" (UID: \"16ddb1a4-1a34-4a06-af19-769b546d0079\") " pod="openshift-marketplace/redhat-operators-t6bvx"
Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.222760 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ddb1a4-1a34-4a06-af19-769b546d0079-catalog-content\") pod \"redhat-operators-t6bvx\" (UID: \"16ddb1a4-1a34-4a06-af19-769b546d0079\") " pod="openshift-marketplace/redhat-operators-t6bvx"
Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.222888 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4np8\" (UniqueName: \"kubernetes.io/projected/16ddb1a4-1a34-4a06-af19-769b546d0079-kube-api-access-j4np8\") pod \"redhat-operators-t6bvx\" (UID: \"16ddb1a4-1a34-4a06-af19-769b546d0079\") " pod="openshift-marketplace/redhat-operators-t6bvx"
Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.325340 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ddb1a4-1a34-4a06-af19-769b546d0079-utilities\") pod \"redhat-operators-t6bvx\" (UID: \"16ddb1a4-1a34-4a06-af19-769b546d0079\") " pod="openshift-marketplace/redhat-operators-t6bvx"
Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.325469 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ddb1a4-1a34-4a06-af19-769b546d0079-catalog-content\") pod \"redhat-operators-t6bvx\" (UID: \"16ddb1a4-1a34-4a06-af19-769b546d0079\") " pod="openshift-marketplace/redhat-operators-t6bvx"
Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.325539 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4np8\" (UniqueName: \"kubernetes.io/projected/16ddb1a4-1a34-4a06-af19-769b546d0079-kube-api-access-j4np8\") pod \"redhat-operators-t6bvx\" (UID: \"16ddb1a4-1a34-4a06-af19-769b546d0079\") " pod="openshift-marketplace/redhat-operators-t6bvx"
Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.327647 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ddb1a4-1a34-4a06-af19-769b546d0079-utilities\") pod \"redhat-operators-t6bvx\" (UID: \"16ddb1a4-1a34-4a06-af19-769b546d0079\") " pod="openshift-marketplace/redhat-operators-t6bvx"
Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.331007 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ddb1a4-1a34-4a06-af19-769b546d0079-catalog-content\") pod \"redhat-operators-t6bvx\" (UID: \"16ddb1a4-1a34-4a06-af19-769b546d0079\") " pod="openshift-marketplace/redhat-operators-t6bvx"
Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.349663 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4np8\" (UniqueName: \"kubernetes.io/projected/16ddb1a4-1a34-4a06-af19-769b546d0079-kube-api-access-j4np8\") pod \"redhat-operators-t6bvx\" (UID: \"16ddb1a4-1a34-4a06-af19-769b546d0079\") " pod="openshift-marketplace/redhat-operators-t6bvx"
\"kube-api-access-j4np8\" (UniqueName: \"kubernetes.io/projected/16ddb1a4-1a34-4a06-af19-769b546d0079-kube-api-access-j4np8\") pod \"redhat-operators-t6bvx\" (UID: \"16ddb1a4-1a34-4a06-af19-769b546d0079\") " pod="openshift-marketplace/redhat-operators-t6bvx" Mar 09 14:05:52 crc kubenswrapper[4723]: I0309 14:05:52.524721 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6bvx" Mar 09 14:05:53 crc kubenswrapper[4723]: I0309 14:05:53.039687 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t6bvx"] Mar 09 14:05:53 crc kubenswrapper[4723]: I0309 14:05:53.882205 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:05:53 crc kubenswrapper[4723]: E0309 14:05:53.884399 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:05:53 crc kubenswrapper[4723]: I0309 14:05:53.996525 4723 generic.go:334] "Generic (PLEG): container finished" podID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerID="6f3fb43d082a1b1826064584731c47d01b0bb74a54aa8bf22f654e3a970fbe1b" exitCode=0 Mar 09 14:05:53 crc kubenswrapper[4723]: I0309 14:05:53.996571 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6bvx" event={"ID":"16ddb1a4-1a34-4a06-af19-769b546d0079","Type":"ContainerDied","Data":"6f3fb43d082a1b1826064584731c47d01b0bb74a54aa8bf22f654e3a970fbe1b"} Mar 09 14:05:53 crc kubenswrapper[4723]: I0309 14:05:53.996595 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6bvx" event={"ID":"16ddb1a4-1a34-4a06-af19-769b546d0079","Type":"ContainerStarted","Data":"e0e7c3c57b7d8fad10350172813fdca8d499f8e3bf64168b263575eb991ea448"} Mar 09 14:05:56 crc kubenswrapper[4723]: I0309 14:05:56.017504 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6bvx" event={"ID":"16ddb1a4-1a34-4a06-af19-769b546d0079","Type":"ContainerStarted","Data":"ce14f986a5b474c9a4a387adedacd592782a996020b9f773b600b239a93eb5e0"} Mar 09 14:06:00 crc kubenswrapper[4723]: I0309 14:06:00.154066 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551086-brxt4"] Mar 09 14:06:00 crc kubenswrapper[4723]: I0309 14:06:00.156025 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-brxt4" Mar 09 14:06:00 crc kubenswrapper[4723]: I0309 14:06:00.158094 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 14:06:00 crc kubenswrapper[4723]: I0309 14:06:00.158376 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:06:00 crc kubenswrapper[4723]: I0309 14:06:00.159422 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:06:00 crc kubenswrapper[4723]: I0309 14:06:00.170157 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-brxt4"] Mar 09 14:06:00 crc kubenswrapper[4723]: I0309 14:06:00.312738 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtjc5\" (UniqueName: \"kubernetes.io/projected/1c5f83a7-4b1c-40da-9642-65a5df6acdda-kube-api-access-vtjc5\") pod \"auto-csr-approver-29551086-brxt4\" (UID: \"1c5f83a7-4b1c-40da-9642-65a5df6acdda\") " pod="openshift-infra/auto-csr-approver-29551086-brxt4" Mar 09 14:06:00 crc kubenswrapper[4723]: I0309 14:06:00.415637 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtjc5\" (UniqueName: \"kubernetes.io/projected/1c5f83a7-4b1c-40da-9642-65a5df6acdda-kube-api-access-vtjc5\") pod \"auto-csr-approver-29551086-brxt4\" (UID: \"1c5f83a7-4b1c-40da-9642-65a5df6acdda\") " pod="openshift-infra/auto-csr-approver-29551086-brxt4" Mar 09 14:06:00 crc kubenswrapper[4723]: I0309 14:06:00.438762 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtjc5\" (UniqueName: \"kubernetes.io/projected/1c5f83a7-4b1c-40da-9642-65a5df6acdda-kube-api-access-vtjc5\") pod \"auto-csr-approver-29551086-brxt4\" (UID: \"1c5f83a7-4b1c-40da-9642-65a5df6acdda\") " pod="openshift-infra/auto-csr-approver-29551086-brxt4" Mar 09 14:06:00 crc kubenswrapper[4723]: I0309 14:06:00.481709 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-brxt4" Mar 09 14:06:01 crc kubenswrapper[4723]: I0309 14:06:01.075489 4723 generic.go:334] "Generic (PLEG): container finished" podID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerID="ce14f986a5b474c9a4a387adedacd592782a996020b9f773b600b239a93eb5e0" exitCode=0 Mar 09 14:06:01 crc kubenswrapper[4723]: I0309 14:06:01.075589 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6bvx" event={"ID":"16ddb1a4-1a34-4a06-af19-769b546d0079","Type":"ContainerDied","Data":"ce14f986a5b474c9a4a387adedacd592782a996020b9f773b600b239a93eb5e0"} Mar 09 14:06:01 crc kubenswrapper[4723]: I0309 14:06:01.179705 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-brxt4"] Mar 09 14:06:01 crc kubenswrapper[4723]: W0309 14:06:01.181169 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c5f83a7_4b1c_40da_9642_65a5df6acdda.slice/crio-f222823ef6df02fa3fbcc0b79d059cb76d21ddb8dc4a9ec891d6f2ee700a4d68 WatchSource:0}: Error finding container f222823ef6df02fa3fbcc0b79d059cb76d21ddb8dc4a9ec891d6f2ee700a4d68: Status 404 returned error can't find the container with id f222823ef6df02fa3fbcc0b79d059cb76d21ddb8dc4a9ec891d6f2ee700a4d68 Mar 09 14:06:02 crc kubenswrapper[4723]: I0309 14:06:02.096825 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6bvx" event={"ID":"16ddb1a4-1a34-4a06-af19-769b546d0079","Type":"ContainerStarted","Data":"cb05ca21d1b793716575fef7970f8e6c635ad5bc5171e513e37f56590c10adc5"} Mar 09 14:06:02 crc kubenswrapper[4723]: I0309 14:06:02.098939 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-brxt4" event={"ID":"1c5f83a7-4b1c-40da-9642-65a5df6acdda","Type":"ContainerStarted","Data":"f222823ef6df02fa3fbcc0b79d059cb76d21ddb8dc4a9ec891d6f2ee700a4d68"} Mar 09 14:06:02 crc kubenswrapper[4723]: I0309 14:06:02.140595 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t6bvx" podStartSLOduration=2.574784863 podStartE2EDuration="10.140576501s" podCreationTimestamp="2026-03-09 14:05:52 +0000 UTC" firstStartedPulling="2026-03-09 14:05:53.998489508 +0000 UTC m=+4028.012957048" lastFinishedPulling="2026-03-09 14:06:01.564281146 +0000 UTC m=+4035.578748686" observedRunningTime="2026-03-09 14:06:02.1326876 +0000 UTC m=+4036.147155140" watchObservedRunningTime="2026-03-09 14:06:02.140576501 +0000 UTC m=+4036.155044041" Mar 09 14:06:02 crc kubenswrapper[4723]: I0309 14:06:02.525332 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t6bvx" Mar 09 14:06:02 crc kubenswrapper[4723]: I0309 14:06:02.525464 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t6bvx" Mar 09 14:06:03 crc kubenswrapper[4723]: I0309 14:06:03.113012 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-brxt4" event={"ID":"1c5f83a7-4b1c-40da-9642-65a5df6acdda","Type":"ContainerStarted","Data":"06c8d261467db592ba6f97f0abb4809b00ee522b7b82452ac6328feb1d904775"} Mar 09 14:06:03 crc kubenswrapper[4723]: I0309 14:06:03.128503 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551086-brxt4" 
Mar 09 14:06:03 crc kubenswrapper[4723]: I0309 14:06:03.584529 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t6bvx" podUID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerName="registry-server" probeResult="failure" output=<
Mar 09 14:06:03 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s
Mar 09 14:06:03 crc kubenswrapper[4723]: >
Mar 09 14:06:04 crc kubenswrapper[4723]: I0309 14:06:04.122605 4723 generic.go:334] "Generic (PLEG): container finished" podID="1c5f83a7-4b1c-40da-9642-65a5df6acdda" containerID="06c8d261467db592ba6f97f0abb4809b00ee522b7b82452ac6328feb1d904775" exitCode=0
Mar 09 14:06:04 crc kubenswrapper[4723]: I0309 14:06:04.122901 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-brxt4" event={"ID":"1c5f83a7-4b1c-40da-9642-65a5df6acdda","Type":"ContainerDied","Data":"06c8d261467db592ba6f97f0abb4809b00ee522b7b82452ac6328feb1d904775"}
Mar 09 14:06:05 crc kubenswrapper[4723]: I0309 14:06:05.534080 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-brxt4"
Mar 09 14:06:05 crc kubenswrapper[4723]: I0309 14:06:05.555825 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtjc5\" (UniqueName: \"kubernetes.io/projected/1c5f83a7-4b1c-40da-9642-65a5df6acdda-kube-api-access-vtjc5\") pod \"1c5f83a7-4b1c-40da-9642-65a5df6acdda\" (UID: \"1c5f83a7-4b1c-40da-9642-65a5df6acdda\") "
Mar 09 14:06:05 crc kubenswrapper[4723]: I0309 14:06:05.562450 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c5f83a7-4b1c-40da-9642-65a5df6acdda-kube-api-access-vtjc5" (OuterVolumeSpecName: "kube-api-access-vtjc5") pod "1c5f83a7-4b1c-40da-9642-65a5df6acdda" (UID: "1c5f83a7-4b1c-40da-9642-65a5df6acdda"). InnerVolumeSpecName "kube-api-access-vtjc5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:06:05 crc kubenswrapper[4723]: I0309 14:06:05.658511 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtjc5\" (UniqueName: \"kubernetes.io/projected/1c5f83a7-4b1c-40da-9642-65a5df6acdda-kube-api-access-vtjc5\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:06 crc kubenswrapper[4723]: I0309 14:06:06.150453 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-brxt4"
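
The repeated startup-probe output above (`timeout: failed to connect service ":50051" within 1s`) is the signature of a gRPC health check against the registry-server port with a one-second budget. One probe attempt might look roughly like this; the address and the insecure transport are assumptions, not the kubelet's exact configuration:

```go
// A single gRPC health-check attempt with a 1s deadline, mirroring the shape
// of the failing startup probe in the log.
package example

import (
	"context"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

func probeOnce(addr string) error {
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()
	conn, err := grpc.DialContext(ctx, addr,
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		grpc.WithBlock()) // fail within the 1s budget if nothing is listening
	if err != nil {
		return err // the "failed to connect service" case from the log
	}
	defer conn.Close()
	_, err = healthpb.NewHealthClient(conn).Check(ctx,
		&healthpb.HealthCheckRequest{Service: ""})
	return err
}
```

The catalog pod keeps failing this check while it loads its index; about a minute later the probe flips to status="started" and the pod goes ready.
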
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-brxt4" Mar 09 14:06:06 crc kubenswrapper[4723]: I0309 14:06:06.150369 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-brxt4" event={"ID":"1c5f83a7-4b1c-40da-9642-65a5df6acdda","Type":"ContainerDied","Data":"f222823ef6df02fa3fbcc0b79d059cb76d21ddb8dc4a9ec891d6f2ee700a4d68"} Mar 09 14:06:06 crc kubenswrapper[4723]: I0309 14:06:06.157057 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f222823ef6df02fa3fbcc0b79d059cb76d21ddb8dc4a9ec891d6f2ee700a4d68" Mar 09 14:06:06 crc kubenswrapper[4723]: I0309 14:06:06.211484 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551080-9r5cj"] Mar 09 14:06:06 crc kubenswrapper[4723]: I0309 14:06:06.224538 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551080-9r5cj"] Mar 09 14:06:06 crc kubenswrapper[4723]: I0309 14:06:06.893534 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707e76d7-5321-48ed-afb3-782f7a953315" path="/var/lib/kubelet/pods/707e76d7-5321-48ed-afb3-782f7a953315/volumes" Mar 09 14:06:07 crc kubenswrapper[4723]: I0309 14:06:07.882192 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:06:07 crc kubenswrapper[4723]: E0309 14:06:07.882893 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:06:13 crc kubenswrapper[4723]: I0309 14:06:13.590964 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t6bvx" podUID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerName="registry-server" probeResult="failure" output=< Mar 09 14:06:13 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:06:13 crc kubenswrapper[4723]: > Mar 09 14:06:18 crc kubenswrapper[4723]: I0309 14:06:18.880961 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:06:18 crc kubenswrapper[4723]: E0309 14:06:18.881762 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:06:23 crc kubenswrapper[4723]: I0309 14:06:23.576126 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t6bvx" podUID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerName="registry-server" probeResult="failure" output=< Mar 09 14:06:23 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:06:23 crc kubenswrapper[4723]: > Mar 09 14:06:26 crc kubenswrapper[4723]: I0309 14:06:26.587496 4723 scope.go:117] "RemoveContainer" 
containerID="13a75d979ca1ebd3f9c644bd7d6cabac562d49406ae6721992033a7b09ddb1a1" Mar 09 14:06:31 crc kubenswrapper[4723]: I0309 14:06:31.883475 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:06:31 crc kubenswrapper[4723]: E0309 14:06:31.884421 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:06:34 crc kubenswrapper[4723]: I0309 14:06:34.054805 4723 patch_prober.go:28] interesting pod/thanos-querier-f994cb665-42jsl container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:06:34 crc kubenswrapper[4723]: I0309 14:06:34.055507 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" podUID="338186cb-4546-4740-bba3-c1c430d8aacc" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.86:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:06:34 crc kubenswrapper[4723]: I0309 14:06:34.062785 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t6bvx" podUID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerName="registry-server" probeResult="failure" output=< Mar 09 14:06:34 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:06:34 crc kubenswrapper[4723]: > Mar 09 14:06:43 crc kubenswrapper[4723]: I0309 14:06:43.836946 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t6bvx" podUID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerName="registry-server" probeResult="failure" output=< Mar 09 14:06:43 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:06:43 crc kubenswrapper[4723]: > Mar 09 14:06:46 crc kubenswrapper[4723]: I0309 14:06:46.893791 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:06:46 crc kubenswrapper[4723]: E0309 14:06:46.895781 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:06:52 crc kubenswrapper[4723]: I0309 14:06:52.594096 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t6bvx" Mar 09 14:06:52 crc kubenswrapper[4723]: I0309 14:06:52.766521 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t6bvx" Mar 09 14:06:53 crc kubenswrapper[4723]: I0309 14:06:53.403370 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-t6bvx"] Mar 09 14:06:53 crc kubenswrapper[4723]: I0309 14:06:53.649820 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t6bvx" podUID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerName="registry-server" containerID="cri-o://cb05ca21d1b793716575fef7970f8e6c635ad5bc5171e513e37f56590c10adc5" gracePeriod=2 Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.427166 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6bvx" Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.498373 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ddb1a4-1a34-4a06-af19-769b546d0079-utilities\") pod \"16ddb1a4-1a34-4a06-af19-769b546d0079\" (UID: \"16ddb1a4-1a34-4a06-af19-769b546d0079\") " Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.498431 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4np8\" (UniqueName: \"kubernetes.io/projected/16ddb1a4-1a34-4a06-af19-769b546d0079-kube-api-access-j4np8\") pod \"16ddb1a4-1a34-4a06-af19-769b546d0079\" (UID: \"16ddb1a4-1a34-4a06-af19-769b546d0079\") " Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.498520 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ddb1a4-1a34-4a06-af19-769b546d0079-catalog-content\") pod \"16ddb1a4-1a34-4a06-af19-769b546d0079\" (UID: \"16ddb1a4-1a34-4a06-af19-769b546d0079\") " Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.499344 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16ddb1a4-1a34-4a06-af19-769b546d0079-utilities" (OuterVolumeSpecName: "utilities") pod "16ddb1a4-1a34-4a06-af19-769b546d0079" (UID: "16ddb1a4-1a34-4a06-af19-769b546d0079"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.505154 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ddb1a4-1a34-4a06-af19-769b546d0079-kube-api-access-j4np8" (OuterVolumeSpecName: "kube-api-access-j4np8") pod "16ddb1a4-1a34-4a06-af19-769b546d0079" (UID: "16ddb1a4-1a34-4a06-af19-769b546d0079"). InnerVolumeSpecName "kube-api-access-j4np8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.602211 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ddb1a4-1a34-4a06-af19-769b546d0079-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.602251 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4np8\" (UniqueName: \"kubernetes.io/projected/16ddb1a4-1a34-4a06-af19-769b546d0079-kube-api-access-j4np8\") on node \"crc\" DevicePath \"\"" Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.661720 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16ddb1a4-1a34-4a06-af19-769b546d0079-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16ddb1a4-1a34-4a06-af19-769b546d0079" (UID: "16ddb1a4-1a34-4a06-af19-769b546d0079"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.663156 4723 generic.go:334] "Generic (PLEG): container finished" podID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerID="cb05ca21d1b793716575fef7970f8e6c635ad5bc5171e513e37f56590c10adc5" exitCode=0 Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.663198 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6bvx" event={"ID":"16ddb1a4-1a34-4a06-af19-769b546d0079","Type":"ContainerDied","Data":"cb05ca21d1b793716575fef7970f8e6c635ad5bc5171e513e37f56590c10adc5"} Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.663231 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6bvx" event={"ID":"16ddb1a4-1a34-4a06-af19-769b546d0079","Type":"ContainerDied","Data":"e0e7c3c57b7d8fad10350172813fdca8d499f8e3bf64168b263575eb991ea448"} Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.663250 4723 scope.go:117] "RemoveContainer" containerID="cb05ca21d1b793716575fef7970f8e6c635ad5bc5171e513e37f56590c10adc5" Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.663399 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6bvx" Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.704740 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ddb1a4-1a34-4a06-af19-769b546d0079-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.707669 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t6bvx"] Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.725413 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t6bvx"] Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.725492 4723 scope.go:117] "RemoveContainer" containerID="ce14f986a5b474c9a4a387adedacd592782a996020b9f773b600b239a93eb5e0" Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.773023 4723 scope.go:117] "RemoveContainer" containerID="6f3fb43d082a1b1826064584731c47d01b0bb74a54aa8bf22f654e3a970fbe1b" Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.820197 4723 scope.go:117] "RemoveContainer" containerID="cb05ca21d1b793716575fef7970f8e6c635ad5bc5171e513e37f56590c10adc5" Mar 09 14:06:54 crc kubenswrapper[4723]: E0309 14:06:54.821104 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb05ca21d1b793716575fef7970f8e6c635ad5bc5171e513e37f56590c10adc5\": container with ID starting with cb05ca21d1b793716575fef7970f8e6c635ad5bc5171e513e37f56590c10adc5 not found: ID does not exist" containerID="cb05ca21d1b793716575fef7970f8e6c635ad5bc5171e513e37f56590c10adc5" Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.821137 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb05ca21d1b793716575fef7970f8e6c635ad5bc5171e513e37f56590c10adc5"} err="failed to get container status \"cb05ca21d1b793716575fef7970f8e6c635ad5bc5171e513e37f56590c10adc5\": rpc error: code = NotFound desc = could not find container \"cb05ca21d1b793716575fef7970f8e6c635ad5bc5171e513e37f56590c10adc5\": container with ID starting with cb05ca21d1b793716575fef7970f8e6c635ad5bc5171e513e37f56590c10adc5 not found: ID does not exist" Mar 09 14:06:54 crc 
Mar 09 14:06:54 crc kubenswrapper[4723]: E0309 14:06:54.821814 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce14f986a5b474c9a4a387adedacd592782a996020b9f773b600b239a93eb5e0\": container with ID starting with ce14f986a5b474c9a4a387adedacd592782a996020b9f773b600b239a93eb5e0 not found: ID does not exist" containerID="ce14f986a5b474c9a4a387adedacd592782a996020b9f773b600b239a93eb5e0"
Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.821919 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce14f986a5b474c9a4a387adedacd592782a996020b9f773b600b239a93eb5e0"} err="failed to get container status \"ce14f986a5b474c9a4a387adedacd592782a996020b9f773b600b239a93eb5e0\": rpc error: code = NotFound desc = could not find container \"ce14f986a5b474c9a4a387adedacd592782a996020b9f773b600b239a93eb5e0\": container with ID starting with ce14f986a5b474c9a4a387adedacd592782a996020b9f773b600b239a93eb5e0 not found: ID does not exist"
Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.821959 4723 scope.go:117] "RemoveContainer" containerID="6f3fb43d082a1b1826064584731c47d01b0bb74a54aa8bf22f654e3a970fbe1b"
Mar 09 14:06:54 crc kubenswrapper[4723]: E0309 14:06:54.822336 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f3fb43d082a1b1826064584731c47d01b0bb74a54aa8bf22f654e3a970fbe1b\": container with ID starting with 6f3fb43d082a1b1826064584731c47d01b0bb74a54aa8bf22f654e3a970fbe1b not found: ID does not exist" containerID="6f3fb43d082a1b1826064584731c47d01b0bb74a54aa8bf22f654e3a970fbe1b"
Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.822361 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f3fb43d082a1b1826064584731c47d01b0bb74a54aa8bf22f654e3a970fbe1b"} err="failed to get container status \"6f3fb43d082a1b1826064584731c47d01b0bb74a54aa8bf22f654e3a970fbe1b\": rpc error: code = NotFound desc = could not find container \"6f3fb43d082a1b1826064584731c47d01b0bb74a54aa8bf22f654e3a970fbe1b\": container with ID starting with 6f3fb43d082a1b1826064584731c47d01b0bb74a54aa8bf22f654e3a970fbe1b not found: ID does not exist"
Mar 09 14:06:54 crc kubenswrapper[4723]: I0309 14:06:54.905049 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ddb1a4-1a34-4a06-af19-769b546d0079" path="/var/lib/kubelet/pods/16ddb1a4-1a34-4a06-af19-769b546d0079/volumes"
Mar 09 14:06:59 crc kubenswrapper[4723]: I0309 14:06:59.881163 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7"
Mar 09 14:06:59 crc kubenswrapper[4723]: E0309 14:06:59.881956 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 14:07:14 crc kubenswrapper[4723]: I0309 14:07:14.881126 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7"
Mar 09 14:07:14 crc kubenswrapper[4723]: E0309 14:07:14.881935 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 14:07:27 crc kubenswrapper[4723]: I0309 14:07:27.881721 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7"
Mar 09 14:07:27 crc kubenswrapper[4723]: E0309 14:07:27.883175 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 14:07:42 crc kubenswrapper[4723]: I0309 14:07:42.881400 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7"
Mar 09 14:07:42 crc kubenswrapper[4723]: E0309 14:07:42.882236 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.022738 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rlhjc"]
Mar 09 14:07:51 crc kubenswrapper[4723]: E0309 14:07:51.024336 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerName="extract-content"
Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.024356 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerName="extract-content"
Mar 09 14:07:51 crc kubenswrapper[4723]: E0309 14:07:51.024375 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerName="registry-server"
Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.024382 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerName="registry-server"
Mar 09 14:07:51 crc kubenswrapper[4723]: E0309 14:07:51.024405 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerName="extract-utilities"
Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.024416 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerName="extract-utilities"
Mar 09 14:07:51 crc kubenswrapper[4723]: E0309 14:07:51.024446 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c5f83a7-4b1c-40da-9642-65a5df6acdda" containerName="oc"
Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.024453 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c5f83a7-4b1c-40da-9642-65a5df6acdda" containerName="oc"
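
The cpu_manager/memory_manager burst above runs when a new pod is admitted: RemoveStaleState sweeps checkpointed per-container resource assignments whose pods are gone, and the error level is just how each reclaimed entry is reported. A simplified model of the sweep; the types and names here are illustrative, not kubelet internals:

```go
// Drop per-container assignments belonging to pods that no longer exist,
// corresponding to the "RemoveStaleState: removing container" /
// "Deleted CPUSet assignment" pairs in the log.
package example

type containerKey struct{ podUID, container string }

func removeStaleState(assignments map[containerKey][]int, activePods map[string]bool) {
	for key := range assignments {
		if !activePods[key.podUID] {
			delete(assignments, key) // deleting during range is safe in Go
		}
	}
}
```
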
containerName="oc" Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.024707 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ddb1a4-1a34-4a06-af19-769b546d0079" containerName="registry-server" Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.024723 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c5f83a7-4b1c-40da-9642-65a5df6acdda" containerName="oc" Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.026530 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.051356 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rlhjc"] Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.098482 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd50c257-b948-4187-98af-a7b71768eb04-catalog-content\") pod \"certified-operators-rlhjc\" (UID: \"cd50c257-b948-4187-98af-a7b71768eb04\") " pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.098567 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd50c257-b948-4187-98af-a7b71768eb04-utilities\") pod \"certified-operators-rlhjc\" (UID: \"cd50c257-b948-4187-98af-a7b71768eb04\") " pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.098758 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzmmq\" (UniqueName: \"kubernetes.io/projected/cd50c257-b948-4187-98af-a7b71768eb04-kube-api-access-fzmmq\") pod \"certified-operators-rlhjc\" (UID: \"cd50c257-b948-4187-98af-a7b71768eb04\") " pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.200650 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzmmq\" (UniqueName: \"kubernetes.io/projected/cd50c257-b948-4187-98af-a7b71768eb04-kube-api-access-fzmmq\") pod \"certified-operators-rlhjc\" (UID: \"cd50c257-b948-4187-98af-a7b71768eb04\") " pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.200845 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd50c257-b948-4187-98af-a7b71768eb04-catalog-content\") pod \"certified-operators-rlhjc\" (UID: \"cd50c257-b948-4187-98af-a7b71768eb04\") " pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.200930 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd50c257-b948-4187-98af-a7b71768eb04-utilities\") pod \"certified-operators-rlhjc\" (UID: \"cd50c257-b948-4187-98af-a7b71768eb04\") " pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.201342 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd50c257-b948-4187-98af-a7b71768eb04-catalog-content\") pod \"certified-operators-rlhjc\" (UID: \"cd50c257-b948-4187-98af-a7b71768eb04\") " 
pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.201387 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd50c257-b948-4187-98af-a7b71768eb04-utilities\") pod \"certified-operators-rlhjc\" (UID: \"cd50c257-b948-4187-98af-a7b71768eb04\") " pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.221760 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzmmq\" (UniqueName: \"kubernetes.io/projected/cd50c257-b948-4187-98af-a7b71768eb04-kube-api-access-fzmmq\") pod \"certified-operators-rlhjc\" (UID: \"cd50c257-b948-4187-98af-a7b71768eb04\") " pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.350753 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:07:51 crc kubenswrapper[4723]: I0309 14:07:51.990050 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rlhjc"] Mar 09 14:07:52 crc kubenswrapper[4723]: I0309 14:07:52.293186 4723 generic.go:334] "Generic (PLEG): container finished" podID="cd50c257-b948-4187-98af-a7b71768eb04" containerID="ab5f463d6e694cb00607f6a0fff2f5db68da02fb998f3c8070092859255d8c8e" exitCode=0 Mar 09 14:07:52 crc kubenswrapper[4723]: I0309 14:07:52.293315 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlhjc" event={"ID":"cd50c257-b948-4187-98af-a7b71768eb04","Type":"ContainerDied","Data":"ab5f463d6e694cb00607f6a0fff2f5db68da02fb998f3c8070092859255d8c8e"} Mar 09 14:07:52 crc kubenswrapper[4723]: I0309 14:07:52.293455 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlhjc" event={"ID":"cd50c257-b948-4187-98af-a7b71768eb04","Type":"ContainerStarted","Data":"de239c2ef8f1643f5abf1d941a44f891642757e560c98efabc8e2974eeea3b8f"} Mar 09 14:07:53 crc kubenswrapper[4723]: I0309 14:07:53.305007 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlhjc" event={"ID":"cd50c257-b948-4187-98af-a7b71768eb04","Type":"ContainerStarted","Data":"97ee7a05e931e4da70d3bf1eb491c59d0c1e8c2fe7ce4bf913eb5f63585c5de2"} Mar 09 14:07:55 crc kubenswrapper[4723]: I0309 14:07:55.881586 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:07:55 crc kubenswrapper[4723]: E0309 14:07:55.882804 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:07:56 crc kubenswrapper[4723]: I0309 14:07:56.343735 4723 generic.go:334] "Generic (PLEG): container finished" podID="cd50c257-b948-4187-98af-a7b71768eb04" containerID="97ee7a05e931e4da70d3bf1eb491c59d0c1e8c2fe7ce4bf913eb5f63585c5de2" exitCode=0 Mar 09 14:07:56 crc kubenswrapper[4723]: I0309 14:07:56.343786 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlhjc" 
event={"ID":"cd50c257-b948-4187-98af-a7b71768eb04","Type":"ContainerDied","Data":"97ee7a05e931e4da70d3bf1eb491c59d0c1e8c2fe7ce4bf913eb5f63585c5de2"} Mar 09 14:07:57 crc kubenswrapper[4723]: I0309 14:07:57.354543 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlhjc" event={"ID":"cd50c257-b948-4187-98af-a7b71768eb04","Type":"ContainerStarted","Data":"0d9da217c2a6cd4d5253674f8937efc6289c3eaf833f4ea875c058ddd402bf4b"} Mar 09 14:07:57 crc kubenswrapper[4723]: I0309 14:07:57.390662 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rlhjc" podStartSLOduration=1.887991237 podStartE2EDuration="6.390638172s" podCreationTimestamp="2026-03-09 14:07:51 +0000 UTC" firstStartedPulling="2026-03-09 14:07:52.295479971 +0000 UTC m=+4146.309947511" lastFinishedPulling="2026-03-09 14:07:56.798126906 +0000 UTC m=+4150.812594446" observedRunningTime="2026-03-09 14:07:57.38231752 +0000 UTC m=+4151.396785060" watchObservedRunningTime="2026-03-09 14:07:57.390638172 +0000 UTC m=+4151.405105712" Mar 09 14:08:00 crc kubenswrapper[4723]: I0309 14:08:00.147925 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551088-ccmrx"] Mar 09 14:08:00 crc kubenswrapper[4723]: I0309 14:08:00.150353 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-ccmrx" Mar 09 14:08:00 crc kubenswrapper[4723]: I0309 14:08:00.153829 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:08:00 crc kubenswrapper[4723]: I0309 14:08:00.154175 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:08:00 crc kubenswrapper[4723]: I0309 14:08:00.156357 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 14:08:00 crc kubenswrapper[4723]: I0309 14:08:00.168163 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-ccmrx"] Mar 09 14:08:00 crc kubenswrapper[4723]: I0309 14:08:00.263524 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trzb6\" (UniqueName: \"kubernetes.io/projected/b413173e-1c5b-4fc7-9536-0fc73d3feaa3-kube-api-access-trzb6\") pod \"auto-csr-approver-29551088-ccmrx\" (UID: \"b413173e-1c5b-4fc7-9536-0fc73d3feaa3\") " pod="openshift-infra/auto-csr-approver-29551088-ccmrx" Mar 09 14:08:00 crc kubenswrapper[4723]: I0309 14:08:00.365966 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trzb6\" (UniqueName: \"kubernetes.io/projected/b413173e-1c5b-4fc7-9536-0fc73d3feaa3-kube-api-access-trzb6\") pod \"auto-csr-approver-29551088-ccmrx\" (UID: \"b413173e-1c5b-4fc7-9536-0fc73d3feaa3\") " pod="openshift-infra/auto-csr-approver-29551088-ccmrx" Mar 09 14:08:00 crc kubenswrapper[4723]: I0309 14:08:00.695996 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trzb6\" (UniqueName: \"kubernetes.io/projected/b413173e-1c5b-4fc7-9536-0fc73d3feaa3-kube-api-access-trzb6\") pod \"auto-csr-approver-29551088-ccmrx\" (UID: \"b413173e-1c5b-4fc7-9536-0fc73d3feaa3\") " pod="openshift-infra/auto-csr-approver-29551088-ccmrx" Mar 09 14:08:00 crc kubenswrapper[4723]: I0309 14:08:00.833881 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-ccmrx" Mar 09 14:08:01 crc kubenswrapper[4723]: I0309 14:08:01.351528 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:08:01 crc kubenswrapper[4723]: I0309 14:08:01.351922 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:08:01 crc kubenswrapper[4723]: I0309 14:08:01.400347 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-ccmrx"] Mar 09 14:08:01 crc kubenswrapper[4723]: I0309 14:08:01.420715 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:08:02 crc kubenswrapper[4723]: I0309 14:08:02.404648 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551088-ccmrx" event={"ID":"b413173e-1c5b-4fc7-9536-0fc73d3feaa3","Type":"ContainerStarted","Data":"46d56ecf676a30f2f4c2be79cae5b4efc10bdacd53041bccbf3f2dcbdcb332b3"} Mar 09 14:08:03 crc kubenswrapper[4723]: I0309 14:08:03.415566 4723 generic.go:334] "Generic (PLEG): container finished" podID="b413173e-1c5b-4fc7-9536-0fc73d3feaa3" containerID="44947baa317e7f4abdfe531801f00827c46088386e6150ccadeacebc8083bfc0" exitCode=0 Mar 09 14:08:03 crc kubenswrapper[4723]: I0309 14:08:03.415619 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551088-ccmrx" event={"ID":"b413173e-1c5b-4fc7-9536-0fc73d3feaa3","Type":"ContainerDied","Data":"44947baa317e7f4abdfe531801f00827c46088386e6150ccadeacebc8083bfc0"} Mar 09 14:08:04 crc kubenswrapper[4723]: I0309 14:08:04.807198 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-ccmrx" Mar 09 14:08:04 crc kubenswrapper[4723]: I0309 14:08:04.880368 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trzb6\" (UniqueName: \"kubernetes.io/projected/b413173e-1c5b-4fc7-9536-0fc73d3feaa3-kube-api-access-trzb6\") pod \"b413173e-1c5b-4fc7-9536-0fc73d3feaa3\" (UID: \"b413173e-1c5b-4fc7-9536-0fc73d3feaa3\") " Mar 09 14:08:04 crc kubenswrapper[4723]: I0309 14:08:04.885714 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b413173e-1c5b-4fc7-9536-0fc73d3feaa3-kube-api-access-trzb6" (OuterVolumeSpecName: "kube-api-access-trzb6") pod "b413173e-1c5b-4fc7-9536-0fc73d3feaa3" (UID: "b413173e-1c5b-4fc7-9536-0fc73d3feaa3"). InnerVolumeSpecName "kube-api-access-trzb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:08:04 crc kubenswrapper[4723]: I0309 14:08:04.984144 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trzb6\" (UniqueName: \"kubernetes.io/projected/b413173e-1c5b-4fc7-9536-0fc73d3feaa3-kube-api-access-trzb6\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:05 crc kubenswrapper[4723]: I0309 14:08:05.435841 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551088-ccmrx" event={"ID":"b413173e-1c5b-4fc7-9536-0fc73d3feaa3","Type":"ContainerDied","Data":"46d56ecf676a30f2f4c2be79cae5b4efc10bdacd53041bccbf3f2dcbdcb332b3"} Mar 09 14:08:05 crc kubenswrapper[4723]: I0309 14:08:05.436092 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46d56ecf676a30f2f4c2be79cae5b4efc10bdacd53041bccbf3f2dcbdcb332b3" Mar 09 14:08:05 crc kubenswrapper[4723]: I0309 14:08:05.435920 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-ccmrx" Mar 09 14:08:05 crc kubenswrapper[4723]: I0309 14:08:05.883840 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551082-x77jw"] Mar 09 14:08:05 crc kubenswrapper[4723]: I0309 14:08:05.900371 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551082-x77jw"] Mar 09 14:08:06 crc kubenswrapper[4723]: I0309 14:08:06.895183 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03dc001b-84c1-40ab-8357-92909562c177" path="/var/lib/kubelet/pods/03dc001b-84c1-40ab-8357-92909562c177/volumes" Mar 09 14:08:07 crc kubenswrapper[4723]: I0309 14:08:07.882052 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:08:07 crc kubenswrapper[4723]: E0309 14:08:07.882490 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:08:11 crc kubenswrapper[4723]: I0309 14:08:11.454614 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:08:11 crc kubenswrapper[4723]: I0309 14:08:11.527748 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rlhjc"] Mar 09 14:08:11 crc kubenswrapper[4723]: I0309 14:08:11.528081 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rlhjc" podUID="cd50c257-b948-4187-98af-a7b71768eb04" containerName="registry-server" containerID="cri-o://0d9da217c2a6cd4d5253674f8937efc6289c3eaf833f4ea875c058ddd402bf4b" gracePeriod=2 Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.125191 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.267618 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd50c257-b948-4187-98af-a7b71768eb04-catalog-content\") pod \"cd50c257-b948-4187-98af-a7b71768eb04\" (UID: \"cd50c257-b948-4187-98af-a7b71768eb04\") " Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.267710 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd50c257-b948-4187-98af-a7b71768eb04-utilities\") pod \"cd50c257-b948-4187-98af-a7b71768eb04\" (UID: \"cd50c257-b948-4187-98af-a7b71768eb04\") " Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.267746 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzmmq\" (UniqueName: \"kubernetes.io/projected/cd50c257-b948-4187-98af-a7b71768eb04-kube-api-access-fzmmq\") pod \"cd50c257-b948-4187-98af-a7b71768eb04\" (UID: \"cd50c257-b948-4187-98af-a7b71768eb04\") " Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.268621 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd50c257-b948-4187-98af-a7b71768eb04-utilities" (OuterVolumeSpecName: "utilities") pod "cd50c257-b948-4187-98af-a7b71768eb04" (UID: "cd50c257-b948-4187-98af-a7b71768eb04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.275681 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd50c257-b948-4187-98af-a7b71768eb04-kube-api-access-fzmmq" (OuterVolumeSpecName: "kube-api-access-fzmmq") pod "cd50c257-b948-4187-98af-a7b71768eb04" (UID: "cd50c257-b948-4187-98af-a7b71768eb04"). InnerVolumeSpecName "kube-api-access-fzmmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.359375 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd50c257-b948-4187-98af-a7b71768eb04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd50c257-b948-4187-98af-a7b71768eb04" (UID: "cd50c257-b948-4187-98af-a7b71768eb04"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.370627 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd50c257-b948-4187-98af-a7b71768eb04-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.370660 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd50c257-b948-4187-98af-a7b71768eb04-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.370670 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzmmq\" (UniqueName: \"kubernetes.io/projected/cd50c257-b948-4187-98af-a7b71768eb04-kube-api-access-fzmmq\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.521992 4723 generic.go:334] "Generic (PLEG): container finished" podID="cd50c257-b948-4187-98af-a7b71768eb04" containerID="0d9da217c2a6cd4d5253674f8937efc6289c3eaf833f4ea875c058ddd402bf4b" exitCode=0 Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.522067 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlhjc" event={"ID":"cd50c257-b948-4187-98af-a7b71768eb04","Type":"ContainerDied","Data":"0d9da217c2a6cd4d5253674f8937efc6289c3eaf833f4ea875c058ddd402bf4b"} Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.522080 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlhjc" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.522105 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlhjc" event={"ID":"cd50c257-b948-4187-98af-a7b71768eb04","Type":"ContainerDied","Data":"de239c2ef8f1643f5abf1d941a44f891642757e560c98efabc8e2974eeea3b8f"} Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.522126 4723 scope.go:117] "RemoveContainer" containerID="0d9da217c2a6cd4d5253674f8937efc6289c3eaf833f4ea875c058ddd402bf4b" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.558790 4723 scope.go:117] "RemoveContainer" containerID="97ee7a05e931e4da70d3bf1eb491c59d0c1e8c2fe7ce4bf913eb5f63585c5de2" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.571811 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rlhjc"] Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.589907 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rlhjc"] Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.607421 4723 scope.go:117] "RemoveContainer" containerID="ab5f463d6e694cb00607f6a0fff2f5db68da02fb998f3c8070092859255d8c8e" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.663996 4723 scope.go:117] "RemoveContainer" containerID="0d9da217c2a6cd4d5253674f8937efc6289c3eaf833f4ea875c058ddd402bf4b" Mar 09 14:08:12 crc kubenswrapper[4723]: E0309 14:08:12.664511 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d9da217c2a6cd4d5253674f8937efc6289c3eaf833f4ea875c058ddd402bf4b\": container with ID starting with 0d9da217c2a6cd4d5253674f8937efc6289c3eaf833f4ea875c058ddd402bf4b not found: ID does not exist" containerID="0d9da217c2a6cd4d5253674f8937efc6289c3eaf833f4ea875c058ddd402bf4b" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.664546 
4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d9da217c2a6cd4d5253674f8937efc6289c3eaf833f4ea875c058ddd402bf4b"} err="failed to get container status \"0d9da217c2a6cd4d5253674f8937efc6289c3eaf833f4ea875c058ddd402bf4b\": rpc error: code = NotFound desc = could not find container \"0d9da217c2a6cd4d5253674f8937efc6289c3eaf833f4ea875c058ddd402bf4b\": container with ID starting with 0d9da217c2a6cd4d5253674f8937efc6289c3eaf833f4ea875c058ddd402bf4b not found: ID does not exist" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.664568 4723 scope.go:117] "RemoveContainer" containerID="97ee7a05e931e4da70d3bf1eb491c59d0c1e8c2fe7ce4bf913eb5f63585c5de2" Mar 09 14:08:12 crc kubenswrapper[4723]: E0309 14:08:12.665165 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97ee7a05e931e4da70d3bf1eb491c59d0c1e8c2fe7ce4bf913eb5f63585c5de2\": container with ID starting with 97ee7a05e931e4da70d3bf1eb491c59d0c1e8c2fe7ce4bf913eb5f63585c5de2 not found: ID does not exist" containerID="97ee7a05e931e4da70d3bf1eb491c59d0c1e8c2fe7ce4bf913eb5f63585c5de2" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.665189 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97ee7a05e931e4da70d3bf1eb491c59d0c1e8c2fe7ce4bf913eb5f63585c5de2"} err="failed to get container status \"97ee7a05e931e4da70d3bf1eb491c59d0c1e8c2fe7ce4bf913eb5f63585c5de2\": rpc error: code = NotFound desc = could not find container \"97ee7a05e931e4da70d3bf1eb491c59d0c1e8c2fe7ce4bf913eb5f63585c5de2\": container with ID starting with 97ee7a05e931e4da70d3bf1eb491c59d0c1e8c2fe7ce4bf913eb5f63585c5de2 not found: ID does not exist" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.665201 4723 scope.go:117] "RemoveContainer" containerID="ab5f463d6e694cb00607f6a0fff2f5db68da02fb998f3c8070092859255d8c8e" Mar 09 14:08:12 crc kubenswrapper[4723]: E0309 14:08:12.665509 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab5f463d6e694cb00607f6a0fff2f5db68da02fb998f3c8070092859255d8c8e\": container with ID starting with ab5f463d6e694cb00607f6a0fff2f5db68da02fb998f3c8070092859255d8c8e not found: ID does not exist" containerID="ab5f463d6e694cb00607f6a0fff2f5db68da02fb998f3c8070092859255d8c8e" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.665552 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab5f463d6e694cb00607f6a0fff2f5db68da02fb998f3c8070092859255d8c8e"} err="failed to get container status \"ab5f463d6e694cb00607f6a0fff2f5db68da02fb998f3c8070092859255d8c8e\": rpc error: code = NotFound desc = could not find container \"ab5f463d6e694cb00607f6a0fff2f5db68da02fb998f3c8070092859255d8c8e\": container with ID starting with ab5f463d6e694cb00607f6a0fff2f5db68da02fb998f3c8070092859255d8c8e not found: ID does not exist" Mar 09 14:08:12 crc kubenswrapper[4723]: I0309 14:08:12.896234 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd50c257-b948-4187-98af-a7b71768eb04" path="/var/lib/kubelet/pods/cd50c257-b948-4187-98af-a7b71768eb04/volumes" Mar 09 14:08:19 crc kubenswrapper[4723]: I0309 14:08:19.881085 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:08:19 crc kubenswrapper[4723]: E0309 14:08:19.881947 4723 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:08:26 crc kubenswrapper[4723]: I0309 14:08:26.727412 4723 scope.go:117] "RemoveContainer" containerID="bd1e5f8b4e5a2309315267c54bd1a7e2b75f35c70a175866a745500ca499cec9" Mar 09 14:08:34 crc kubenswrapper[4723]: I0309 14:08:34.881882 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:08:34 crc kubenswrapper[4723]: E0309 14:08:34.882784 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.348354 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lsxml"] Mar 09 14:08:45 crc kubenswrapper[4723]: E0309 14:08:45.349537 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd50c257-b948-4187-98af-a7b71768eb04" containerName="extract-content" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.349555 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd50c257-b948-4187-98af-a7b71768eb04" containerName="extract-content" Mar 09 14:08:45 crc kubenswrapper[4723]: E0309 14:08:45.349577 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd50c257-b948-4187-98af-a7b71768eb04" containerName="extract-utilities" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.349586 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd50c257-b948-4187-98af-a7b71768eb04" containerName="extract-utilities" Mar 09 14:08:45 crc kubenswrapper[4723]: E0309 14:08:45.349631 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b413173e-1c5b-4fc7-9536-0fc73d3feaa3" containerName="oc" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.349640 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="b413173e-1c5b-4fc7-9536-0fc73d3feaa3" containerName="oc" Mar 09 14:08:45 crc kubenswrapper[4723]: E0309 14:08:45.349664 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd50c257-b948-4187-98af-a7b71768eb04" containerName="registry-server" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.349674 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd50c257-b948-4187-98af-a7b71768eb04" containerName="registry-server" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.349974 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd50c257-b948-4187-98af-a7b71768eb04" containerName="registry-server" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.349998 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="b413173e-1c5b-4fc7-9536-0fc73d3feaa3" containerName="oc" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.352241 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.371343 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lsxml"] Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.461885 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d218bbea-0065-4906-a512-d4683d158463-catalog-content\") pod \"community-operators-lsxml\" (UID: \"d218bbea-0065-4906-a512-d4683d158463\") " pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.462012 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g88g8\" (UniqueName: \"kubernetes.io/projected/d218bbea-0065-4906-a512-d4683d158463-kube-api-access-g88g8\") pod \"community-operators-lsxml\" (UID: \"d218bbea-0065-4906-a512-d4683d158463\") " pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.462304 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d218bbea-0065-4906-a512-d4683d158463-utilities\") pod \"community-operators-lsxml\" (UID: \"d218bbea-0065-4906-a512-d4683d158463\") " pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.565077 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d218bbea-0065-4906-a512-d4683d158463-catalog-content\") pod \"community-operators-lsxml\" (UID: \"d218bbea-0065-4906-a512-d4683d158463\") " pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.565257 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g88g8\" (UniqueName: \"kubernetes.io/projected/d218bbea-0065-4906-a512-d4683d158463-kube-api-access-g88g8\") pod \"community-operators-lsxml\" (UID: \"d218bbea-0065-4906-a512-d4683d158463\") " pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.565344 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d218bbea-0065-4906-a512-d4683d158463-utilities\") pod \"community-operators-lsxml\" (UID: \"d218bbea-0065-4906-a512-d4683d158463\") " pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.565773 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d218bbea-0065-4906-a512-d4683d158463-catalog-content\") pod \"community-operators-lsxml\" (UID: \"d218bbea-0065-4906-a512-d4683d158463\") " pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.565998 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d218bbea-0065-4906-a512-d4683d158463-utilities\") pod \"community-operators-lsxml\" (UID: \"d218bbea-0065-4906-a512-d4683d158463\") " pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.589330 4723 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g88g8\" (UniqueName: \"kubernetes.io/projected/d218bbea-0065-4906-a512-d4683d158463-kube-api-access-g88g8\") pod \"community-operators-lsxml\" (UID: \"d218bbea-0065-4906-a512-d4683d158463\") " pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:45 crc kubenswrapper[4723]: I0309 14:08:45.729201 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:46 crc kubenswrapper[4723]: I0309 14:08:46.389809 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lsxml"] Mar 09 14:08:46 crc kubenswrapper[4723]: W0309 14:08:46.701950 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd218bbea_0065_4906_a512_d4683d158463.slice/crio-c88318945e60bed454eeb1761c469be42c9cd82c475475183170b564306758a3 WatchSource:0}: Error finding container c88318945e60bed454eeb1761c469be42c9cd82c475475183170b564306758a3: Status 404 returned error can't find the container with id c88318945e60bed454eeb1761c469be42c9cd82c475475183170b564306758a3 Mar 09 14:08:46 crc kubenswrapper[4723]: I0309 14:08:46.885207 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:08:46 crc kubenswrapper[4723]: E0309 14:08:46.885825 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:08:46 crc kubenswrapper[4723]: I0309 14:08:46.957950 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsxml" event={"ID":"d218bbea-0065-4906-a512-d4683d158463","Type":"ContainerStarted","Data":"3056d651be60e5f5136312c3f914e0026fe1be4c9c44da7ba16d0ee2352f7f2b"} Mar 09 14:08:46 crc kubenswrapper[4723]: I0309 14:08:46.957994 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsxml" event={"ID":"d218bbea-0065-4906-a512-d4683d158463","Type":"ContainerStarted","Data":"c88318945e60bed454eeb1761c469be42c9cd82c475475183170b564306758a3"} Mar 09 14:08:47 crc kubenswrapper[4723]: I0309 14:08:47.974974 4723 generic.go:334] "Generic (PLEG): container finished" podID="d218bbea-0065-4906-a512-d4683d158463" containerID="3056d651be60e5f5136312c3f914e0026fe1be4c9c44da7ba16d0ee2352f7f2b" exitCode=0 Mar 09 14:08:47 crc kubenswrapper[4723]: I0309 14:08:47.975088 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsxml" event={"ID":"d218bbea-0065-4906-a512-d4683d158463","Type":"ContainerDied","Data":"3056d651be60e5f5136312c3f914e0026fe1be4c9c44da7ba16d0ee2352f7f2b"} Mar 09 14:08:48 crc kubenswrapper[4723]: I0309 14:08:48.987570 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsxml" event={"ID":"d218bbea-0065-4906-a512-d4683d158463","Type":"ContainerStarted","Data":"115b194dbdeead15c41b2b5cf4fad05da70a2fd9bc4d4759ecfcebc919c89d42"} Mar 09 14:08:51 crc kubenswrapper[4723]: I0309 14:08:51.012885 4723 generic.go:334] "Generic (PLEG): container finished" 
podID="d218bbea-0065-4906-a512-d4683d158463" containerID="115b194dbdeead15c41b2b5cf4fad05da70a2fd9bc4d4759ecfcebc919c89d42" exitCode=0 Mar 09 14:08:51 crc kubenswrapper[4723]: I0309 14:08:51.012966 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsxml" event={"ID":"d218bbea-0065-4906-a512-d4683d158463","Type":"ContainerDied","Data":"115b194dbdeead15c41b2b5cf4fad05da70a2fd9bc4d4759ecfcebc919c89d42"} Mar 09 14:08:52 crc kubenswrapper[4723]: I0309 14:08:52.368458 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsxml" event={"ID":"d218bbea-0065-4906-a512-d4683d158463","Type":"ContainerStarted","Data":"cfa50422b5c4a7b48351a36036f67d5f3d6f7629f9e31fd2917ef0384860d0aa"} Mar 09 14:08:52 crc kubenswrapper[4723]: I0309 14:08:52.406594 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lsxml" podStartSLOduration=3.919449702 podStartE2EDuration="7.40653252s" podCreationTimestamp="2026-03-09 14:08:45 +0000 UTC" firstStartedPulling="2026-03-09 14:08:47.980552045 +0000 UTC m=+4201.995019585" lastFinishedPulling="2026-03-09 14:08:51.467634863 +0000 UTC m=+4205.482102403" observedRunningTime="2026-03-09 14:08:52.397963912 +0000 UTC m=+4206.412431462" watchObservedRunningTime="2026-03-09 14:08:52.40653252 +0000 UTC m=+4206.421000060" Mar 09 14:08:55 crc kubenswrapper[4723]: I0309 14:08:55.729871 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:55 crc kubenswrapper[4723]: I0309 14:08:55.730646 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:55 crc kubenswrapper[4723]: I0309 14:08:55.818916 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:56 crc kubenswrapper[4723]: I0309 14:08:56.466432 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:56 crc kubenswrapper[4723]: I0309 14:08:56.519248 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lsxml"] Mar 09 14:08:58 crc kubenswrapper[4723]: I0309 14:08:58.433026 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lsxml" podUID="d218bbea-0065-4906-a512-d4683d158463" containerName="registry-server" containerID="cri-o://cfa50422b5c4a7b48351a36036f67d5f3d6f7629f9e31fd2917ef0384860d0aa" gracePeriod=2 Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.016941 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.105529 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d218bbea-0065-4906-a512-d4683d158463-utilities\") pod \"d218bbea-0065-4906-a512-d4683d158463\" (UID: \"d218bbea-0065-4906-a512-d4683d158463\") " Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.105670 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d218bbea-0065-4906-a512-d4683d158463-catalog-content\") pod \"d218bbea-0065-4906-a512-d4683d158463\" (UID: \"d218bbea-0065-4906-a512-d4683d158463\") " Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.105828 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g88g8\" (UniqueName: \"kubernetes.io/projected/d218bbea-0065-4906-a512-d4683d158463-kube-api-access-g88g8\") pod \"d218bbea-0065-4906-a512-d4683d158463\" (UID: \"d218bbea-0065-4906-a512-d4683d158463\") " Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.106846 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d218bbea-0065-4906-a512-d4683d158463-utilities" (OuterVolumeSpecName: "utilities") pod "d218bbea-0065-4906-a512-d4683d158463" (UID: "d218bbea-0065-4906-a512-d4683d158463"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.107314 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d218bbea-0065-4906-a512-d4683d158463-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.118157 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d218bbea-0065-4906-a512-d4683d158463-kube-api-access-g88g8" (OuterVolumeSpecName: "kube-api-access-g88g8") pod "d218bbea-0065-4906-a512-d4683d158463" (UID: "d218bbea-0065-4906-a512-d4683d158463"). InnerVolumeSpecName "kube-api-access-g88g8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.210135 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g88g8\" (UniqueName: \"kubernetes.io/projected/d218bbea-0065-4906-a512-d4683d158463-kube-api-access-g88g8\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.371762 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d218bbea-0065-4906-a512-d4683d158463-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d218bbea-0065-4906-a512-d4683d158463" (UID: "d218bbea-0065-4906-a512-d4683d158463"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.416188 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d218bbea-0065-4906-a512-d4683d158463-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.444721 4723 generic.go:334] "Generic (PLEG): container finished" podID="d218bbea-0065-4906-a512-d4683d158463" containerID="cfa50422b5c4a7b48351a36036f67d5f3d6f7629f9e31fd2917ef0384860d0aa" exitCode=0 Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.444764 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsxml" event={"ID":"d218bbea-0065-4906-a512-d4683d158463","Type":"ContainerDied","Data":"cfa50422b5c4a7b48351a36036f67d5f3d6f7629f9e31fd2917ef0384860d0aa"} Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.444790 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsxml" event={"ID":"d218bbea-0065-4906-a512-d4683d158463","Type":"ContainerDied","Data":"c88318945e60bed454eeb1761c469be42c9cd82c475475183170b564306758a3"} Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.444808 4723 scope.go:117] "RemoveContainer" containerID="cfa50422b5c4a7b48351a36036f67d5f3d6f7629f9e31fd2917ef0384860d0aa" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.445047 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lsxml" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.485648 4723 scope.go:117] "RemoveContainer" containerID="115b194dbdeead15c41b2b5cf4fad05da70a2fd9bc4d4759ecfcebc919c89d42" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.490919 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lsxml"] Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.502112 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lsxml"] Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.510436 4723 scope.go:117] "RemoveContainer" containerID="3056d651be60e5f5136312c3f914e0026fe1be4c9c44da7ba16d0ee2352f7f2b" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.566326 4723 scope.go:117] "RemoveContainer" containerID="cfa50422b5c4a7b48351a36036f67d5f3d6f7629f9e31fd2917ef0384860d0aa" Mar 09 14:08:59 crc kubenswrapper[4723]: E0309 14:08:59.566748 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa50422b5c4a7b48351a36036f67d5f3d6f7629f9e31fd2917ef0384860d0aa\": container with ID starting with cfa50422b5c4a7b48351a36036f67d5f3d6f7629f9e31fd2917ef0384860d0aa not found: ID does not exist" containerID="cfa50422b5c4a7b48351a36036f67d5f3d6f7629f9e31fd2917ef0384860d0aa" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.566777 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa50422b5c4a7b48351a36036f67d5f3d6f7629f9e31fd2917ef0384860d0aa"} err="failed to get container status \"cfa50422b5c4a7b48351a36036f67d5f3d6f7629f9e31fd2917ef0384860d0aa\": rpc error: code = NotFound desc = could not find container \"cfa50422b5c4a7b48351a36036f67d5f3d6f7629f9e31fd2917ef0384860d0aa\": container with ID starting with cfa50422b5c4a7b48351a36036f67d5f3d6f7629f9e31fd2917ef0384860d0aa not found: ID does not exist" Mar 09 
14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.566798 4723 scope.go:117] "RemoveContainer" containerID="115b194dbdeead15c41b2b5cf4fad05da70a2fd9bc4d4759ecfcebc919c89d42" Mar 09 14:08:59 crc kubenswrapper[4723]: E0309 14:08:59.567235 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"115b194dbdeead15c41b2b5cf4fad05da70a2fd9bc4d4759ecfcebc919c89d42\": container with ID starting with 115b194dbdeead15c41b2b5cf4fad05da70a2fd9bc4d4759ecfcebc919c89d42 not found: ID does not exist" containerID="115b194dbdeead15c41b2b5cf4fad05da70a2fd9bc4d4759ecfcebc919c89d42" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.567307 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"115b194dbdeead15c41b2b5cf4fad05da70a2fd9bc4d4759ecfcebc919c89d42"} err="failed to get container status \"115b194dbdeead15c41b2b5cf4fad05da70a2fd9bc4d4759ecfcebc919c89d42\": rpc error: code = NotFound desc = could not find container \"115b194dbdeead15c41b2b5cf4fad05da70a2fd9bc4d4759ecfcebc919c89d42\": container with ID starting with 115b194dbdeead15c41b2b5cf4fad05da70a2fd9bc4d4759ecfcebc919c89d42 not found: ID does not exist" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.567365 4723 scope.go:117] "RemoveContainer" containerID="3056d651be60e5f5136312c3f914e0026fe1be4c9c44da7ba16d0ee2352f7f2b" Mar 09 14:08:59 crc kubenswrapper[4723]: E0309 14:08:59.567675 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3056d651be60e5f5136312c3f914e0026fe1be4c9c44da7ba16d0ee2352f7f2b\": container with ID starting with 3056d651be60e5f5136312c3f914e0026fe1be4c9c44da7ba16d0ee2352f7f2b not found: ID does not exist" containerID="3056d651be60e5f5136312c3f914e0026fe1be4c9c44da7ba16d0ee2352f7f2b" Mar 09 14:08:59 crc kubenswrapper[4723]: I0309 14:08:59.567694 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3056d651be60e5f5136312c3f914e0026fe1be4c9c44da7ba16d0ee2352f7f2b"} err="failed to get container status \"3056d651be60e5f5136312c3f914e0026fe1be4c9c44da7ba16d0ee2352f7f2b\": rpc error: code = NotFound desc = could not find container \"3056d651be60e5f5136312c3f914e0026fe1be4c9c44da7ba16d0ee2352f7f2b\": container with ID starting with 3056d651be60e5f5136312c3f914e0026fe1be4c9c44da7ba16d0ee2352f7f2b not found: ID does not exist" Mar 09 14:09:00 crc kubenswrapper[4723]: I0309 14:09:00.881537 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:09:00 crc kubenswrapper[4723]: E0309 14:09:00.882319 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:09:00 crc kubenswrapper[4723]: I0309 14:09:00.900776 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d218bbea-0065-4906-a512-d4683d158463" path="/var/lib/kubelet/pods/d218bbea-0065-4906-a512-d4683d158463/volumes" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.175504 4723 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-p8kd8"] Mar 09 14:09:09 crc kubenswrapper[4723]: E0309 14:09:09.176421 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d218bbea-0065-4906-a512-d4683d158463" containerName="registry-server" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.176435 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="d218bbea-0065-4906-a512-d4683d158463" containerName="registry-server" Mar 09 14:09:09 crc kubenswrapper[4723]: E0309 14:09:09.176458 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d218bbea-0065-4906-a512-d4683d158463" containerName="extract-utilities" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.176465 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="d218bbea-0065-4906-a512-d4683d158463" containerName="extract-utilities" Mar 09 14:09:09 crc kubenswrapper[4723]: E0309 14:09:09.176478 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d218bbea-0065-4906-a512-d4683d158463" containerName="extract-content" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.176483 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="d218bbea-0065-4906-a512-d4683d158463" containerName="extract-content" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.176705 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="d218bbea-0065-4906-a512-d4683d158463" containerName="registry-server" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.180093 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.197976 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8kd8"] Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.339347 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453247db-2939-4260-b723-2c1f805b9493-catalog-content\") pod \"redhat-marketplace-p8kd8\" (UID: \"453247db-2939-4260-b723-2c1f805b9493\") " pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.339489 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpnks\" (UniqueName: \"kubernetes.io/projected/453247db-2939-4260-b723-2c1f805b9493-kube-api-access-cpnks\") pod \"redhat-marketplace-p8kd8\" (UID: \"453247db-2939-4260-b723-2c1f805b9493\") " pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.339532 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453247db-2939-4260-b723-2c1f805b9493-utilities\") pod \"redhat-marketplace-p8kd8\" (UID: \"453247db-2939-4260-b723-2c1f805b9493\") " pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.441232 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453247db-2939-4260-b723-2c1f805b9493-catalog-content\") pod \"redhat-marketplace-p8kd8\" (UID: \"453247db-2939-4260-b723-2c1f805b9493\") " pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.441383 4723 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cpnks\" (UniqueName: \"kubernetes.io/projected/453247db-2939-4260-b723-2c1f805b9493-kube-api-access-cpnks\") pod \"redhat-marketplace-p8kd8\" (UID: \"453247db-2939-4260-b723-2c1f805b9493\") " pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.441429 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453247db-2939-4260-b723-2c1f805b9493-utilities\") pod \"redhat-marketplace-p8kd8\" (UID: \"453247db-2939-4260-b723-2c1f805b9493\") " pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.441812 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453247db-2939-4260-b723-2c1f805b9493-utilities\") pod \"redhat-marketplace-p8kd8\" (UID: \"453247db-2939-4260-b723-2c1f805b9493\") " pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.441843 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453247db-2939-4260-b723-2c1f805b9493-catalog-content\") pod \"redhat-marketplace-p8kd8\" (UID: \"453247db-2939-4260-b723-2c1f805b9493\") " pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.465887 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpnks\" (UniqueName: \"kubernetes.io/projected/453247db-2939-4260-b723-2c1f805b9493-kube-api-access-cpnks\") pod \"redhat-marketplace-p8kd8\" (UID: \"453247db-2939-4260-b723-2c1f805b9493\") " pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:09 crc kubenswrapper[4723]: I0309 14:09:09.505486 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:10 crc kubenswrapper[4723]: I0309 14:09:10.158046 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8kd8"] Mar 09 14:09:10 crc kubenswrapper[4723]: I0309 14:09:10.598904 4723 generic.go:334] "Generic (PLEG): container finished" podID="453247db-2939-4260-b723-2c1f805b9493" containerID="7312220d2e3f085c3c9fa56cce7c5dd75fb91f6708c337677eaf5b17aded17e1" exitCode=0 Mar 09 14:09:10 crc kubenswrapper[4723]: I0309 14:09:10.598996 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8kd8" event={"ID":"453247db-2939-4260-b723-2c1f805b9493","Type":"ContainerDied","Data":"7312220d2e3f085c3c9fa56cce7c5dd75fb91f6708c337677eaf5b17aded17e1"} Mar 09 14:09:10 crc kubenswrapper[4723]: I0309 14:09:10.599242 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8kd8" event={"ID":"453247db-2939-4260-b723-2c1f805b9493","Type":"ContainerStarted","Data":"160af5a95ba116ab1cba93443baa6c4598114bef667e4107e918974d9ad95c6c"} Mar 09 14:09:10 crc kubenswrapper[4723]: I0309 14:09:10.601498 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:09:11 crc kubenswrapper[4723]: I0309 14:09:11.880975 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:09:11 crc kubenswrapper[4723]: E0309 14:09:11.881840 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:09:12 crc kubenswrapper[4723]: I0309 14:09:12.630842 4723 generic.go:334] "Generic (PLEG): container finished" podID="453247db-2939-4260-b723-2c1f805b9493" containerID="d102890e98b1d5a05d96c675f3a322ff6ab10f537179d595225a92f2c4e1ce8a" exitCode=0 Mar 09 14:09:12 crc kubenswrapper[4723]: I0309 14:09:12.631101 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8kd8" event={"ID":"453247db-2939-4260-b723-2c1f805b9493","Type":"ContainerDied","Data":"d102890e98b1d5a05d96c675f3a322ff6ab10f537179d595225a92f2c4e1ce8a"} Mar 09 14:09:13 crc kubenswrapper[4723]: I0309 14:09:13.646846 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8kd8" event={"ID":"453247db-2939-4260-b723-2c1f805b9493","Type":"ContainerStarted","Data":"7cac0fc96251cbb1b6599b7d2d346818401b468957bef04873cb92ec9593b232"} Mar 09 14:09:13 crc kubenswrapper[4723]: I0309 14:09:13.663047 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p8kd8" podStartSLOduration=2.210287612 podStartE2EDuration="4.663032621s" podCreationTimestamp="2026-03-09 14:09:09 +0000 UTC" firstStartedPulling="2026-03-09 14:09:10.601212494 +0000 UTC m=+4224.615680034" lastFinishedPulling="2026-03-09 14:09:13.053957503 +0000 UTC m=+4227.068425043" observedRunningTime="2026-03-09 14:09:13.661920971 +0000 UTC m=+4227.676388521" watchObservedRunningTime="2026-03-09 14:09:13.663032621 +0000 UTC m=+4227.677500151" Mar 09 14:09:19 crc 
kubenswrapper[4723]: I0309 14:09:19.506151 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:19 crc kubenswrapper[4723]: I0309 14:09:19.506657 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:19 crc kubenswrapper[4723]: I0309 14:09:19.873817 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:19 crc kubenswrapper[4723]: I0309 14:09:19.923000 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:20 crc kubenswrapper[4723]: I0309 14:09:20.121164 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8kd8"] Mar 09 14:09:21 crc kubenswrapper[4723]: I0309 14:09:21.737445 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p8kd8" podUID="453247db-2939-4260-b723-2c1f805b9493" containerName="registry-server" containerID="cri-o://7cac0fc96251cbb1b6599b7d2d346818401b468957bef04873cb92ec9593b232" gracePeriod=2 Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.267007 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.466131 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpnks\" (UniqueName: \"kubernetes.io/projected/453247db-2939-4260-b723-2c1f805b9493-kube-api-access-cpnks\") pod \"453247db-2939-4260-b723-2c1f805b9493\" (UID: \"453247db-2939-4260-b723-2c1f805b9493\") " Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.466508 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453247db-2939-4260-b723-2c1f805b9493-catalog-content\") pod \"453247db-2939-4260-b723-2c1f805b9493\" (UID: \"453247db-2939-4260-b723-2c1f805b9493\") " Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.466793 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453247db-2939-4260-b723-2c1f805b9493-utilities\") pod \"453247db-2939-4260-b723-2c1f805b9493\" (UID: \"453247db-2939-4260-b723-2c1f805b9493\") " Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.467684 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/453247db-2939-4260-b723-2c1f805b9493-utilities" (OuterVolumeSpecName: "utilities") pod "453247db-2939-4260-b723-2c1f805b9493" (UID: "453247db-2939-4260-b723-2c1f805b9493"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.472969 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453247db-2939-4260-b723-2c1f805b9493-kube-api-access-cpnks" (OuterVolumeSpecName: "kube-api-access-cpnks") pod "453247db-2939-4260-b723-2c1f805b9493" (UID: "453247db-2939-4260-b723-2c1f805b9493"). InnerVolumeSpecName "kube-api-access-cpnks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.499844 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/453247db-2939-4260-b723-2c1f805b9493-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "453247db-2939-4260-b723-2c1f805b9493" (UID: "453247db-2939-4260-b723-2c1f805b9493"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.569654 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453247db-2939-4260-b723-2c1f805b9493-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.569687 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpnks\" (UniqueName: \"kubernetes.io/projected/453247db-2939-4260-b723-2c1f805b9493-kube-api-access-cpnks\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.569698 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453247db-2939-4260-b723-2c1f805b9493-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.820945 4723 generic.go:334] "Generic (PLEG): container finished" podID="453247db-2939-4260-b723-2c1f805b9493" containerID="7cac0fc96251cbb1b6599b7d2d346818401b468957bef04873cb92ec9593b232" exitCode=0 Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.820999 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8kd8" event={"ID":"453247db-2939-4260-b723-2c1f805b9493","Type":"ContainerDied","Data":"7cac0fc96251cbb1b6599b7d2d346818401b468957bef04873cb92ec9593b232"} Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.821031 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8kd8" event={"ID":"453247db-2939-4260-b723-2c1f805b9493","Type":"ContainerDied","Data":"160af5a95ba116ab1cba93443baa6c4598114bef667e4107e918974d9ad95c6c"} Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.821052 4723 scope.go:117] "RemoveContainer" containerID="7cac0fc96251cbb1b6599b7d2d346818401b468957bef04873cb92ec9593b232" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.821354 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8kd8" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.855476 4723 scope.go:117] "RemoveContainer" containerID="d102890e98b1d5a05d96c675f3a322ff6ab10f537179d595225a92f2c4e1ce8a" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.870944 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8kd8"] Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.881306 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:09:22 crc kubenswrapper[4723]: E0309 14:09:22.881744 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.895182 4723 scope.go:117] "RemoveContainer" containerID="7312220d2e3f085c3c9fa56cce7c5dd75fb91f6708c337677eaf5b17aded17e1" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.898630 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8kd8"] Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.963134 4723 scope.go:117] "RemoveContainer" containerID="7cac0fc96251cbb1b6599b7d2d346818401b468957bef04873cb92ec9593b232" Mar 09 14:09:22 crc kubenswrapper[4723]: E0309 14:09:22.964244 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cac0fc96251cbb1b6599b7d2d346818401b468957bef04873cb92ec9593b232\": container with ID starting with 7cac0fc96251cbb1b6599b7d2d346818401b468957bef04873cb92ec9593b232 not found: ID does not exist" containerID="7cac0fc96251cbb1b6599b7d2d346818401b468957bef04873cb92ec9593b232" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.964280 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cac0fc96251cbb1b6599b7d2d346818401b468957bef04873cb92ec9593b232"} err="failed to get container status \"7cac0fc96251cbb1b6599b7d2d346818401b468957bef04873cb92ec9593b232\": rpc error: code = NotFound desc = could not find container \"7cac0fc96251cbb1b6599b7d2d346818401b468957bef04873cb92ec9593b232\": container with ID starting with 7cac0fc96251cbb1b6599b7d2d346818401b468957bef04873cb92ec9593b232 not found: ID does not exist" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.964306 4723 scope.go:117] "RemoveContainer" containerID="d102890e98b1d5a05d96c675f3a322ff6ab10f537179d595225a92f2c4e1ce8a" Mar 09 14:09:22 crc kubenswrapper[4723]: E0309 14:09:22.968587 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d102890e98b1d5a05d96c675f3a322ff6ab10f537179d595225a92f2c4e1ce8a\": container with ID starting with d102890e98b1d5a05d96c675f3a322ff6ab10f537179d595225a92f2c4e1ce8a not found: ID does not exist" containerID="d102890e98b1d5a05d96c675f3a322ff6ab10f537179d595225a92f2c4e1ce8a" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.968647 4723 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d102890e98b1d5a05d96c675f3a322ff6ab10f537179d595225a92f2c4e1ce8a"} err="failed to get container status \"d102890e98b1d5a05d96c675f3a322ff6ab10f537179d595225a92f2c4e1ce8a\": rpc error: code = NotFound desc = could not find container \"d102890e98b1d5a05d96c675f3a322ff6ab10f537179d595225a92f2c4e1ce8a\": container with ID starting with d102890e98b1d5a05d96c675f3a322ff6ab10f537179d595225a92f2c4e1ce8a not found: ID does not exist" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.968678 4723 scope.go:117] "RemoveContainer" containerID="7312220d2e3f085c3c9fa56cce7c5dd75fb91f6708c337677eaf5b17aded17e1" Mar 09 14:09:22 crc kubenswrapper[4723]: E0309 14:09:22.969248 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7312220d2e3f085c3c9fa56cce7c5dd75fb91f6708c337677eaf5b17aded17e1\": container with ID starting with 7312220d2e3f085c3c9fa56cce7c5dd75fb91f6708c337677eaf5b17aded17e1 not found: ID does not exist" containerID="7312220d2e3f085c3c9fa56cce7c5dd75fb91f6708c337677eaf5b17aded17e1" Mar 09 14:09:22 crc kubenswrapper[4723]: I0309 14:09:22.969297 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7312220d2e3f085c3c9fa56cce7c5dd75fb91f6708c337677eaf5b17aded17e1"} err="failed to get container status \"7312220d2e3f085c3c9fa56cce7c5dd75fb91f6708c337677eaf5b17aded17e1\": rpc error: code = NotFound desc = could not find container \"7312220d2e3f085c3c9fa56cce7c5dd75fb91f6708c337677eaf5b17aded17e1\": container with ID starting with 7312220d2e3f085c3c9fa56cce7c5dd75fb91f6708c337677eaf5b17aded17e1 not found: ID does not exist" Mar 09 14:09:24 crc kubenswrapper[4723]: I0309 14:09:24.896893 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453247db-2939-4260-b723-2c1f805b9493" path="/var/lib/kubelet/pods/453247db-2939-4260-b723-2c1f805b9493/volumes" Mar 09 14:09:33 crc kubenswrapper[4723]: I0309 14:09:33.881222 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:09:33 crc kubenswrapper[4723]: E0309 14:09:33.881970 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:09:48 crc kubenswrapper[4723]: I0309 14:09:48.883673 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:09:50 crc kubenswrapper[4723]: I0309 14:09:50.131768 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"9ce6a9fca22c7136a89fb9bc0304454b543a23b1215c6e803804804ed52d6cff"} Mar 09 14:10:00 crc kubenswrapper[4723]: I0309 14:10:00.161118 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551090-vnvqs"] Mar 09 14:10:00 crc kubenswrapper[4723]: E0309 14:10:00.162456 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453247db-2939-4260-b723-2c1f805b9493" containerName="extract-content" Mar 09 14:10:00 crc 
kubenswrapper[4723]: I0309 14:10:00.162477 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="453247db-2939-4260-b723-2c1f805b9493" containerName="extract-content" Mar 09 14:10:00 crc kubenswrapper[4723]: E0309 14:10:00.162497 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453247db-2939-4260-b723-2c1f805b9493" containerName="extract-utilities" Mar 09 14:10:00 crc kubenswrapper[4723]: I0309 14:10:00.162509 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="453247db-2939-4260-b723-2c1f805b9493" containerName="extract-utilities" Mar 09 14:10:00 crc kubenswrapper[4723]: E0309 14:10:00.162526 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453247db-2939-4260-b723-2c1f805b9493" containerName="registry-server" Mar 09 14:10:00 crc kubenswrapper[4723]: I0309 14:10:00.162537 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="453247db-2939-4260-b723-2c1f805b9493" containerName="registry-server" Mar 09 14:10:00 crc kubenswrapper[4723]: I0309 14:10:00.162971 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="453247db-2939-4260-b723-2c1f805b9493" containerName="registry-server" Mar 09 14:10:00 crc kubenswrapper[4723]: I0309 14:10:00.164219 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-vnvqs" Mar 09 14:10:00 crc kubenswrapper[4723]: I0309 14:10:00.168474 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:10:00 crc kubenswrapper[4723]: I0309 14:10:00.169041 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 14:10:00 crc kubenswrapper[4723]: I0309 14:10:00.169075 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:10:00 crc kubenswrapper[4723]: I0309 14:10:00.176800 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-vnvqs"] Mar 09 14:10:00 crc kubenswrapper[4723]: I0309 14:10:00.237243 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf6b7\" (UniqueName: \"kubernetes.io/projected/02167125-6205-49c3-8c1e-00349c7020a1-kube-api-access-mf6b7\") pod \"auto-csr-approver-29551090-vnvqs\" (UID: \"02167125-6205-49c3-8c1e-00349c7020a1\") " pod="openshift-infra/auto-csr-approver-29551090-vnvqs" Mar 09 14:10:00 crc kubenswrapper[4723]: I0309 14:10:00.339511 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf6b7\" (UniqueName: \"kubernetes.io/projected/02167125-6205-49c3-8c1e-00349c7020a1-kube-api-access-mf6b7\") pod \"auto-csr-approver-29551090-vnvqs\" (UID: \"02167125-6205-49c3-8c1e-00349c7020a1\") " pod="openshift-infra/auto-csr-approver-29551090-vnvqs" Mar 09 14:10:00 crc kubenswrapper[4723]: I0309 14:10:00.357683 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf6b7\" (UniqueName: \"kubernetes.io/projected/02167125-6205-49c3-8c1e-00349c7020a1-kube-api-access-mf6b7\") pod \"auto-csr-approver-29551090-vnvqs\" (UID: \"02167125-6205-49c3-8c1e-00349c7020a1\") " pod="openshift-infra/auto-csr-approver-29551090-vnvqs" Mar 09 14:10:00 crc kubenswrapper[4723]: I0309 14:10:00.500956 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-vnvqs" Mar 09 14:10:00 crc kubenswrapper[4723]: I0309 14:10:00.987978 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-vnvqs"] Mar 09 14:10:01 crc kubenswrapper[4723]: I0309 14:10:01.253280 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-vnvqs" event={"ID":"02167125-6205-49c3-8c1e-00349c7020a1","Type":"ContainerStarted","Data":"0cd8437eefda997390281559d024a0a74b8190f4e0b3de2864ef251b595a6127"} Mar 09 14:10:03 crc kubenswrapper[4723]: I0309 14:10:03.290773 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-vnvqs" event={"ID":"02167125-6205-49c3-8c1e-00349c7020a1","Type":"ContainerStarted","Data":"a37d2094007d5dc0206e5b2e394ad7dd8904b61f44210734f2e38eb32e30ed0d"} Mar 09 14:10:03 crc kubenswrapper[4723]: I0309 14:10:03.318918 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551090-vnvqs" podStartSLOduration=1.796859495 podStartE2EDuration="3.318891946s" podCreationTimestamp="2026-03-09 14:10:00 +0000 UTC" firstStartedPulling="2026-03-09 14:10:00.989466383 +0000 UTC m=+4275.003933923" lastFinishedPulling="2026-03-09 14:10:02.511498834 +0000 UTC m=+4276.525966374" observedRunningTime="2026-03-09 14:10:03.307282807 +0000 UTC m=+4277.321750357" watchObservedRunningTime="2026-03-09 14:10:03.318891946 +0000 UTC m=+4277.333359506" Mar 09 14:10:04 crc kubenswrapper[4723]: I0309 14:10:04.304111 4723 generic.go:334] "Generic (PLEG): container finished" podID="02167125-6205-49c3-8c1e-00349c7020a1" containerID="a37d2094007d5dc0206e5b2e394ad7dd8904b61f44210734f2e38eb32e30ed0d" exitCode=0 Mar 09 14:10:04 crc kubenswrapper[4723]: I0309 14:10:04.304226 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-vnvqs" event={"ID":"02167125-6205-49c3-8c1e-00349c7020a1","Type":"ContainerDied","Data":"a37d2094007d5dc0206e5b2e394ad7dd8904b61f44210734f2e38eb32e30ed0d"} Mar 09 14:10:06 crc kubenswrapper[4723]: I0309 14:10:06.225599 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-vnvqs" Mar 09 14:10:06 crc kubenswrapper[4723]: I0309 14:10:06.349127 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-vnvqs" event={"ID":"02167125-6205-49c3-8c1e-00349c7020a1","Type":"ContainerDied","Data":"0cd8437eefda997390281559d024a0a74b8190f4e0b3de2864ef251b595a6127"} Mar 09 14:10:06 crc kubenswrapper[4723]: I0309 14:10:06.349173 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cd8437eefda997390281559d024a0a74b8190f4e0b3de2864ef251b595a6127" Mar 09 14:10:06 crc kubenswrapper[4723]: I0309 14:10:06.349247 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-vnvqs" Mar 09 14:10:06 crc kubenswrapper[4723]: I0309 14:10:06.385158 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551084-dvqp6"] Mar 09 14:10:06 crc kubenswrapper[4723]: I0309 14:10:06.397264 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf6b7\" (UniqueName: \"kubernetes.io/projected/02167125-6205-49c3-8c1e-00349c7020a1-kube-api-access-mf6b7\") pod \"02167125-6205-49c3-8c1e-00349c7020a1\" (UID: \"02167125-6205-49c3-8c1e-00349c7020a1\") " Mar 09 14:10:06 crc kubenswrapper[4723]: I0309 14:10:06.398312 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551084-dvqp6"] Mar 09 14:10:06 crc kubenswrapper[4723]: I0309 14:10:06.404826 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02167125-6205-49c3-8c1e-00349c7020a1-kube-api-access-mf6b7" (OuterVolumeSpecName: "kube-api-access-mf6b7") pod "02167125-6205-49c3-8c1e-00349c7020a1" (UID: "02167125-6205-49c3-8c1e-00349c7020a1"). InnerVolumeSpecName "kube-api-access-mf6b7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:06 crc kubenswrapper[4723]: I0309 14:10:06.501381 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf6b7\" (UniqueName: \"kubernetes.io/projected/02167125-6205-49c3-8c1e-00349c7020a1-kube-api-access-mf6b7\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:06 crc kubenswrapper[4723]: I0309 14:10:06.894090 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fac835c9-8cbe-4691-90d9-c5eecebecb33" path="/var/lib/kubelet/pods/fac835c9-8cbe-4691-90d9-c5eecebecb33/volumes" Mar 09 14:10:26 crc kubenswrapper[4723]: I0309 14:10:26.969569 4723 scope.go:117] "RemoveContainer" containerID="81ef19abb07c261bd33d6faaf3ed6d1dea7822396db4acf88df7e37a529bf598" Mar 09 14:12:00 crc kubenswrapper[4723]: I0309 14:12:00.145522 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551092-rr82b"] Mar 09 14:12:00 crc kubenswrapper[4723]: E0309 14:12:00.146663 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02167125-6205-49c3-8c1e-00349c7020a1" containerName="oc" Mar 09 14:12:00 crc kubenswrapper[4723]: I0309 14:12:00.146682 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="02167125-6205-49c3-8c1e-00349c7020a1" containerName="oc" Mar 09 14:12:00 crc kubenswrapper[4723]: I0309 14:12:00.147035 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="02167125-6205-49c3-8c1e-00349c7020a1" containerName="oc" Mar 09 14:12:00 crc kubenswrapper[4723]: I0309 14:12:00.148072 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-rr82b" Mar 09 14:12:00 crc kubenswrapper[4723]: I0309 14:12:00.153416 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:12:00 crc kubenswrapper[4723]: I0309 14:12:00.153664 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 14:12:00 crc kubenswrapper[4723]: I0309 14:12:00.153673 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:12:00 crc kubenswrapper[4723]: I0309 14:12:00.157479 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-rr82b"] Mar 09 14:12:00 crc kubenswrapper[4723]: I0309 14:12:00.319723 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt7qf\" (UniqueName: \"kubernetes.io/projected/c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248-kube-api-access-tt7qf\") pod \"auto-csr-approver-29551092-rr82b\" (UID: \"c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248\") " pod="openshift-infra/auto-csr-approver-29551092-rr82b" Mar 09 14:12:00 crc kubenswrapper[4723]: I0309 14:12:00.421714 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt7qf\" (UniqueName: \"kubernetes.io/projected/c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248-kube-api-access-tt7qf\") pod \"auto-csr-approver-29551092-rr82b\" (UID: \"c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248\") " pod="openshift-infra/auto-csr-approver-29551092-rr82b" Mar 09 14:12:00 crc kubenswrapper[4723]: I0309 14:12:00.444508 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt7qf\" (UniqueName: \"kubernetes.io/projected/c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248-kube-api-access-tt7qf\") pod \"auto-csr-approver-29551092-rr82b\" (UID: \"c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248\") " pod="openshift-infra/auto-csr-approver-29551092-rr82b" Mar 09 14:12:00 crc kubenswrapper[4723]: I0309 14:12:00.489437 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-rr82b" Mar 09 14:12:01 crc kubenswrapper[4723]: I0309 14:12:01.014047 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-rr82b"] Mar 09 14:12:01 crc kubenswrapper[4723]: I0309 14:12:01.699248 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551092-rr82b" event={"ID":"c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248","Type":"ContainerStarted","Data":"534392e6d19382df6cef5abc86d6c5e85071a09d8d86c1f5605053cb79f8c144"} Mar 09 14:12:02 crc kubenswrapper[4723]: I0309 14:12:02.709997 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551092-rr82b" event={"ID":"c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248","Type":"ContainerStarted","Data":"2635494cbf65a6a16a563448d3ec1a8c8b08ef035226bf16e28a69b6ee49f9c7"} Mar 09 14:12:02 crc kubenswrapper[4723]: I0309 14:12:02.726037 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551092-rr82b" podStartSLOduration=1.5187041890000001 podStartE2EDuration="2.726018816s" podCreationTimestamp="2026-03-09 14:12:00 +0000 UTC" firstStartedPulling="2026-03-09 14:12:01.025263812 +0000 UTC m=+4395.039731362" lastFinishedPulling="2026-03-09 14:12:02.232578449 +0000 UTC m=+4396.247045989" observedRunningTime="2026-03-09 14:12:02.72351812 +0000 UTC m=+4396.737985660" watchObservedRunningTime="2026-03-09 14:12:02.726018816 +0000 UTC m=+4396.740486346" Mar 09 14:12:03 crc kubenswrapper[4723]: I0309 14:12:03.719656 4723 generic.go:334] "Generic (PLEG): container finished" podID="c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248" containerID="2635494cbf65a6a16a563448d3ec1a8c8b08ef035226bf16e28a69b6ee49f9c7" exitCode=0 Mar 09 14:12:03 crc kubenswrapper[4723]: I0309 14:12:03.719706 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551092-rr82b" event={"ID":"c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248","Type":"ContainerDied","Data":"2635494cbf65a6a16a563448d3ec1a8c8b08ef035226bf16e28a69b6ee49f9c7"} Mar 09 14:12:03 crc kubenswrapper[4723]: I0309 14:12:03.946729 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:12:03 crc kubenswrapper[4723]: I0309 14:12:03.946796 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:12:05 crc kubenswrapper[4723]: I0309 14:12:05.169820 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-rr82b" Mar 09 14:12:05 crc kubenswrapper[4723]: I0309 14:12:05.358332 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt7qf\" (UniqueName: \"kubernetes.io/projected/c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248-kube-api-access-tt7qf\") pod \"c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248\" (UID: \"c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248\") " Mar 09 14:12:05 crc kubenswrapper[4723]: I0309 14:12:05.371150 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248-kube-api-access-tt7qf" (OuterVolumeSpecName: "kube-api-access-tt7qf") pod "c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248" (UID: "c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248"). InnerVolumeSpecName "kube-api-access-tt7qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:12:05 crc kubenswrapper[4723]: I0309 14:12:05.463367 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt7qf\" (UniqueName: \"kubernetes.io/projected/c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248-kube-api-access-tt7qf\") on node \"crc\" DevicePath \"\"" Mar 09 14:12:05 crc kubenswrapper[4723]: I0309 14:12:05.746475 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551092-rr82b" event={"ID":"c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248","Type":"ContainerDied","Data":"534392e6d19382df6cef5abc86d6c5e85071a09d8d86c1f5605053cb79f8c144"} Mar 09 14:12:05 crc kubenswrapper[4723]: I0309 14:12:05.746746 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="534392e6d19382df6cef5abc86d6c5e85071a09d8d86c1f5605053cb79f8c144" Mar 09 14:12:05 crc kubenswrapper[4723]: I0309 14:12:05.746558 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-rr82b" Mar 09 14:12:05 crc kubenswrapper[4723]: I0309 14:12:05.809501 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-brxt4"] Mar 09 14:12:05 crc kubenswrapper[4723]: I0309 14:12:05.820619 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-brxt4"] Mar 09 14:12:06 crc kubenswrapper[4723]: I0309 14:12:06.894024 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c5f83a7-4b1c-40da-9642-65a5df6acdda" path="/var/lib/kubelet/pods/1c5f83a7-4b1c-40da-9642-65a5df6acdda/volumes" Mar 09 14:12:27 crc kubenswrapper[4723]: I0309 14:12:27.741521 4723 scope.go:117] "RemoveContainer" containerID="06c8d261467db592ba6f97f0abb4809b00ee522b7b82452ac6328feb1d904775" Mar 09 14:12:33 crc kubenswrapper[4723]: I0309 14:12:33.947171 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:12:33 crc kubenswrapper[4723]: I0309 14:12:33.948669 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:13:03 crc kubenswrapper[4723]: I0309 14:13:03.946444 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:13:03 crc kubenswrapper[4723]: I0309 14:13:03.947044 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:13:03 crc kubenswrapper[4723]: I0309 14:13:03.947097 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 14:13:03 crc kubenswrapper[4723]: I0309 14:13:03.948173 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ce6a9fca22c7136a89fb9bc0304454b543a23b1215c6e803804804ed52d6cff"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:13:03 crc kubenswrapper[4723]: I0309 14:13:03.948277 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://9ce6a9fca22c7136a89fb9bc0304454b543a23b1215c6e803804804ed52d6cff" gracePeriod=600 Mar 09 14:13:04 crc kubenswrapper[4723]: I0309 14:13:04.410818 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" 
containerID="9ce6a9fca22c7136a89fb9bc0304454b543a23b1215c6e803804804ed52d6cff" exitCode=0 Mar 09 14:13:04 crc kubenswrapper[4723]: I0309 14:13:04.410894 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"9ce6a9fca22c7136a89fb9bc0304454b543a23b1215c6e803804804ed52d6cff"} Mar 09 14:13:04 crc kubenswrapper[4723]: I0309 14:13:04.411161 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186"} Mar 09 14:13:04 crc kubenswrapper[4723]: I0309 14:13:04.411204 4723 scope.go:117] "RemoveContainer" containerID="37326c58a388082cd97d4e16296313d00865dba4dd28ca9ddbf4cdd0872efba7" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.300737 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 14:13:13 crc kubenswrapper[4723]: E0309 14:13:13.301774 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248" containerName="oc" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.301791 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248" containerName="oc" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.302085 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248" containerName="oc" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.302977 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.305218 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.305439 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.305747 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.305989 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-7zlc4" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.349091 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.435419 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.435465 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.435546 4723 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.435610 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.435664 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbgbr\" (UniqueName: \"kubernetes.io/projected/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-kube-api-access-zbgbr\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.435692 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.435908 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-config-data\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.435945 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.435999 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.538082 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.538149 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbgbr\" (UniqueName: \"kubernetes.io/projected/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-kube-api-access-zbgbr\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 
14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.538176 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.538221 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-config-data\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.538240 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.538267 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.538424 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.538465 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.538516 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.538620 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.538669 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.539520 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.539598 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-config-data\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.541430 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.544181 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.544434 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.557532 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.562352 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbgbr\" (UniqueName: \"kubernetes.io/projected/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-kube-api-access-zbgbr\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.607538 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " pod="openstack/tempest-tests-tempest" Mar 09 14:13:13 crc kubenswrapper[4723]: I0309 14:13:13.633320 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 14:13:14 crc kubenswrapper[4723]: I0309 14:13:14.088468 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 14:13:14 crc kubenswrapper[4723]: I0309 14:13:14.521568 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ef1f6085-70f7-44a1-bf7c-5b4c90284dda","Type":"ContainerStarted","Data":"6e0084f9fa42ee4719ab6ccf98202cf5d53b860860419b6840ce7ff569491cf1"} Mar 09 14:13:54 crc kubenswrapper[4723]: E0309 14:13:54.959786 4723 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 09 14:13:54 crc kubenswrapper[4723]: E0309 14:13:54.966313 4723 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zbgbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-te
Mar 09 14:14:00 crc kubenswrapper[4723]: I0309 14:14:00.161443 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551094-v7l4f"] Mar 09 14:14:00 crc kubenswrapper[4723]: I0309 14:14:00.163899 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-v7l4f" Mar 09 14:14:00 crc kubenswrapper[4723]: I0309 14:14:00.166429 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:14:00 crc kubenswrapper[4723]: I0309 14:14:00.166889 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 14:14:00 crc kubenswrapper[4723]: I0309 14:14:00.166927 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:14:00 crc kubenswrapper[4723]: I0309 14:14:00.198365 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-v7l4f"] Mar 09 14:14:00 crc kubenswrapper[4723]: I0309 14:14:00.320477 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtwk9\" (UniqueName: \"kubernetes.io/projected/1e5b9c61-8285-4bfb-aa90-1dcb3130a55d-kube-api-access-vtwk9\") pod \"auto-csr-approver-29551094-v7l4f\" (UID: \"1e5b9c61-8285-4bfb-aa90-1dcb3130a55d\") " pod="openshift-infra/auto-csr-approver-29551094-v7l4f" Mar 09 14:14:00 crc kubenswrapper[4723]: I0309 14:14:00.423395 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtwk9\" (UniqueName: \"kubernetes.io/projected/1e5b9c61-8285-4bfb-aa90-1dcb3130a55d-kube-api-access-vtwk9\") pod \"auto-csr-approver-29551094-v7l4f\" (UID: \"1e5b9c61-8285-4bfb-aa90-1dcb3130a55d\") " pod="openshift-infra/auto-csr-approver-29551094-v7l4f" Mar 09 14:14:00 crc kubenswrapper[4723]: I0309 14:14:00.443419 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtwk9\" (UniqueName: \"kubernetes.io/projected/1e5b9c61-8285-4bfb-aa90-1dcb3130a55d-kube-api-access-vtwk9\") pod \"auto-csr-approver-29551094-v7l4f\" (UID: \"1e5b9c61-8285-4bfb-aa90-1dcb3130a55d\") " pod="openshift-infra/auto-csr-approver-29551094-v7l4f"
\"auto-csr-approver-29551094-v7l4f\" (UID: \"1e5b9c61-8285-4bfb-aa90-1dcb3130a55d\") " pod="openshift-infra/auto-csr-approver-29551094-v7l4f" Mar 09 14:14:00 crc kubenswrapper[4723]: I0309 14:14:00.492680 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-v7l4f" Mar 09 14:14:01 crc kubenswrapper[4723]: I0309 14:14:01.034309 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-v7l4f"] Mar 09 14:14:01 crc kubenswrapper[4723]: I0309 14:14:01.346070 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551094-v7l4f" event={"ID":"1e5b9c61-8285-4bfb-aa90-1dcb3130a55d","Type":"ContainerStarted","Data":"e194240eb921b7c9625333b56067e4489e6991282409bc3d19ded917ed15b414"} Mar 09 14:14:03 crc kubenswrapper[4723]: I0309 14:14:03.365520 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551094-v7l4f" event={"ID":"1e5b9c61-8285-4bfb-aa90-1dcb3130a55d","Type":"ContainerStarted","Data":"9ba5e1c7e54a9750af3ad8343e7a78621063b1a6c1b4b940b126177930680dfa"} Mar 09 14:14:03 crc kubenswrapper[4723]: I0309 14:14:03.390238 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551094-v7l4f" podStartSLOduration=2.412620683 podStartE2EDuration="3.390218589s" podCreationTimestamp="2026-03-09 14:14:00 +0000 UTC" firstStartedPulling="2026-03-09 14:14:01.034647899 +0000 UTC m=+4515.049115439" lastFinishedPulling="2026-03-09 14:14:02.012245805 +0000 UTC m=+4516.026713345" observedRunningTime="2026-03-09 14:14:03.379376861 +0000 UTC m=+4517.393844401" watchObservedRunningTime="2026-03-09 14:14:03.390218589 +0000 UTC m=+4517.404686129" Mar 09 14:14:04 crc kubenswrapper[4723]: I0309 14:14:04.378535 4723 generic.go:334] "Generic (PLEG): container finished" podID="1e5b9c61-8285-4bfb-aa90-1dcb3130a55d" containerID="9ba5e1c7e54a9750af3ad8343e7a78621063b1a6c1b4b940b126177930680dfa" exitCode=0 Mar 09 14:14:04 crc kubenswrapper[4723]: I0309 14:14:04.378625 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551094-v7l4f" event={"ID":"1e5b9c61-8285-4bfb-aa90-1dcb3130a55d","Type":"ContainerDied","Data":"9ba5e1c7e54a9750af3ad8343e7a78621063b1a6c1b4b940b126177930680dfa"} Mar 09 14:14:05 crc kubenswrapper[4723]: I0309 14:14:05.898453 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-v7l4f" Mar 09 14:14:05 crc kubenswrapper[4723]: I0309 14:14:05.975760 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtwk9\" (UniqueName: \"kubernetes.io/projected/1e5b9c61-8285-4bfb-aa90-1dcb3130a55d-kube-api-access-vtwk9\") pod \"1e5b9c61-8285-4bfb-aa90-1dcb3130a55d\" (UID: \"1e5b9c61-8285-4bfb-aa90-1dcb3130a55d\") " Mar 09 14:14:05 crc kubenswrapper[4723]: I0309 14:14:05.987111 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e5b9c61-8285-4bfb-aa90-1dcb3130a55d-kube-api-access-vtwk9" (OuterVolumeSpecName: "kube-api-access-vtwk9") pod "1e5b9c61-8285-4bfb-aa90-1dcb3130a55d" (UID: "1e5b9c61-8285-4bfb-aa90-1dcb3130a55d"). InnerVolumeSpecName "kube-api-access-vtwk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:14:06 crc kubenswrapper[4723]: I0309 14:14:06.078366 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtwk9\" (UniqueName: \"kubernetes.io/projected/1e5b9c61-8285-4bfb-aa90-1dcb3130a55d-kube-api-access-vtwk9\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:06 crc kubenswrapper[4723]: I0309 14:14:06.417692 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551094-v7l4f" event={"ID":"1e5b9c61-8285-4bfb-aa90-1dcb3130a55d","Type":"ContainerDied","Data":"e194240eb921b7c9625333b56067e4489e6991282409bc3d19ded917ed15b414"} Mar 09 14:14:06 crc kubenswrapper[4723]: I0309 14:14:06.417736 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e194240eb921b7c9625333b56067e4489e6991282409bc3d19ded917ed15b414" Mar 09 14:14:06 crc kubenswrapper[4723]: I0309 14:14:06.417751 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-v7l4f" Mar 09 14:14:06 crc kubenswrapper[4723]: I0309 14:14:06.450413 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-ccmrx"] Mar 09 14:14:06 crc kubenswrapper[4723]: I0309 14:14:06.463063 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-ccmrx"] Mar 09 14:14:06 crc kubenswrapper[4723]: I0309 14:14:06.893091 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b413173e-1c5b-4fc7-9536-0fc73d3feaa3" path="/var/lib/kubelet/pods/b413173e-1c5b-4fc7-9536-0fc73d3feaa3/volumes" Mar 09 14:14:07 crc kubenswrapper[4723]: I0309 14:14:07.534475 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 09 14:14:10 crc kubenswrapper[4723]: I0309 14:14:10.478562 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ef1f6085-70f7-44a1-bf7c-5b4c90284dda","Type":"ContainerStarted","Data":"48a8eb6f0e13d1ee2d14659873efe909710f769f9b738e0085c2cd703c964116"} Mar 09 14:14:10 crc kubenswrapper[4723]: I0309 14:14:10.497530 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=5.067828388 podStartE2EDuration="58.497507141s" podCreationTimestamp="2026-03-09 14:13:12 +0000 UTC" firstStartedPulling="2026-03-09 14:13:14.102460331 +0000 UTC m=+4468.116927871" lastFinishedPulling="2026-03-09 14:14:07.532139084 +0000 UTC m=+4521.546606624" observedRunningTime="2026-03-09 14:14:10.494340577 +0000 UTC m=+4524.508808117" watchObservedRunningTime="2026-03-09 14:14:10.497507141 +0000 UTC m=+4524.511974681" Mar 09 14:14:27 crc kubenswrapper[4723]: I0309 14:14:27.937382 4723 scope.go:117] "RemoveContainer" containerID="44947baa317e7f4abdfe531801f00827c46088386e6150ccadeacebc8083bfc0" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.274796 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p"] Mar 09 14:15:00 crc kubenswrapper[4723]: E0309 14:15:00.277463 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5b9c61-8285-4bfb-aa90-1dcb3130a55d" containerName="oc" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.277488 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5b9c61-8285-4bfb-aa90-1dcb3130a55d" containerName="oc" Mar 09 14:15:00 crc 
kubenswrapper[4723]: I0309 14:15:00.277702 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e5b9c61-8285-4bfb-aa90-1dcb3130a55d" containerName="oc" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.278492 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.282541 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.296494 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.323096 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p"] Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.382419 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dd73fc5-311f-41e2-9759-285a5344fbf3-config-volume\") pod \"collect-profiles-29551095-mtk7p\" (UID: \"4dd73fc5-311f-41e2-9759-285a5344fbf3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.382610 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl6s4\" (UniqueName: \"kubernetes.io/projected/4dd73fc5-311f-41e2-9759-285a5344fbf3-kube-api-access-rl6s4\") pod \"collect-profiles-29551095-mtk7p\" (UID: \"4dd73fc5-311f-41e2-9759-285a5344fbf3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.382678 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dd73fc5-311f-41e2-9759-285a5344fbf3-secret-volume\") pod \"collect-profiles-29551095-mtk7p\" (UID: \"4dd73fc5-311f-41e2-9759-285a5344fbf3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.485078 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dd73fc5-311f-41e2-9759-285a5344fbf3-config-volume\") pod \"collect-profiles-29551095-mtk7p\" (UID: \"4dd73fc5-311f-41e2-9759-285a5344fbf3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.485241 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl6s4\" (UniqueName: \"kubernetes.io/projected/4dd73fc5-311f-41e2-9759-285a5344fbf3-kube-api-access-rl6s4\") pod \"collect-profiles-29551095-mtk7p\" (UID: \"4dd73fc5-311f-41e2-9759-285a5344fbf3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.485306 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dd73fc5-311f-41e2-9759-285a5344fbf3-secret-volume\") pod \"collect-profiles-29551095-mtk7p\" (UID: \"4dd73fc5-311f-41e2-9759-285a5344fbf3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.491252 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dd73fc5-311f-41e2-9759-285a5344fbf3-config-volume\") pod \"collect-profiles-29551095-mtk7p\" (UID: \"4dd73fc5-311f-41e2-9759-285a5344fbf3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.505581 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dd73fc5-311f-41e2-9759-285a5344fbf3-secret-volume\") pod \"collect-profiles-29551095-mtk7p\" (UID: \"4dd73fc5-311f-41e2-9759-285a5344fbf3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.506344 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl6s4\" (UniqueName: \"kubernetes.io/projected/4dd73fc5-311f-41e2-9759-285a5344fbf3-kube-api-access-rl6s4\") pod \"collect-profiles-29551095-mtk7p\" (UID: \"4dd73fc5-311f-41e2-9759-285a5344fbf3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" Mar 09 14:15:00 crc kubenswrapper[4723]: I0309 14:15:00.606035 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" Mar 09 14:15:01 crc kubenswrapper[4723]: I0309 14:15:01.986144 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p"] Mar 09 14:15:02 crc kubenswrapper[4723]: I0309 14:15:02.037121 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" event={"ID":"4dd73fc5-311f-41e2-9759-285a5344fbf3","Type":"ContainerStarted","Data":"48b6aa1a870615706c5eec4a9ef8d52726a286bb0408e82c4cf3473440860569"} Mar 09 14:15:03 crc kubenswrapper[4723]: I0309 14:15:03.108460 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" event={"ID":"4dd73fc5-311f-41e2-9759-285a5344fbf3","Type":"ContainerStarted","Data":"339624f29dc7e268eec40f5b81497449e584f53fa07a2de207a2ab92332a857e"} Mar 09 14:15:03 crc kubenswrapper[4723]: I0309 14:15:03.164806 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" podStartSLOduration=3.164784903 podStartE2EDuration="3.164784903s" podCreationTimestamp="2026-03-09 14:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:15:03.135333808 +0000 UTC m=+4577.149801348" watchObservedRunningTime="2026-03-09 14:15:03.164784903 +0000 UTC m=+4577.179252443" Mar 09 14:15:04 crc kubenswrapper[4723]: I0309 14:15:04.122065 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" event={"ID":"4dd73fc5-311f-41e2-9759-285a5344fbf3","Type":"ContainerDied","Data":"339624f29dc7e268eec40f5b81497449e584f53fa07a2de207a2ab92332a857e"} Mar 09 14:15:04 crc kubenswrapper[4723]: I0309 14:15:04.121950 4723 generic.go:334] "Generic (PLEG): container finished" podID="4dd73fc5-311f-41e2-9759-285a5344fbf3" 
containerID="339624f29dc7e268eec40f5b81497449e584f53fa07a2de207a2ab92332a857e" exitCode=0 Mar 09 14:15:05 crc kubenswrapper[4723]: I0309 14:15:05.656822 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" Mar 09 14:15:05 crc kubenswrapper[4723]: I0309 14:15:05.767952 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dd73fc5-311f-41e2-9759-285a5344fbf3-config-volume\") pod \"4dd73fc5-311f-41e2-9759-285a5344fbf3\" (UID: \"4dd73fc5-311f-41e2-9759-285a5344fbf3\") " Mar 09 14:15:05 crc kubenswrapper[4723]: I0309 14:15:05.769173 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dd73fc5-311f-41e2-9759-285a5344fbf3-config-volume" (OuterVolumeSpecName: "config-volume") pod "4dd73fc5-311f-41e2-9759-285a5344fbf3" (UID: "4dd73fc5-311f-41e2-9759-285a5344fbf3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:15:05 crc kubenswrapper[4723]: I0309 14:15:05.770377 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dd73fc5-311f-41e2-9759-285a5344fbf3-secret-volume\") pod \"4dd73fc5-311f-41e2-9759-285a5344fbf3\" (UID: \"4dd73fc5-311f-41e2-9759-285a5344fbf3\") " Mar 09 14:15:05 crc kubenswrapper[4723]: I0309 14:15:05.770978 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl6s4\" (UniqueName: \"kubernetes.io/projected/4dd73fc5-311f-41e2-9759-285a5344fbf3-kube-api-access-rl6s4\") pod \"4dd73fc5-311f-41e2-9759-285a5344fbf3\" (UID: \"4dd73fc5-311f-41e2-9759-285a5344fbf3\") " Mar 09 14:15:05 crc kubenswrapper[4723]: I0309 14:15:05.772411 4723 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dd73fc5-311f-41e2-9759-285a5344fbf3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:15:05 crc kubenswrapper[4723]: I0309 14:15:05.777038 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd73fc5-311f-41e2-9759-285a5344fbf3-kube-api-access-rl6s4" (OuterVolumeSpecName: "kube-api-access-rl6s4") pod "4dd73fc5-311f-41e2-9759-285a5344fbf3" (UID: "4dd73fc5-311f-41e2-9759-285a5344fbf3"). InnerVolumeSpecName "kube-api-access-rl6s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:15:05 crc kubenswrapper[4723]: I0309 14:15:05.777333 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd73fc5-311f-41e2-9759-285a5344fbf3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4dd73fc5-311f-41e2-9759-285a5344fbf3" (UID: "4dd73fc5-311f-41e2-9759-285a5344fbf3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:15:05 crc kubenswrapper[4723]: I0309 14:15:05.875153 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl6s4\" (UniqueName: \"kubernetes.io/projected/4dd73fc5-311f-41e2-9759-285a5344fbf3-kube-api-access-rl6s4\") on node \"crc\" DevicePath \"\"" Mar 09 14:15:05 crc kubenswrapper[4723]: I0309 14:15:05.875195 4723 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dd73fc5-311f-41e2-9759-285a5344fbf3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:15:06 crc kubenswrapper[4723]: I0309 14:15:06.146510 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" event={"ID":"4dd73fc5-311f-41e2-9759-285a5344fbf3","Type":"ContainerDied","Data":"48b6aa1a870615706c5eec4a9ef8d52726a286bb0408e82c4cf3473440860569"} Mar 09 14:15:06 crc kubenswrapper[4723]: I0309 14:15:06.146557 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48b6aa1a870615706c5eec4a9ef8d52726a286bb0408e82c4cf3473440860569" Mar 09 14:15:06 crc kubenswrapper[4723]: I0309 14:15:06.146756 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-mtk7p" Mar 09 14:15:06 crc kubenswrapper[4723]: I0309 14:15:06.223766 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"] Mar 09 14:15:06 crc kubenswrapper[4723]: I0309 14:15:06.236785 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-mcz8l"] Mar 09 14:15:06 crc kubenswrapper[4723]: I0309 14:15:06.897799 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058d5dd9-a30c-4de2-a61a-4fd2a359799e" path="/var/lib/kubelet/pods/058d5dd9-a30c-4de2-a61a-4fd2a359799e/volumes" Mar 09 14:15:28 crc kubenswrapper[4723]: I0309 14:15:28.083425 4723 scope.go:117] "RemoveContainer" containerID="e11451d0ce7dca7b8212f8043bb36bdb36f297425e5d815f7cea4b03e679d4a9" Mar 09 14:15:33 crc kubenswrapper[4723]: I0309 14:15:33.949324 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:15:33 crc kubenswrapper[4723]: I0309 14:15:33.955903 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:15:59 crc kubenswrapper[4723]: I0309 14:15:59.018925 4723 patch_prober.go:28] interesting pod/controller-manager-774cb675cc-hwvwx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:15:59 crc kubenswrapper[4723]: I0309 14:15:59.018963 4723 patch_prober.go:28] interesting pod/controller-manager-774cb675cc-hwvwx container/controller-manager namespace/openshift-controller-manager: Liveness probe 
status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:15:59 crc kubenswrapper[4723]: I0309 14:15:59.025276 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" podUID="f52a68c7-f2a2-4c16-a45b-b821debecd6d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:15:59 crc kubenswrapper[4723]: I0309 14:15:59.025382 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" podUID="f52a68c7-f2a2-4c16-a45b-b821debecd6d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:15:59 crc kubenswrapper[4723]: I0309 14:15:59.605112 4723 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-2nqwq container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:15:59 crc kubenswrapper[4723]: I0309 14:15:59.605459 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podUID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:15:59 crc kubenswrapper[4723]: I0309 14:15:59.605211 4723 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-2nqwq container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:15:59 crc kubenswrapper[4723]: I0309 14:15:59.605562 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podUID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:15:59 crc kubenswrapper[4723]: I0309 14:15:59.646126 4723 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-25fp4 container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.21:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:15:59 crc kubenswrapper[4723]: I0309 14:15:59.646178 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-25fp4" podUID="5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.21:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:15:59 crc kubenswrapper[4723]: I0309 14:15:59.988015 4723 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-lwpcl container/authentication-operator 
namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:15:59 crc kubenswrapper[4723]: I0309 14:15:59.988090 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" podUID="61312a96-b8f6-431c-b24e-0046271cf40f" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:00 crc kubenswrapper[4723]: I0309 14:16:00.483202 4723 patch_prober.go:28] interesting pod/console-operator-58897d9998-wkpzg container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:00 crc kubenswrapper[4723]: I0309 14:16:00.484392 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" podUID="1f6455f2-bcad-4e11-8ef5-a272b406be88" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:00 crc kubenswrapper[4723]: I0309 14:16:00.494138 4723 patch_prober.go:28] interesting pod/console-operator-58897d9998-wkpzg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:00 crc kubenswrapper[4723]: I0309 14:16:00.494423 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" podUID="1f6455f2-bcad-4e11-8ef5-a272b406be88" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:00 crc kubenswrapper[4723]: I0309 14:16:00.494223 4723 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-t25rf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:00 crc kubenswrapper[4723]: I0309 14:16:00.494676 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf" podUID="337d5692-12d3-4c0a-8187-eb66a2666e95" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:00 crc kubenswrapper[4723]: I0309 14:16:00.501073 4723 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fhxc9 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:00 crc kubenswrapper[4723]: I0309 14:16:00.501188 4723 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" podUID="3321a715-9c5f-4417-bec1-4ba3ccce946c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:00 crc kubenswrapper[4723]: I0309 14:16:00.501724 4723 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fhxc9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:00 crc kubenswrapper[4723]: I0309 14:16:00.501787 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" podUID="3321a715-9c5f-4417-bec1-4ba3ccce946c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:00 crc kubenswrapper[4723]: I0309 14:16:00.990542 4723 patch_prober.go:28] interesting pod/logging-loki-gateway-867fb59d66-pxpr6 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:00 crc kubenswrapper[4723]: I0309 14:16:00.990924 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" podUID="3dcae42d-f05a-41f1-9d6a-11ccb28eb379" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:01 crc kubenswrapper[4723]: I0309 14:16:01.416180 4723 patch_prober.go:28] interesting pod/metrics-server-7f69f56458-z9f7c container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.93:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:01 crc kubenswrapper[4723]: I0309 14:16:01.416272 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" podUID="9a4a344f-6f96-422b-9468-56c8e988ad3f" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.93:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:01 crc kubenswrapper[4723]: I0309 14:16:01.799669 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17" containerName="galera" probeResult="failure" output="command timed out" Mar 09 14:16:01 crc kubenswrapper[4723]: I0309 14:16:01.799887 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17" containerName="galera" probeResult="failure" output="command timed out" Mar 09 14:16:02 crc kubenswrapper[4723]: I0309 14:16:02.796025 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="f01dc50c-55d6-4f99-92f8-d3adfcf8d71b" containerName="galera" probeResult="failure" output="command timed out" Mar 09 
Mar 09 14:16:02 crc kubenswrapper[4723]: I0309 14:16:02.796850 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="f01dc50c-55d6-4f99-92f8-d3adfcf8d71b" containerName="galera" probeResult="failure" output="command timed out" Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.584022 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-687f57d79b-6wj4n" podUID="7b820c49-0780-4d8d-a069-6cecf6ee0f1e" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.584653 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-6wj4n" podUID="7b820c49-0780-4d8d-a069-6cecf6ee0f1e" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.621964 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551096-jrfpb"] Mar 09 14:16:03 crc kubenswrapper[4723]: E0309 14:16:03.626417 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd73fc5-311f-41e2-9759-285a5344fbf3" containerName="collect-profiles" Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.626447 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd73fc5-311f-41e2-9759-285a5344fbf3" containerName="collect-profiles" Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.627326 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd73fc5-311f-41e2-9759-285a5344fbf3" containerName="collect-profiles" Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.641202 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-jrfpb" Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.658750 4723 patch_prober.go:28] interesting pod/monitoring-plugin-66b86d466-5tr4x container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.658816 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x" podUID="289dc72b-221e-415a-8c97-3889de9ceaed" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.704346 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.704611 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.710012 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.799978 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="df18bf19-d23a-471f-8074-2eaaa7c4aead" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.807292 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2s7c\" (UniqueName: \"kubernetes.io/projected/8d18d891-99a0-4089-86df-166cfa4297a6-kube-api-access-c2s7c\") pod \"auto-csr-approver-29551096-jrfpb\" (UID: \"8d18d891-99a0-4089-86df-166cfa4297a6\") " pod="openshift-infra/auto-csr-approver-29551096-jrfpb" Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.909589 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2s7c\" (UniqueName: \"kubernetes.io/projected/8d18d891-99a0-4089-86df-166cfa4297a6-kube-api-access-c2s7c\") pod \"auto-csr-approver-29551096-jrfpb\" (UID: \"8d18d891-99a0-4089-86df-166cfa4297a6\") " pod="openshift-infra/auto-csr-approver-29551096-jrfpb" Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.947050 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:16:03 crc kubenswrapper[4723]: I0309 14:16:03.947117 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:16:04 crc kubenswrapper[4723]: I0309 14:16:04.044688 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2s7c\" (UniqueName: 
\"kubernetes.io/projected/8d18d891-99a0-4089-86df-166cfa4297a6-kube-api-access-c2s7c\") pod \"auto-csr-approver-29551096-jrfpb\" (UID: \"8d18d891-99a0-4089-86df-166cfa4297a6\") " pod="openshift-infra/auto-csr-approver-29551096-jrfpb" Mar 09 14:16:04 crc kubenswrapper[4723]: I0309 14:16:04.322933 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-jrfpb" Mar 09 14:16:04 crc kubenswrapper[4723]: I0309 14:16:04.725162 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" podUID="5ea4f771-5b0c-410d-8a6c-a45b039edb6a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:04 crc kubenswrapper[4723]: I0309 14:16:04.725189 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" podUID="5ea4f771-5b0c-410d-8a6c-a45b039edb6a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:05 crc kubenswrapper[4723]: I0309 14:16:05.115156 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" podUID="881230db-85c7-4159-b1dd-f537ed6baece" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.95:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:05 crc kubenswrapper[4723]: I0309 14:16:05.115370 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" podUID="881230db-85c7-4159-b1dd-f537ed6baece" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.95:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:05 crc kubenswrapper[4723]: I0309 14:16:05.197509 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" podUID="c3f17509-7e0b-452d-b3ca-0a3210159f17" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:05 crc kubenswrapper[4723]: I0309 14:16:05.201598 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" podUID="c3f17509-7e0b-452d-b3ca-0a3210159f17" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:05 crc kubenswrapper[4723]: I0309 14:16:05.662057 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" podUID="b9b75469-0c5d-47b4-b75c-28cdf8316167" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:05 crc kubenswrapper[4723]: I0309 14:16:05.662091 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" 
podUID="b9b75469-0c5d-47b4-b75c-28cdf8316167" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:05 crc kubenswrapper[4723]: I0309 14:16:05.884047 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" podUID="3fab2f82-df4b-417b-8188-0c4f455df30c" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:05 crc kubenswrapper[4723]: I0309 14:16:05.884109 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-jqk9s" podUID="54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:05 crc kubenswrapper[4723]: I0309 14:16:05.884160 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" podUID="3fab2f82-df4b-417b-8188-0c4f455df30c" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:05 crc kubenswrapper[4723]: I0309 14:16:05.884206 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-jqk9s" podUID="54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:05 crc kubenswrapper[4723]: I0309 14:16:05.884442 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-jqk9s" podUID="54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:06 crc kubenswrapper[4723]: I0309 14:16:06.015338 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-jrfpb"] Mar 09 14:16:06 crc kubenswrapper[4723]: I0309 14:16:06.089252 4723 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:06 crc kubenswrapper[4723]: I0309 14:16:06.089314 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:06 crc kubenswrapper[4723]: I0309 14:16:06.264227 4723 patch_prober.go:28] interesting pod/nmstate-webhook-786f45cff4-msbbv container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.68:9443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:06 crc 
kubenswrapper[4723]: I0309 14:16:06.264323 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv" podUID="3b47483e-69de-403b-ab71-5c6665c0a36d" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.68:9443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:06 crc kubenswrapper[4723]: I0309 14:16:06.994948 4723 patch_prober.go:28] interesting pod/console-85b7499c-sqsr9 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:06 crc kubenswrapper[4723]: I0309 14:16:06.995347 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-85b7499c-sqsr9" podUID="731ffd33-861f-45a8-a54a-5a18dcca5ae6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:07 crc kubenswrapper[4723]: I0309 14:16:07.458919 4723 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-lfdgl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:07 crc kubenswrapper[4723]: I0309 14:16:07.459295 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl" podUID="2a058d13-df7c-45fc-9c82-83cd7d61ffbd" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:07 crc kubenswrapper[4723]: I0309 14:16:07.459800 4723 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-lfdgl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:07 crc kubenswrapper[4723]: I0309 14:16:07.459849 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl" podUID="2a058d13-df7c-45fc-9c82-83cd7d61ffbd" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:07 crc kubenswrapper[4723]: I0309 14:16:07.909086 4723 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:07 crc kubenswrapper[4723]: I0309 14:16:07.909159 4723 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:07 crc kubenswrapper[4723]: I0309 14:16:07.948261 4723 patch_prober.go:28] interesting pod/oauth-openshift-6775b6d8cc-n5skm container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:07 crc kubenswrapper[4723]: I0309 14:16:07.948344 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" podUID="f0275b6b-90ed-4c22-ae68-834792f8e5dd" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:07 crc kubenswrapper[4723]: I0309 14:16:07.948376 4723 patch_prober.go:28] interesting pod/oauth-openshift-6775b6d8cc-n5skm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:07 crc kubenswrapper[4723]: I0309 14:16:07.948423 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" podUID="f0275b6b-90ed-4c22-ae68-834792f8e5dd" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:08 crc kubenswrapper[4723]: I0309 14:16:08.536107 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-6wj4n" podUID="7b820c49-0780-4d8d-a069-6cecf6ee0f1e" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:08 crc kubenswrapper[4723]: I0309 14:16:08.803496 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="df18bf19-d23a-471f-8074-2eaaa7c4aead" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 09 14:16:08 crc kubenswrapper[4723]: I0309 14:16:08.803522 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="df18bf19-d23a-471f-8074-2eaaa7c4aead" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Mar 09 14:16:08 crc kubenswrapper[4723]: I0309 14:16:08.983716 4723 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7nz4v container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:08 crc kubenswrapper[4723]: I0309 14:16:08.983800 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" podUID="0c45ecd0-a916-4ef0-80aa-cfe88212d0ed" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:08 crc kubenswrapper[4723]: I0309 14:16:08.984194 4723 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7nz4v container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:08 crc kubenswrapper[4723]: I0309 14:16:08.984267 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" podUID="0c45ecd0-a916-4ef0-80aa-cfe88212d0ed" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:08 crc kubenswrapper[4723]: I0309 14:16:08.991615 4723 patch_prober.go:28] interesting pod/route-controller-manager-7757f9dd75-n7jz4 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:08 crc kubenswrapper[4723]: I0309 14:16:08.991689 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" podUID="bfc8cb7a-3df5-4dd3-8520-82316314e76b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:08 crc kubenswrapper[4723]: I0309 14:16:08.992476 4723 patch_prober.go:28] interesting pod/route-controller-manager-7757f9dd75-n7jz4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.77:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:08 crc kubenswrapper[4723]: I0309 14:16:08.992526 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" podUID="bfc8cb7a-3df5-4dd3-8520-82316314e76b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.77:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:09 crc kubenswrapper[4723]: I0309 14:16:09.002507 4723 patch_prober.go:28] interesting pod/controller-manager-774cb675cc-hwvwx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:09 crc kubenswrapper[4723]: I0309 14:16:09.002720 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" podUID="f52a68c7-f2a2-4c16-a45b-b821debecd6d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers)" Mar 09 14:16:09 crc kubenswrapper[4723]: I0309 14:16:09.002854 4723 patch_prober.go:28] interesting pod/controller-manager-774cb675cc-hwvwx container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:09 crc kubenswrapper[4723]: I0309 14:16:09.002920 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" podUID="f52a68c7-f2a2-4c16-a45b-b821debecd6d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:09 crc kubenswrapper[4723]: I0309 14:16:09.603191 4723 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-2nqwq container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:09 crc kubenswrapper[4723]: I0309 14:16:09.603314 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podUID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:09 crc kubenswrapper[4723]: I0309 14:16:09.603191 4723 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-2nqwq container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:09 crc kubenswrapper[4723]: I0309 14:16:09.603398 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podUID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:09 crc kubenswrapper[4723]: I0309 14:16:09.687102 4723 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-25fp4 container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.21:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:09 crc kubenswrapper[4723]: I0309 14:16:09.687167 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5bf474d74f-25fp4" podUID="5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.21:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:09 crc kubenswrapper[4723]: I0309 14:16:09.687257 4723 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-25fp4 container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.21:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:09 crc kubenswrapper[4723]: I0309 14:16:09.687270 4723 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-25fp4" podUID="5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.21:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.013151 4723 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-lwpcl container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.013429 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" podUID="61312a96-b8f6-431c-b24e-0046271cf40f" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.013513 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-r9xzz" podUID="0b462749-ee4f-4661-8a3a-06e721ef51a8" containerName="registry-server" probeResult="failure" output=< Mar 09 14:16:10 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:16:10 crc kubenswrapper[4723]: > Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.085816 4723 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qp2l4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.085894 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" podUID="3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.090270 4723 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qp2l4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.090335 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" podUID="3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.346194 4723 patch_prober.go:28] interesting pod/downloads-7954f5f757-b5c74 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.346231 4723 patch_prober.go:28] interesting pod/downloads-7954f5f757-b5c74 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.346289 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b5c74" podUID="e21fc837-8de2-4af5-a375-b14567f47d67" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.346385 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-b5c74" podUID="e21fc837-8de2-4af5-a375-b14567f47d67" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.358283 4723 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-t25rf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.358351 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf" podUID="337d5692-12d3-4c0a-8187-eb66a2666e95" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.358588 4723 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-t25rf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.358652 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t25rf" podUID="337d5692-12d3-4c0a-8187-eb66a2666e95" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.452095 4723 patch_prober.go:28] interesting pod/console-operator-58897d9998-wkpzg container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.452158 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" podUID="1f6455f2-bcad-4e11-8ef5-a272b406be88" 
containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.452215 4723 patch_prober.go:28] interesting pod/console-operator-58897d9998-wkpzg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.452231 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" podUID="1f6455f2-bcad-4e11-8ef5-a272b406be88" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.461785 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-vwn46" podUID="98195455-05c0-408c-b3e2-728b991eee12" containerName="registry-server" probeResult="failure" output=< Mar 09 14:16:10 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:16:10 crc kubenswrapper[4723]: > Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.461918 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-vwn46" podUID="98195455-05c0-408c-b3e2-728b991eee12" containerName="registry-server" probeResult="failure" output=< Mar 09 14:16:10 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:16:10 crc kubenswrapper[4723]: > Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.486075 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-r9xzz" podUID="0b462749-ee4f-4661-8a3a-06e721ef51a8" containerName="registry-server" probeResult="failure" output=< Mar 09 14:16:10 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:16:10 crc kubenswrapper[4723]: > Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.545136 4723 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fhxc9 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.545194 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" podUID="3321a715-9c5f-4417-bec1-4ba3ccce946c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.545283 4723 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-p9x2d container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.545306 4723 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d" podUID="09470b31-c2ae-42f8-8490-c446e979042d" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.545367 4723 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fhxc9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.545384 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" podUID="3321a715-9c5f-4417-bec1-4ba3ccce946c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.545415 4723 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-p9x2d container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.545433 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p9x2d" podUID="09470b31-c2ae-42f8-8490-c446e979042d" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.545466 4723 patch_prober.go:28] interesting pod/logging-loki-distributor-5d5548c9f5-zg9mx container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.545484 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" podUID="04edbd9e-fd1b-4346-97ce-adfb011720a4" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.801372 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="5b572550-466a-4fae-9334-0a471e7c39be" containerName="prometheus" probeResult="failure" output="command timed out" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.862025 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="5b572550-466a-4fae-9334-0a471e7c39be" containerName="prometheus" probeResult="failure" output="command timed out" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.922840 4723 trace.go:236] 
Trace[585353037]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-cell1-server-0" (09-Mar-2026 14:16:07.089) (total time: 3805ms): Mar 09 14:16:10 crc kubenswrapper[4723]: Trace[585353037]: [3.805897521s] [3.805897521s] END Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.957120 4723 patch_prober.go:28] interesting pod/logging-loki-querier-76bf7b6d45-2d54b container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.957185 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" podUID="8b9cdd14-6347-4701-9825-1ced6362cd8c" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.962524 4723 patch_prober.go:28] interesting pod/logging-loki-query-frontend-6d6859c548-4mdnv container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.962590 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" podUID="9cd1997b-cced-41c1-8a27-77321ffc48ae" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.989758 4723 patch_prober.go:28] interesting pod/logging-loki-gateway-867fb59d66-pxpr6 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:10 crc kubenswrapper[4723]: I0309 14:16:10.989822 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" podUID="3dcae42d-f05a-41f1-9d6a-11ccb28eb379" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:11 crc kubenswrapper[4723]: I0309 14:16:10.989900 4723 patch_prober.go:28] interesting pod/logging-loki-gateway-867fb59d66-pxpr6 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:11 crc kubenswrapper[4723]: I0309 14:16:10.990013 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" podUID="3dcae42d-f05a-41f1-9d6a-11ccb28eb379" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:11 crc kubenswrapper[4723]: I0309 14:16:11.081162 4723 patch_prober.go:28] interesting 
Mar 09 14:16:11 crc kubenswrapper[4723]: I0309 14:16:11.081232 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-fzrk5" podUID="f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:11 crc kubenswrapper[4723]: I0309 14:16:11.081237 4723 patch_prober.go:28] interesting pod/router-default-5444994796-fzrk5 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 14:16:11 crc kubenswrapper[4723]: I0309 14:16:11.081304 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-fzrk5" podUID="f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:11 crc kubenswrapper[4723]: I0309 14:16:11.374374 4723 patch_prober.go:28] interesting pod/metrics-server-7f69f56458-z9f7c container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.93:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 14:16:11 crc kubenswrapper[4723]: I0309 14:16:11.374436 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" podUID="9a4a344f-6f96-422b-9468-56c8e988ad3f" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.93:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:11 crc kubenswrapper[4723]: I0309 14:16:11.374447 4723 patch_prober.go:28] interesting pod/metrics-server-7f69f56458-z9f7c container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.93:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 14:16:11 crc kubenswrapper[4723]: I0309 14:16:11.374523 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" podUID="9a4a344f-6f96-422b-9468-56c8e988ad3f" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.93:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:11 crc kubenswrapper[4723]: I0309 14:16:11.562163 4723 patch_prober.go:28] interesting pod/logging-loki-gateway-867fb59d66-2pwh2 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 14:16:11 crc kubenswrapper[4723]: I0309 14:16:11.562261 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" podUID="0bd030fd-cf38-4403-971f-4170fdc71bb0" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:11 crc kubenswrapper[4723]: I0309 14:16:11.795492 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17" containerName="galera" probeResult="failure" output="command timed out"
Mar 09 14:16:11 crc kubenswrapper[4723]: I0309 14:16:11.796001 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17" containerName="galera" probeResult="failure" output="command timed out"
Mar 09 14:16:12 crc kubenswrapper[4723]: I0309 14:16:12.776145 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx" podUID="6bb6b3ee-7923-42ce-b36d-dabdaa42f829" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:12 crc kubenswrapper[4723]: I0309 14:16:12.776146 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg" podUID="01b1451d-b917-4176-abf6-fd84021ba30d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:12 crc kubenswrapper[4723]: I0309 14:16:12.776206 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb" podUID="6e192922-8050-41f1-bf25-33a12ace409b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:12 crc kubenswrapper[4723]: I0309 14:16:12.776203 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-4zwkg" podUID="01b1451d-b917-4176-abf6-fd84021ba30d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:12 crc kubenswrapper[4723]: I0309 14:16:12.795846 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="f01dc50c-55d6-4f99-92f8-d3adfcf8d71b" containerName="galera" probeResult="failure" output="command timed out"
Mar 09 14:16:12 crc kubenswrapper[4723]: I0309 14:16:12.799345 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="f01dc50c-55d6-4f99-92f8-d3adfcf8d71b" containerName="galera" probeResult="failure" output="command timed out"
Mar 09 14:16:12 crc kubenswrapper[4723]: I0309 14:16:12.859302 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-mtmcb" podUID="6e192922-8050-41f1-bf25-33a12ace409b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:12 crc kubenswrapper[4723]: I0309 14:16:12.859338 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-dwlzx" podUID="6bb6b3ee-7923-42ce-b36d-dabdaa42f829" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:12 crc kubenswrapper[4723]: I0309 14:16:12.941161 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb" podUID="6bf9afff-37d5-41e4-9270-8994fc65deda" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:12 crc kubenswrapper[4723]: I0309 14:16:12.941298 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" podUID="9646c273-606f-4551-82dd-39e09007dc17" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:12 crc kubenswrapper[4723]: I0309 14:16:12.941403 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" podUID="9646c273-606f-4551-82dd-39e09007dc17" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:12 crc kubenswrapper[4723]: I0309 14:16:12.941433 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-9btqb" podUID="6bf9afff-37d5-41e4-9270-8994fc65deda" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.222157 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2" podUID="49f841ea-0808-406e-a0d0-671f5db13f93" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.222230 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" podUID="eb08b38d-0624-4bd5-a3ba-9447cdbc80fb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.305191 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl" podUID="76830983-65b6-495a-8283-c9e2df80562b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.388028 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8" podUID="a8a23c57-bff5-4820-955c-441521c1e8f2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.388028 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-b2fx2" podUID="49f841ea-0808-406e-a0d0-671f5db13f93" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.388470 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-6npjh" podUID="eb08b38d-0624-4bd5-a3ba-9447cdbc80fb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.471153 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7" podUID="1e62b006-449e-440b-b425-d56fbb171cd5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.471324 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl" podUID="76830983-65b6-495a-8283-c9e2df80562b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.637056 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj" podUID="57b972b8-b38f-4bc5-8cb5-cb2d949ff3b8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.720127 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-lbqm8" podUID="a8a23c57-bff5-4820-955c-441521c1e8f2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.720179 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-5wrg7" podUID="1e62b006-449e-440b-b425-d56fbb171cd5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.720138 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2" podUID="36e23b55-a129-4c5f-8938-26f58742541b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.720491 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" podUID="f1620e57-58ba-4313-bba4-f5ece039f9f7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.720788 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" podUID="f1620e57-58ba-4313-bba4-f5ece039f9f7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.720817 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-8w2sj" podUID="57b972b8-b38f-4bc5-8cb5-cb2d949ff3b8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.720855 4723 patch_prober.go:28] interesting pod/monitoring-plugin-66b86d466-5tr4x container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.720911 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x" podUID="289dc72b-221e-415a-8c97-3889de9ceaed" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.803057 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-ssps2" podUID="36e23b55-a129-4c5f-8938-26f58742541b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.803173 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" podUID="8554b7c9-0bd7-4326-b906-fe07dcdce9da" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.887125 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" podUID="e93b778c-c10f-4da5-a3c2-91010b4b3aab" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.887349 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" podUID="8554b7c9-0bd7-4326-b906-fe07dcdce9da" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:13 crc kubenswrapper[4723]: I0309 14:16:13.887423 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" podUID="e93b778c-c10f-4da5-a3c2-91010b4b3aab" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.005035 4723 patch_prober.go:28] interesting pod/thanos-querier-f994cb665-42jsl container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.005126 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" podUID="338186cb-4546-4740-bba3-c1c430d8aacc" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.86:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.009371 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-9kl8t" podUID="194a48e1-f053-4aa1-bdfe-07aa2a8a208e" containerName="registry-server" probeResult="failure" output=<
Mar 09 14:16:14 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s
Mar 09 14:16:14 crc kubenswrapper[4723]: >
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.015728 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-9kl8t" podUID="194a48e1-f053-4aa1-bdfe-07aa2a8a208e" containerName="registry-server" probeResult="failure" output=<
Mar 09 14:16:14 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s
Mar 09 14:16:14 crc kubenswrapper[4723]: >
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.180957 4723 patch_prober.go:28] interesting pod/loki-operator-controller-manager-856bf85654-nsk4x container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.47:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.181012 4723 patch_prober.go:28] interesting pod/loki-operator-controller-manager-856bf85654-nsk4x container/manager namespace/openshift-operators-redhat: Liveness probe status=failure output="Get \"http://10.217.0.47:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.181019 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" podUID="a4427e9d-2cc9-4cec-acf7-7bbcc1c91582" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.47:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.181064 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators-redhat/loki-operator-controller-manager-856bf85654-nsk4x" podUID="a4427e9d-2cc9-4cec-acf7-7bbcc1c91582" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.47:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.370058 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-7d644d7fb7-r7swc" podUID="b223943e-1394-48af-8f5c-78a9d370b602" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.370084 4723 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-zkxq4 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.370443 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" podUID="beee0ec0-e83b-41df-b1c5-b6dadb908961" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.370365 4723 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-zkxq4 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.370511 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zkxq4" podUID="beee0ec0-e83b-41df-b1c5-b6dadb908961" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.610066 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-6d8b459b8b-tm8sv" podUID="4fc91e18-da85-44c6-96c7-2c15123b9980" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.94:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.677262 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-v6s9h" podUID="5ea4f771-5b0c-410d-8a6c-a45b039edb6a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.801659 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="df18bf19-d23a-471f-8074-2eaaa7c4aead" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.802796 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0"
Mar 09 14:16:14 crc kubenswrapper[4723]: I0309
14:16:14.807482 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"82ad22c202020bad5c2bc30629a84c341955f73333b0feea5d43b28474a7488e"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Mar 09 14:16:14 crc kubenswrapper[4723]: I0309 14:16:14.808173 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df18bf19-d23a-471f-8074-2eaaa7c4aead" containerName="ceilometer-central-agent" containerID="cri-o://82ad22c202020bad5c2bc30629a84c341955f73333b0feea5d43b28474a7488e" gracePeriod=30 Mar 09 14:16:15 crc kubenswrapper[4723]: I0309 14:16:15.115305 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" podUID="881230db-85c7-4159-b1dd-f537ed6baece" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.95:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:15 crc kubenswrapper[4723]: I0309 14:16:15.115693 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" podUID="881230db-85c7-4159-b1dd-f537ed6baece" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.95:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:15 crc kubenswrapper[4723]: I0309 14:16:15.157143 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f" podUID="c3f17509-7e0b-452d-b3ca-0a3210159f17" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:15 crc kubenswrapper[4723]: I0309 14:16:15.363346 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="3e47df78-6587-4f83-a1c9-dcaf0aa9b73c" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.181:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:15 crc kubenswrapper[4723]: I0309 14:16:15.363758 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="3e47df78-6587-4f83-a1c9-dcaf0aa9b73c" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.181:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:15 crc kubenswrapper[4723]: I0309 14:16:15.619238 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-76577b8ddd-8748r" podUID="b9b75469-0c5d-47b4-b75c-28cdf8316167" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:15 crc kubenswrapper[4723]: I0309 14:16:15.805483 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="5b572550-466a-4fae-9334-0a471e7c39be" containerName="prometheus" probeResult="failure" output="command timed out" Mar 09 14:16:15 crc kubenswrapper[4723]: I0309 
14:16:15.805782 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="5b572550-466a-4fae-9334-0a471e7c39be" containerName="prometheus" probeResult="failure" output="command timed out" Mar 09 14:16:15 crc kubenswrapper[4723]: I0309 14:16:15.881121 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" podUID="3fab2f82-df4b-417b-8188-0c4f455df30c" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:15 crc kubenswrapper[4723]: I0309 14:16:15.881121 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-jqk9s" podUID="54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:15 crc kubenswrapper[4723]: I0309 14:16:15.881155 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-jqk9s" podUID="54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:15 crc kubenswrapper[4723]: I0309 14:16:15.881220 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-t8zw4" podUID="3fab2f82-df4b-417b-8188-0c4f455df30c" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:15 crc kubenswrapper[4723]: I0309 14:16:15.881959 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-jqk9s" podUID="54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:16 crc kubenswrapper[4723]: I0309 14:16:16.002282 4723 patch_prober.go:28] interesting pod/logging-loki-gateway-867fb59d66-pxpr6 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:16 crc kubenswrapper[4723]: I0309 14:16:16.002658 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-867fb59d66-pxpr6" podUID="3dcae42d-f05a-41f1-9d6a-11ccb28eb379" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:16 crc kubenswrapper[4723]: I0309 14:16:16.089698 4723 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:16 crc kubenswrapper[4723]: I0309 14:16:16.089816 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:16 crc kubenswrapper[4723]: I0309 14:16:16.222836 4723 patch_prober.go:28] interesting pod/nmstate-webhook-786f45cff4-msbbv container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.68:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:16 crc kubenswrapper[4723]: I0309 14:16:16.223148 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-786f45cff4-msbbv" podUID="3b47483e-69de-403b-ab71-5c6665c0a36d" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.68:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:16 crc kubenswrapper[4723]: I0309 14:16:16.413047 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-86ddb6bd46-7hbxv" podUID="351b987c-4b9a-4bf6-8832-a0504c9c16ed" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:16 crc kubenswrapper[4723]: I0309 14:16:16.454670 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-86ddb6bd46-7hbxv" podUID="351b987c-4b9a-4bf6-8832-a0504c9c16ed" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:16 crc kubenswrapper[4723]: I0309 14:16:16.559983 4723 patch_prober.go:28] interesting pod/logging-loki-gateway-867fb59d66-2pwh2 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8083/ready\": context deadline exceeded" start-of-body= Mar 09 14:16:16 crc kubenswrapper[4723]: I0309 14:16:16.560055 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-867fb59d66-2pwh2" podUID="0bd030fd-cf38-4403-971f-4170fdc71bb0" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": context deadline exceeded" Mar 09 14:16:16 crc kubenswrapper[4723]: I0309 14:16:16.991037 4723 patch_prober.go:28] interesting pod/console-85b7499c-sqsr9 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:17 crc kubenswrapper[4723]: I0309 14:16:16.991373 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-85b7499c-sqsr9" podUID="731ffd33-861f-45a8-a54a-5a18dcca5ae6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:17 crc kubenswrapper[4723]: I0309 14:16:17.459327 4723 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-lfdgl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get 
\"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:17 crc kubenswrapper[4723]: I0309 14:16:17.459366 4723 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-lfdgl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:17 crc kubenswrapper[4723]: I0309 14:16:17.459391 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl" podUID="2a058d13-df7c-45fc-9c82-83cd7d61ffbd" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:17 crc kubenswrapper[4723]: I0309 14:16:17.459442 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-lfdgl" podUID="2a058d13-df7c-45fc-9c82-83cd7d61ffbd" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:17 crc kubenswrapper[4723]: I0309 14:16:17.898707 4723 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:17 crc kubenswrapper[4723]: I0309 14:16:17.899188 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:17 crc kubenswrapper[4723]: I0309 14:16:17.948182 4723 patch_prober.go:28] interesting pod/oauth-openshift-6775b6d8cc-n5skm container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:17 crc kubenswrapper[4723]: I0309 14:16:17.948253 4723 patch_prober.go:28] interesting pod/oauth-openshift-6775b6d8cc-n5skm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:17 crc kubenswrapper[4723]: I0309 14:16:17.948272 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" podUID="f0275b6b-90ed-4c22-ae68-834792f8e5dd" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers)" Mar 09 14:16:17 crc kubenswrapper[4723]: I0309 14:16:17.948284 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6775b6d8cc-n5skm" podUID="f0275b6b-90ed-4c22-ae68-834792f8e5dd" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:18 crc kubenswrapper[4723]: I0309 14:16:18.537288 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-6wj4n" podUID="7b820c49-0780-4d8d-a069-6cecf6ee0f1e" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:18 crc kubenswrapper[4723]: I0309 14:16:18.938628 4723 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7nz4v container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:18 crc kubenswrapper[4723]: I0309 14:16:18.938835 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" podUID="0c45ecd0-a916-4ef0-80aa-cfe88212d0ed" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:18 crc kubenswrapper[4723]: I0309 14:16:18.982056 4723 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7nz4v container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:18 crc kubenswrapper[4723]: I0309 14:16:18.982113 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7nz4v" podUID="0c45ecd0-a916-4ef0-80aa-cfe88212d0ed" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.70:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:18 crc kubenswrapper[4723]: I0309 14:16:18.991993 4723 patch_prober.go:28] interesting pod/route-controller-manager-7757f9dd75-n7jz4 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:18 crc kubenswrapper[4723]: I0309 14:16:18.992055 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" podUID="bfc8cb7a-3df5-4dd3-8520-82316314e76b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:18 crc kubenswrapper[4723]: I0309 14:16:18.993274 4723 patch_prober.go:28] interesting pod/route-controller-manager-7757f9dd75-n7jz4 container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:18 crc kubenswrapper[4723]: I0309 14:16:18.993337 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7757f9dd75-n7jz4" podUID="bfc8cb7a-3df5-4dd3-8520-82316314e76b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.77:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.002311 4723 patch_prober.go:28] interesting pod/controller-manager-774cb675cc-hwvwx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.002375 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" podUID="f52a68c7-f2a2-4c16-a45b-b821debecd6d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.002437 4723 patch_prober.go:28] interesting pod/controller-manager-774cb675cc-hwvwx container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.002453 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" podUID="f52a68c7-f2a2-4c16-a45b-b821debecd6d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.006013 4723 patch_prober.go:28] interesting pod/thanos-querier-f994cb665-42jsl container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.006098 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-f994cb665-42jsl" podUID="338186cb-4546-4740-bba3-c1c430d8aacc" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.86:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.017023 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.024116 4723 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="controller-manager" containerStatusID={"Type":"cri-o","ID":"9926e9b2436ede65b865baf0c89a935a8a0f1e4c6e755d249aa36f187081c72d"} pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" containerMessage="Container controller-manager failed liveness probe, will be restarted" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.053305 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" podUID="f52a68c7-f2a2-4c16-a45b-b821debecd6d" containerName="controller-manager" containerID="cri-o://9926e9b2436ede65b865baf0c89a935a8a0f1e4c6e755d249aa36f187081c72d" gracePeriod=30 Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.603109 4723 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-2nqwq container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.603125 4723 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-2nqwq container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.603222 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podUID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.603315 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podUID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.620969 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.621019 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.624295 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="operator" containerStatusID={"Type":"cri-o","ID":"2c14553978a656aee48096498a7ef6cbe4de47d9224deed86c2c389a5cbe69aa"} pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" containerMessage="Container operator failed liveness probe, will be restarted" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.624339 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podUID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8" containerName="operator" containerID="cri-o://2c14553978a656aee48096498a7ef6cbe4de47d9224deed86c2c389a5cbe69aa" gracePeriod=30 Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.644007 4723 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-25fp4 
container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.21:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.644075 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-25fp4" podUID="5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.21:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.644146 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-25fp4" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.699611 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-25fp4" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.715581 4723 trace.go:236] Trace[126864727]: "Calculate volume metrics of swift for pod openstack/swift-storage-0" (09-Mar-2026 14:16:13.645) (total time: 6065ms): Mar 09 14:16:19 crc kubenswrapper[4723]: Trace[126864727]: [6.065136755s] [6.065136755s] END Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.715592 4723 trace.go:236] Trace[614717976]: "Calculate volume metrics of ovndbcluster-nb-etc-ovn for pod openstack/ovsdbserver-nb-0" (09-Mar-2026 14:16:15.107) (total time: 4603ms): Mar 09 14:16:19 crc kubenswrapper[4723]: Trace[614717976]: [4.603879813s] [4.603879813s] END Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.970694 4723 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-lwpcl container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.971038 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" podUID="61312a96-b8f6-431c-b24e-0046271cf40f" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.971093 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.972642 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"44b4eec54c1a41912c42ad25c6f2da3afbee7664d18e1c63cf74a5d2e99e28aa"} pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" containerMessage="Container authentication-operator failed liveness probe, will be restarted" Mar 09 14:16:19 crc kubenswrapper[4723]: I0309 14:16:19.972689 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" podUID="61312a96-b8f6-431c-b24e-0046271cf40f" containerName="authentication-operator" 
containerID="cri-o://44b4eec54c1a41912c42ad25c6f2da3afbee7664d18e1c63cf74a5d2e99e28aa" gracePeriod=30 Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.167196 4723 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qp2l4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.167274 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" podUID="3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.167209 4723 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qp2l4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.167697 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qp2l4" podUID="3f57f760-4139-4eb7-b7e9-5f0bcd7cb3eb" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.451066 4723 patch_prober.go:28] interesting pod/console-operator-58897d9998-wkpzg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.451122 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" podUID="1f6455f2-bcad-4e11-8ef5-a272b406be88" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.451198 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.451562 4723 patch_prober.go:28] interesting pod/console-operator-58897d9998-wkpzg container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.451620 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" podUID="1f6455f2-bcad-4e11-8ef5-a272b406be88" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.451674 4723 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.452135 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console-operator" containerStatusID={"Type":"cri-o","ID":"b6cf57e91e838d2637897bd9893d85047af5a69c0747e4b77c808af59b8169e0"} pod="openshift-console-operator/console-operator-58897d9998-wkpzg" containerMessage="Container console-operator failed liveness probe, will be restarted" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.452174 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" podUID="1f6455f2-bcad-4e11-8ef5-a272b406be88" containerName="console-operator" containerID="cri-o://b6cf57e91e838d2637897bd9893d85047af5a69c0747e4b77c808af59b8169e0" gracePeriod=30 Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.502182 4723 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fhxc9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.502257 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" podUID="3321a715-9c5f-4417-bec1-4ba3ccce946c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.502343 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.502759 4723 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fhxc9 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.502781 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" podUID="3321a715-9c5f-4417-bec1-4ba3ccce946c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.502804 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.502939 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.503691 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="packageserver" containerStatusID={"Type":"cri-o","ID":"46a6f990613d3789d7d30c91ec7ff1b64d39a3d0c008963524b8fc8f38728b1d"} 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" containerMessage="Container packageserver failed liveness probe, will be restarted" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.503741 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" podUID="3321a715-9c5f-4417-bec1-4ba3ccce946c" containerName="packageserver" containerID="cri-o://46a6f990613d3789d7d30c91ec7ff1b64d39a3d0c008963524b8fc8f38728b1d" gracePeriod=30 Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.523492 4723 patch_prober.go:28] interesting pod/logging-loki-distributor-5d5548c9f5-zg9mx container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.523557 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-zg9mx" podUID="04edbd9e-fd1b-4346-97ce-adfb011720a4" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.689697 4723 patch_prober.go:28] interesting pod/logging-loki-querier-76bf7b6d45-2d54b container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.690111 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-76bf7b6d45-2d54b" podUID="8b9cdd14-6347-4701-9825-1ced6362cd8c" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.736387 4723 patch_prober.go:28] interesting pod/console-operator-58897d9998-wkpzg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": EOF" start-of-body= Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.736457 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" podUID="1f6455f2-bcad-4e11-8ef5-a272b406be88" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": EOF" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.769684 4723 patch_prober.go:28] interesting pod/logging-loki-query-frontend-6d6859c548-4mdnv container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.769746 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-4mdnv" podUID="9cd1997b-cced-41c1-8a27-77321ffc48ae" 
containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.910131 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-wn4hg" podUID="901259b6-1c9d-49ca-9c13-4626d65c68fa" containerName="registry-server" probeResult="failure" output=< Mar 09 14:16:20 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:16:20 crc kubenswrapper[4723]: > Mar 09 14:16:20 crc kubenswrapper[4723]: I0309 14:16:20.910390 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-wn4hg" podUID="901259b6-1c9d-49ca-9c13-4626d65c68fa" containerName="registry-server" probeResult="failure" output=< Mar 09 14:16:20 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:16:20 crc kubenswrapper[4723]: > Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.081092 4723 patch_prober.go:28] interesting pod/router-default-5444994796-fzrk5 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.081137 4723 patch_prober.go:28] interesting pod/router-default-5444994796-fzrk5 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.081155 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-fzrk5" podUID="f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.081201 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-fzrk5" podUID="f386e3de-9b2f-478d-b7c5-8ac3f8aff8d1" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.159654 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" event={"ID":"f52a68c7-f2a2-4c16-a45b-b821debecd6d","Type":"ContainerDied","Data":"9926e9b2436ede65b865baf0c89a935a8a0f1e4c6e755d249aa36f187081c72d"} Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.160351 4723 generic.go:334] "Generic (PLEG): container finished" podID="f52a68c7-f2a2-4c16-a45b-b821debecd6d" containerID="9926e9b2436ede65b865baf0c89a935a8a0f1e4c6e755d249aa36f187081c72d" exitCode=0 Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.162050 4723 generic.go:334] "Generic (PLEG): container finished" podID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8" containerID="2c14553978a656aee48096498a7ef6cbe4de47d9224deed86c2c389a5cbe69aa" exitCode=0 Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.162153 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" event={"ID":"2bc0446d-1f37-4214-bd0a-0f7c64f844a8","Type":"ContainerDied","Data":"2c14553978a656aee48096498a7ef6cbe4de47d9224deed86c2c389a5cbe69aa"} Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.170700 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-wkpzg_1f6455f2-bcad-4e11-8ef5-a272b406be88/console-operator/0.log" Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.170778 4723 generic.go:334] "Generic (PLEG): container finished" podID="1f6455f2-bcad-4e11-8ef5-a272b406be88" containerID="b6cf57e91e838d2637897bd9893d85047af5a69c0747e4b77c808af59b8169e0" exitCode=1 Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.170920 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" event={"ID":"1f6455f2-bcad-4e11-8ef5-a272b406be88","Type":"ContainerDied","Data":"b6cf57e91e838d2637897bd9893d85047af5a69c0747e4b77c808af59b8169e0"} Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.173147 4723 generic.go:334] "Generic (PLEG): container finished" podID="61312a96-b8f6-431c-b24e-0046271cf40f" containerID="44b4eec54c1a41912c42ad25c6f2da3afbee7664d18e1c63cf74a5d2e99e28aa" exitCode=0 Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.173184 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" event={"ID":"61312a96-b8f6-431c-b24e-0046271cf40f","Type":"ContainerDied","Data":"44b4eec54c1a41912c42ad25c6f2da3afbee7664d18e1c63cf74a5d2e99e28aa"} Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.313429 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.393737 4723 patch_prober.go:28] interesting pod/metrics-server-7f69f56458-z9f7c container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.93:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.393806 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" podUID="9a4a344f-6f96-422b-9468-56c8e988ad3f" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.93:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.393899 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.400532 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="metrics-server" containerStatusID={"Type":"cri-o","ID":"1cf976ee2556a62cbe7223703cbe5dc18a2e0cef64591475186da4c9dd8de172"} pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" containerMessage="Container metrics-server failed liveness probe, will be restarted" Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.400602 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" podUID="9a4a344f-6f96-422b-9468-56c8e988ad3f" containerName="metrics-server" 
containerID="cri-o://1cf976ee2556a62cbe7223703cbe5dc18a2e0cef64591475186da4c9dd8de172" gracePeriod=170 Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.503481 4723 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fhxc9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.503540 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" podUID="3321a715-9c5f-4417-bec1-4ba3ccce946c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.796964 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17" containerName="galera" probeResult="failure" output="command timed out" Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.797322 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-galera-0" Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.798078 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"ab2fbb8f1c9652233a9ef65bfa17a386d216ea38a67831da48d799997c4ddebd"} pod="openstack/openstack-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.798312 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17" containerName="galera" probeResult="failure" output="command timed out" Mar 09 14:16:21 crc kubenswrapper[4723]: I0309 14:16:21.798427 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 09 14:16:22 crc kubenswrapper[4723]: E0309 14:16:22.104918 4723 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf18bf19_d23a_471f_8074_2eaaa7c4aead.slice/crio-82ad22c202020bad5c2bc30629a84c341955f73333b0feea5d43b28474a7488e.scope\": RecentStats: unable to find data in memory cache]" Mar 09 14:16:22 crc kubenswrapper[4723]: I0309 14:16:22.232522 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lwpcl" event={"ID":"61312a96-b8f6-431c-b24e-0046271cf40f","Type":"ContainerStarted","Data":"927d5cf68ebbf659b6bafb6959bceb80222513f91a5bb8a6716862c598f35c4b"} Mar 09 14:16:22 crc kubenswrapper[4723]: I0309 14:16:22.236324 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" event={"ID":"f52a68c7-f2a2-4c16-a45b-b821debecd6d","Type":"ContainerStarted","Data":"ef266dd1495e65b8770d4ed0a7e7d96731fe0213e431d08b0cb6559777272319"} Mar 09 14:16:22 crc kubenswrapper[4723]: I0309 14:16:22.237118 4723 patch_prober.go:28] interesting pod/controller-manager-774cb675cc-hwvwx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": 
dial tcp 10.217.0.78:8443: connect: connection refused" start-of-body= Mar 09 14:16:22 crc kubenswrapper[4723]: I0309 14:16:22.237164 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" podUID="f52a68c7-f2a2-4c16-a45b-b821debecd6d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused" Mar 09 14:16:22 crc kubenswrapper[4723]: I0309 14:16:22.237578 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 14:16:22 crc kubenswrapper[4723]: I0309 14:16:22.241101 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-wkpzg_1f6455f2-bcad-4e11-8ef5-a272b406be88/console-operator/0.log" Mar 09 14:16:22 crc kubenswrapper[4723]: I0309 14:16:22.241151 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" event={"ID":"1f6455f2-bcad-4e11-8ef5-a272b406be88","Type":"ContainerStarted","Data":"d03018c1db0e6af7f55a166cf62b020ed5e22f24310618babd41144f41bbe978"} Mar 09 14:16:22 crc kubenswrapper[4723]: I0309 14:16:22.241999 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 14:16:22 crc kubenswrapper[4723]: I0309 14:16:22.242092 4723 patch_prober.go:28] interesting pod/console-operator-58897d9998-wkpzg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 09 14:16:22 crc kubenswrapper[4723]: I0309 14:16:22.242129 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" podUID="1f6455f2-bcad-4e11-8ef5-a272b406be88" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 09 14:16:22 crc kubenswrapper[4723]: I0309 14:16:22.703199 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-89rtm" podUID="9646c273-606f-4551-82dd-39e09007dc17" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.195347 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-rwzzl" podUID="76830983-65b6-495a-8283-c9e2df80562b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.253826 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" event={"ID":"2bc0446d-1f37-4214-bd0a-0f7c64f844a8","Type":"ContainerStarted","Data":"2045d9fbf5308e5cf90cd2fdb3b01df844d2e79bc54307abcba07615ad60cbb6"} Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.255253 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.255484 4723 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-2nqwq container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": dial tcp 10.217.0.8:8081: connect: connection refused" start-of-body= Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.255535 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podUID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": dial tcp 10.217.0.8:8081: connect: connection refused" Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.258977 4723 generic.go:334] "Generic (PLEG): container finished" podID="df18bf19-d23a-471f-8074-2eaaa7c4aead" containerID="82ad22c202020bad5c2bc30629a84c341955f73333b0feea5d43b28474a7488e" exitCode=0 Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.259029 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df18bf19-d23a-471f-8074-2eaaa7c4aead","Type":"ContainerDied","Data":"82ad22c202020bad5c2bc30629a84c341955f73333b0feea5d43b28474a7488e"} Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.261608 4723 generic.go:334] "Generic (PLEG): container finished" podID="3321a715-9c5f-4417-bec1-4ba3ccce946c" containerID="46a6f990613d3789d7d30c91ec7ff1b64d39a3d0c008963524b8fc8f38728b1d" exitCode=0 Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.261714 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" event={"ID":"3321a715-9c5f-4417-bec1-4ba3ccce946c","Type":"ContainerDied","Data":"46a6f990613d3789d7d30c91ec7ff1b64d39a3d0c008963524b8fc8f38728b1d"} Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.262201 4723 patch_prober.go:28] interesting pod/console-operator-58897d9998-wkpzg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.262250 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" podUID="1f6455f2-bcad-4e11-8ef5-a272b406be88" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.262374 4723 patch_prober.go:28] interesting pod/controller-manager-774cb675cc-hwvwx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused" start-of-body= Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.262398 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" podUID="f52a68c7-f2a2-4c16-a45b-b821debecd6d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused" Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.314065 4723 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-mt7hb" podUID="f1620e57-58ba-4313-bba4-f5ece039f9f7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.661980 4723 patch_prober.go:28] interesting pod/monitoring-plugin-66b86d466-5tr4x container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.662037 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x" podUID="289dc72b-221e-415a-8c97-3889de9ceaed" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.662112 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x" Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.700803 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-9kl8t" podUID="194a48e1-f053-4aa1-bdfe-07aa2a8a208e" containerName="registry-server" probeResult="failure" output=< Mar 09 14:16:23 crc kubenswrapper[4723]: timeout: health rpc did not complete within 1s Mar 09 14:16:23 crc kubenswrapper[4723]: > Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.718276 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-9kl8t" podUID="194a48e1-f053-4aa1-bdfe-07aa2a8a208e" containerName="registry-server" probeResult="failure" output=< Mar 09 14:16:23 crc kubenswrapper[4723]: timeout: health rpc did not complete within 1s Mar 09 14:16:23 crc kubenswrapper[4723]: > Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.752338 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-4czcc" podUID="e93b778c-c10f-4da5-a3c2-91010b4b3aab" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:23 crc kubenswrapper[4723]: I0309 14:16:23.752614 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-6qstb" podUID="8554b7c9-0bd7-4326-b906-fe07dcdce9da" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.100960 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-66b86d466-5tr4x" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.115590 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bs5ch"] Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.126842 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.272621 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" event={"ID":"3321a715-9c5f-4417-bec1-4ba3ccce946c","Type":"ContainerStarted","Data":"6210fdf2858dbc66e7d3a9f6f3d4f8946206fb4b074d94b241f7ce2a5be0eea8"} Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.273048 4723 patch_prober.go:28] interesting pod/controller-manager-774cb675cc-hwvwx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused" start-of-body= Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.273058 4723 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-2nqwq container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": dial tcp 10.217.0.8:8081: connect: connection refused" start-of-body= Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.273077 4723 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fhxc9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body= Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.273088 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" podUID="f52a68c7-f2a2-4c16-a45b-b821debecd6d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.273115 4723 patch_prober.go:28] interesting pod/console-operator-58897d9998-wkpzg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.273106 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podUID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": dial tcp 10.217.0.8:8081: connect: connection refused" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.273106 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" podUID="3321a715-9c5f-4417-bec1-4ba3ccce946c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.273137 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" podUID="1f6455f2-bcad-4e11-8ef5-a272b406be88" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.299304 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bec22b2-9167-4c92-837d-167bd5046273-utilities\") pod \"redhat-operators-bs5ch\" (UID: \"6bec22b2-9167-4c92-837d-167bd5046273\") " pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.299466 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bec22b2-9167-4c92-837d-167bd5046273-catalog-content\") pod \"redhat-operators-bs5ch\" (UID: \"6bec22b2-9167-4c92-837d-167bd5046273\") " pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.299945 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxwm4\" (UniqueName: \"kubernetes.io/projected/6bec22b2-9167-4c92-837d-167bd5046273-kube-api-access-cxwm4\") pod \"redhat-operators-bs5ch\" (UID: \"6bec22b2-9167-4c92-837d-167bd5046273\") " pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.402062 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bec22b2-9167-4c92-837d-167bd5046273-utilities\") pod \"redhat-operators-bs5ch\" (UID: \"6bec22b2-9167-4c92-837d-167bd5046273\") " pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.402128 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bec22b2-9167-4c92-837d-167bd5046273-catalog-content\") pod \"redhat-operators-bs5ch\" (UID: \"6bec22b2-9167-4c92-837d-167bd5046273\") " pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.402999 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxwm4\" (UniqueName: \"kubernetes.io/projected/6bec22b2-9167-4c92-837d-167bd5046273-kube-api-access-cxwm4\") pod \"redhat-operators-bs5ch\" (UID: \"6bec22b2-9167-4c92-837d-167bd5046273\") " pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.422672 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bec22b2-9167-4c92-837d-167bd5046273-catalog-content\") pod \"redhat-operators-bs5ch\" (UID: \"6bec22b2-9167-4c92-837d-167bd5046273\") " pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.422687 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bec22b2-9167-4c92-837d-167bd5046273-utilities\") pod \"redhat-operators-bs5ch\" (UID: \"6bec22b2-9167-4c92-837d-167bd5046273\") " pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.456503 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxwm4\" (UniqueName: \"kubernetes.io/projected/6bec22b2-9167-4c92-837d-167bd5046273-kube-api-access-cxwm4\") pod \"redhat-operators-bs5ch\" (UID: \"6bec22b2-9167-4c92-837d-167bd5046273\") " pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.457283 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.573374 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17" containerName="galera" containerID="cri-o://ab2fbb8f1c9652233a9ef65bfa17a386d216ea38a67831da48d799997c4ddebd" gracePeriod=28 Mar 09 14:16:24 crc kubenswrapper[4723]: I0309 14:16:24.628304 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bs5ch"] Mar 09 14:16:25 crc kubenswrapper[4723]: I0309 14:16:25.115093 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" podUID="881230db-85c7-4159-b1dd-f537ed6baece" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.95:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:25 crc kubenswrapper[4723]: I0309 14:16:25.115471 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 14:16:25 crc kubenswrapper[4723]: I0309 14:16:25.115116 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" podUID="881230db-85c7-4159-b1dd-f537ed6baece" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.95:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:16:25 crc kubenswrapper[4723]: I0309 14:16:25.115666 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 14:16:25 crc kubenswrapper[4723]: I0309 14:16:25.116768 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="webhook-server" containerStatusID={"Type":"cri-o","ID":"01c3fe4dfe2fc5ad71c06c36fc4c7ee15f4c12f72f5750618c35b20c33f89878"} pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" containerMessage="Container webhook-server failed liveness probe, will be restarted" Mar 09 14:16:25 crc kubenswrapper[4723]: I0309 14:16:25.116820 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" podUID="881230db-85c7-4159-b1dd-f537ed6baece" containerName="webhook-server" containerID="cri-o://01c3fe4dfe2fc5ad71c06c36fc4c7ee15f4c12f72f5750618c35b20c33f89878" gracePeriod=2 Mar 09 14:16:25 crc kubenswrapper[4723]: I0309 14:16:25.342346 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df18bf19-d23a-471f-8074-2eaaa7c4aead","Type":"ContainerStarted","Data":"67a075831722c616e79b73d64347bdcc649e529f16b65195a55cb18e7d5c0465"} Mar 09 14:16:25 crc kubenswrapper[4723]: I0309 14:16:25.342831 4723 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fhxc9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body= Mar 09 14:16:25 crc kubenswrapper[4723]: I0309 14:16:25.342898 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" podUID="3321a715-9c5f-4417-bec1-4ba3ccce946c" 
containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" Mar 09 14:16:25 crc kubenswrapper[4723]: I0309 14:16:25.343182 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" Mar 09 14:16:25 crc kubenswrapper[4723]: I0309 14:16:25.343992 4723 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-2nqwq container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": dial tcp 10.217.0.8:8081: connect: connection refused" start-of-body= Mar 09 14:16:25 crc kubenswrapper[4723]: I0309 14:16:25.344034 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podUID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": dial tcp 10.217.0.8:8081: connect: connection refused" Mar 09 14:16:25 crc kubenswrapper[4723]: I0309 14:16:25.715512 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="cb802a89-59e3-4b45-bb49-20b980e06a57" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 14:16:26 crc kubenswrapper[4723]: I0309 14:16:26.357807 4723 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fhxc9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body= Mar 09 14:16:26 crc kubenswrapper[4723]: I0309 14:16:26.358073 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" podUID="3321a715-9c5f-4417-bec1-4ba3ccce946c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" Mar 09 14:16:27 crc kubenswrapper[4723]: I0309 14:16:27.426053 4723 generic.go:334] "Generic (PLEG): container finished" podID="881230db-85c7-4159-b1dd-f537ed6baece" containerID="01c3fe4dfe2fc5ad71c06c36fc4c7ee15f4c12f72f5750618c35b20c33f89878" exitCode=0 Mar 09 14:16:27 crc kubenswrapper[4723]: I0309 14:16:27.426675 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" event={"ID":"881230db-85c7-4159-b1dd-f537ed6baece","Type":"ContainerDied","Data":"01c3fe4dfe2fc5ad71c06c36fc4c7ee15f4c12f72f5750618c35b20c33f89878"} Mar 09 14:16:28 crc kubenswrapper[4723]: I0309 14:16:28.006982 4723 patch_prober.go:28] interesting pod/controller-manager-774cb675cc-hwvwx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused" start-of-body= Mar 09 14:16:28 crc kubenswrapper[4723]: I0309 14:16:28.007268 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" podUID="f52a68c7-f2a2-4c16-a45b-b821debecd6d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused" Mar 09 14:16:28 crc kubenswrapper[4723]: I0309 
14:16:28.096409 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="cb802a89-59e3-4b45-bb49-20b980e06a57" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 14:16:28 crc kubenswrapper[4723]: I0309 14:16:28.322816 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-jrfpb"] Mar 09 14:16:28 crc kubenswrapper[4723]: I0309 14:16:28.337467 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bs5ch"] Mar 09 14:16:28 crc kubenswrapper[4723]: W0309 14:16:28.410949 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d18d891_99a0_4089_86df_166cfa4297a6.slice/crio-bbaf034a6ed4c133270b8efc110ded4d848118848c6e214b53358fa7576e49af WatchSource:0}: Error finding container bbaf034a6ed4c133270b8efc110ded4d848118848c6e214b53358fa7576e49af: Status 404 returned error can't find the container with id bbaf034a6ed4c133270b8efc110ded4d848118848c6e214b53358fa7576e49af Mar 09 14:16:28 crc kubenswrapper[4723]: I0309 14:16:28.447139 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" event={"ID":"881230db-85c7-4159-b1dd-f537ed6baece","Type":"ContainerStarted","Data":"63d98c3b80933bd956c5204d3b0aaef531585c4f546138fffa3e1ef881fa321e"} Mar 09 14:16:28 crc kubenswrapper[4723]: I0309 14:16:28.448437 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 14:16:28 crc kubenswrapper[4723]: W0309 14:16:28.459774 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bec22b2_9167_4c92_837d_167bd5046273.slice/crio-593eadccb5822500188e0a6d67d0a380d7401241fb186a68c53850caa6e83dab WatchSource:0}: Error finding container 593eadccb5822500188e0a6d67d0a380d7401241fb186a68c53850caa6e83dab: Status 404 returned error can't find the container with id 593eadccb5822500188e0a6d67d0a380d7401241fb186a68c53850caa6e83dab Mar 09 14:16:28 crc kubenswrapper[4723]: I0309 14:16:28.522593 4723 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-2nqwq container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": dial tcp 10.217.0.8:8081: connect: connection refused" start-of-body= Mar 09 14:16:28 crc kubenswrapper[4723]: I0309 14:16:28.522655 4723 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podUID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": dial tcp 10.217.0.8:8081: connect: connection refused" Mar 09 14:16:28 crc kubenswrapper[4723]: I0309 14:16:28.523496 4723 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-2nqwq container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": dial tcp 10.217.0.8:8081: connect: connection refused" start-of-body= Mar 09 14:16:28 crc kubenswrapper[4723]: I0309 14:16:28.523575 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" podUID="2bc0446d-1f37-4214-bd0a-0f7c64f844a8" containerName="operator" 
probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": dial tcp 10.217.0.8:8081: connect: connection refused" Mar 09 14:16:29 crc kubenswrapper[4723]: I0309 14:16:29.459295 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551096-jrfpb" event={"ID":"8d18d891-99a0-4089-86df-166cfa4297a6","Type":"ContainerStarted","Data":"bbaf034a6ed4c133270b8efc110ded4d848118848c6e214b53358fa7576e49af"} Mar 09 14:16:29 crc kubenswrapper[4723]: I0309 14:16:29.460776 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs5ch" event={"ID":"6bec22b2-9167-4c92-837d-167bd5046273","Type":"ContainerStarted","Data":"593eadccb5822500188e0a6d67d0a380d7401241fb186a68c53850caa6e83dab"} Mar 09 14:16:29 crc kubenswrapper[4723]: I0309 14:16:29.555236 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhxc9" Mar 09 14:16:29 crc kubenswrapper[4723]: I0309 14:16:29.705118 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wkpzg" Mar 09 14:16:29 crc kubenswrapper[4723]: I0309 14:16:29.846114 4723 trace.go:236] Trace[1601349796]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-0" (09-Mar-2026 14:16:28.325) (total time: 1516ms): Mar 09 14:16:29 crc kubenswrapper[4723]: Trace[1601349796]: [1.516520004s] [1.516520004s] END Mar 09 14:16:29 crc kubenswrapper[4723]: I0309 14:16:29.967142 4723 trace.go:236] Trace[645000457]: "Calculate volume metrics of glance for pod openstack/glance-default-external-api-0" (09-Mar-2026 14:16:27.901) (total time: 2065ms): Mar 09 14:16:29 crc kubenswrapper[4723]: Trace[645000457]: [2.065874431s] [2.065874431s] END Mar 09 14:16:30 crc kubenswrapper[4723]: E0309 14:16:30.156172 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab2fbb8f1c9652233a9ef65bfa17a386d216ea38a67831da48d799997c4ddebd is running failed: container process not found" containerID="ab2fbb8f1c9652233a9ef65bfa17a386d216ea38a67831da48d799997c4ddebd" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 09 14:16:30 crc kubenswrapper[4723]: E0309 14:16:30.168249 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab2fbb8f1c9652233a9ef65bfa17a386d216ea38a67831da48d799997c4ddebd is running failed: container process not found" containerID="ab2fbb8f1c9652233a9ef65bfa17a386d216ea38a67831da48d799997c4ddebd" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 09 14:16:30 crc kubenswrapper[4723]: E0309 14:16:30.177361 4723 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab2fbb8f1c9652233a9ef65bfa17a386d216ea38a67831da48d799997c4ddebd is running failed: container process not found" containerID="ab2fbb8f1c9652233a9ef65bfa17a386d216ea38a67831da48d799997c4ddebd" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 09 14:16:30 crc kubenswrapper[4723]: E0309 14:16:30.177435 4723 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab2fbb8f1c9652233a9ef65bfa17a386d216ea38a67831da48d799997c4ddebd is running 
failed: container process not found" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17" containerName="galera" Mar 09 14:16:30 crc kubenswrapper[4723]: I0309 14:16:30.515104 4723 generic.go:334] "Generic (PLEG): container finished" podID="6bec22b2-9167-4c92-837d-167bd5046273" containerID="18b4d85dc2538df6f2fc12d82c445d244071aad058358d51786f21a032e68830" exitCode=0 Mar 09 14:16:30 crc kubenswrapper[4723]: I0309 14:16:30.515185 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs5ch" event={"ID":"6bec22b2-9167-4c92-837d-167bd5046273","Type":"ContainerDied","Data":"18b4d85dc2538df6f2fc12d82c445d244071aad058358d51786f21a032e68830"} Mar 09 14:16:30 crc kubenswrapper[4723]: I0309 14:16:30.549952 4723 generic.go:334] "Generic (PLEG): container finished" podID="5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17" containerID="ab2fbb8f1c9652233a9ef65bfa17a386d216ea38a67831da48d799997c4ddebd" exitCode=0 Mar 09 14:16:30 crc kubenswrapper[4723]: I0309 14:16:30.551340 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17","Type":"ContainerDied","Data":"ab2fbb8f1c9652233a9ef65bfa17a386d216ea38a67831da48d799997c4ddebd"} Mar 09 14:16:31 crc kubenswrapper[4723]: I0309 14:16:31.066997 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="cb802a89-59e3-4b45-bb49-20b980e06a57" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 14:16:31 crc kubenswrapper[4723]: I0309 14:16:31.067529 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 09 14:16:31 crc kubenswrapper[4723]: I0309 14:16:31.069358 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"f293aebdd6945999452439de2837f319ff8a2c8274c1112a5350754b0b9c8f7e"} pod="openstack/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted" Mar 09 14:16:31 crc kubenswrapper[4723]: I0309 14:16:31.069560 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cb802a89-59e3-4b45-bb49-20b980e06a57" containerName="cinder-scheduler" containerID="cri-o://f293aebdd6945999452439de2837f319ff8a2c8274c1112a5350754b0b9c8f7e" gracePeriod=30 Mar 09 14:16:31 crc kubenswrapper[4723]: I0309 14:16:31.565311 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551096-jrfpb" event={"ID":"8d18d891-99a0-4089-86df-166cfa4297a6","Type":"ContainerStarted","Data":"5e5f8e737f5804dbd716d4de3423a5591e0428c02cba590bfa21c25feb4c3ed8"} Mar 09 14:16:31 crc kubenswrapper[4723]: I0309 14:16:31.574039 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs5ch" event={"ID":"6bec22b2-9167-4c92-837d-167bd5046273","Type":"ContainerStarted","Data":"bcbfcf53e3ff3b61843eb983b6f95a7a8600b1ba099999aeddeb6e1ef8d8e9f0"} Mar 09 14:16:31 crc kubenswrapper[4723]: I0309 14:16:31.582837 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17","Type":"ContainerStarted","Data":"c20e896a3dd7f08cf679a9f59156614edad8f8bef42fc30cd883decb4774810d"} Mar 09 14:16:31 crc kubenswrapper[4723]: I0309 14:16:31.596390 4723 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551096-jrfpb" podStartSLOduration=29.023022694 podStartE2EDuration="30.596372213s" podCreationTimestamp="2026-03-09 14:16:01 +0000 UTC" firstStartedPulling="2026-03-09 14:16:28.457611586 +0000 UTC m=+4662.472079116" lastFinishedPulling="2026-03-09 14:16:30.030961105 +0000 UTC m=+4664.045428635" observedRunningTime="2026-03-09 14:16:31.591328359 +0000 UTC m=+4665.605795919" watchObservedRunningTime="2026-03-09 14:16:31.596372213 +0000 UTC m=+4665.610839753" Mar 09 14:16:33 crc kubenswrapper[4723]: I0309 14:16:33.607758 4723 generic.go:334] "Generic (PLEG): container finished" podID="cb802a89-59e3-4b45-bb49-20b980e06a57" containerID="f293aebdd6945999452439de2837f319ff8a2c8274c1112a5350754b0b9c8f7e" exitCode=0 Mar 09 14:16:33 crc kubenswrapper[4723]: I0309 14:16:33.607852 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb802a89-59e3-4b45-bb49-20b980e06a57","Type":"ContainerDied","Data":"f293aebdd6945999452439de2837f319ff8a2c8274c1112a5350754b0b9c8f7e"} Mar 09 14:16:33 crc kubenswrapper[4723]: I0309 14:16:33.612257 4723 generic.go:334] "Generic (PLEG): container finished" podID="8d18d891-99a0-4089-86df-166cfa4297a6" containerID="5e5f8e737f5804dbd716d4de3423a5591e0428c02cba590bfa21c25feb4c3ed8" exitCode=0 Mar 09 14:16:33 crc kubenswrapper[4723]: I0309 14:16:33.612298 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551096-jrfpb" event={"ID":"8d18d891-99a0-4089-86df-166cfa4297a6","Type":"ContainerDied","Data":"5e5f8e737f5804dbd716d4de3423a5591e0428c02cba590bfa21c25feb4c3ed8"} Mar 09 14:16:33 crc kubenswrapper[4723]: I0309 14:16:33.946934 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:16:33 crc kubenswrapper[4723]: I0309 14:16:33.947345 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:16:33 crc kubenswrapper[4723]: I0309 14:16:33.947400 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 14:16:33 crc kubenswrapper[4723]: I0309 14:16:33.948488 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:16:33 crc kubenswrapper[4723]: I0309 14:16:33.948561 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" gracePeriod=600 Mar 09 14:16:34 crc kubenswrapper[4723]: I0309 14:16:34.627221 4723 
generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" exitCode=0 Mar 09 14:16:34 crc kubenswrapper[4723]: I0309 14:16:34.627299 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186"} Mar 09 14:16:34 crc kubenswrapper[4723]: I0309 14:16:34.627899 4723 scope.go:117] "RemoveContainer" containerID="9ce6a9fca22c7136a89fb9bc0304454b543a23b1215c6e803804804ed52d6cff" Mar 09 14:16:34 crc kubenswrapper[4723]: E0309 14:16:34.893178 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:16:35 crc kubenswrapper[4723]: I0309 14:16:35.638557 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb802a89-59e3-4b45-bb49-20b980e06a57","Type":"ContainerStarted","Data":"5ef4824efbabaedf65bd9e774ad402e20979eee8902568ffb1c124e7def5d897"} Mar 09 14:16:35 crc kubenswrapper[4723]: I0309 14:16:35.643283 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:16:35 crc kubenswrapper[4723]: E0309 14:16:35.643757 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:16:35 crc kubenswrapper[4723]: I0309 14:16:35.817081 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-jrfpb" Mar 09 14:16:35 crc kubenswrapper[4723]: I0309 14:16:35.975793 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2s7c\" (UniqueName: \"kubernetes.io/projected/8d18d891-99a0-4089-86df-166cfa4297a6-kube-api-access-c2s7c\") pod \"8d18d891-99a0-4089-86df-166cfa4297a6\" (UID: \"8d18d891-99a0-4089-86df-166cfa4297a6\") " Mar 09 14:16:36 crc kubenswrapper[4723]: I0309 14:16:35.999321 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d18d891-99a0-4089-86df-166cfa4297a6-kube-api-access-c2s7c" (OuterVolumeSpecName: "kube-api-access-c2s7c") pod "8d18d891-99a0-4089-86df-166cfa4297a6" (UID: "8d18d891-99a0-4089-86df-166cfa4297a6"). InnerVolumeSpecName "kube-api-access-c2s7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:16:36 crc kubenswrapper[4723]: I0309 14:16:36.081840 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2s7c\" (UniqueName: \"kubernetes.io/projected/8d18d891-99a0-4089-86df-166cfa4297a6-kube-api-access-c2s7c\") on node \"crc\" DevicePath \"\"" Mar 09 14:16:36 crc kubenswrapper[4723]: I0309 14:16:36.656922 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551096-jrfpb" event={"ID":"8d18d891-99a0-4089-86df-166cfa4297a6","Type":"ContainerDied","Data":"bbaf034a6ed4c133270b8efc110ded4d848118848c6e214b53358fa7576e49af"} Mar 09 14:16:36 crc kubenswrapper[4723]: I0309 14:16:36.656962 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-jrfpb" Mar 09 14:16:36 crc kubenswrapper[4723]: I0309 14:16:36.664135 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbaf034a6ed4c133270b8efc110ded4d848118848c6e214b53358fa7576e49af" Mar 09 14:16:37 crc kubenswrapper[4723]: I0309 14:16:37.035349 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-vnvqs"] Mar 09 14:16:37 crc kubenswrapper[4723]: I0309 14:16:37.050352 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-vnvqs"] Mar 09 14:16:38 crc kubenswrapper[4723]: I0309 14:16:38.006328 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-774cb675cc-hwvwx" Mar 09 14:16:38 crc kubenswrapper[4723]: I0309 14:16:38.026692 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 09 14:16:38 crc kubenswrapper[4723]: I0309 14:16:38.522634 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-2nqwq" Mar 09 14:16:38 crc kubenswrapper[4723]: I0309 14:16:38.965440 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02167125-6205-49c3-8c1e-00349c7020a1" path="/var/lib/kubelet/pods/02167125-6205-49c3-8c1e-00349c7020a1/volumes" Mar 09 14:16:39 crc kubenswrapper[4723]: I0309 14:16:39.694693 4723 generic.go:334] "Generic (PLEG): container finished" podID="6bec22b2-9167-4c92-837d-167bd5046273" containerID="bcbfcf53e3ff3b61843eb983b6f95a7a8600b1ba099999aeddeb6e1ef8d8e9f0" exitCode=0 Mar 09 14:16:39 crc kubenswrapper[4723]: I0309 14:16:39.694758 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs5ch" event={"ID":"6bec22b2-9167-4c92-837d-167bd5046273","Type":"ContainerDied","Data":"bcbfcf53e3ff3b61843eb983b6f95a7a8600b1ba099999aeddeb6e1ef8d8e9f0"} Mar 09 14:16:40 crc kubenswrapper[4723]: I0309 14:16:40.149024 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 09 14:16:40 crc kubenswrapper[4723]: I0309 14:16:40.149133 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 09 14:16:40 crc kubenswrapper[4723]: I0309 14:16:40.285728 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 09 14:16:40 crc kubenswrapper[4723]: I0309 14:16:40.772893 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs5ch" 
event={"ID":"6bec22b2-9167-4c92-837d-167bd5046273","Type":"ContainerStarted","Data":"45c23ea50ed7225004d35a861c0b70ea2d20b1fb7413ee8865eb00a580c8e526"} Mar 09 14:16:40 crc kubenswrapper[4723]: I0309 14:16:40.836012 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bs5ch" podStartSLOduration=9.23895164 podStartE2EDuration="18.835993417s" podCreationTimestamp="2026-03-09 14:16:22 +0000 UTC" firstStartedPulling="2026-03-09 14:16:30.524014071 +0000 UTC m=+4664.538481601" lastFinishedPulling="2026-03-09 14:16:40.121055838 +0000 UTC m=+4674.135523378" observedRunningTime="2026-03-09 14:16:40.816653001 +0000 UTC m=+4674.831120541" watchObservedRunningTime="2026-03-09 14:16:40.835993417 +0000 UTC m=+4674.850460957" Mar 09 14:16:41 crc kubenswrapper[4723]: I0309 14:16:41.976138 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 09 14:16:43 crc kubenswrapper[4723]: I0309 14:16:43.096452 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 09 14:16:44 crc kubenswrapper[4723]: I0309 14:16:44.044988 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-997cfb689-c8857" Mar 09 14:16:44 crc kubenswrapper[4723]: I0309 14:16:44.458752 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:16:44 crc kubenswrapper[4723]: I0309 14:16:44.458811 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:16:45 crc kubenswrapper[4723]: I0309 14:16:45.512666 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bs5ch" podUID="6bec22b2-9167-4c92-837d-167bd5046273" containerName="registry-server" probeResult="failure" output=< Mar 09 14:16:45 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:16:45 crc kubenswrapper[4723]: > Mar 09 14:16:49 crc kubenswrapper[4723]: I0309 14:16:49.882137 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:16:49 crc kubenswrapper[4723]: E0309 14:16:49.884252 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:16:55 crc kubenswrapper[4723]: I0309 14:16:55.525236 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bs5ch" podUID="6bec22b2-9167-4c92-837d-167bd5046273" containerName="registry-server" probeResult="failure" output=< Mar 09 14:16:55 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:16:55 crc kubenswrapper[4723]: > Mar 09 14:17:00 crc kubenswrapper[4723]: I0309 14:17:00.881436 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:17:00 crc kubenswrapper[4723]: E0309 14:17:00.882263 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:17:05 crc kubenswrapper[4723]: I0309 14:17:05.523116 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bs5ch" podUID="6bec22b2-9167-4c92-837d-167bd5046273" containerName="registry-server" probeResult="failure" output=< Mar 09 14:17:05 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:17:05 crc kubenswrapper[4723]: > Mar 09 14:17:12 crc kubenswrapper[4723]: I0309 14:17:12.881105 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:17:12 crc kubenswrapper[4723]: E0309 14:17:12.882147 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:17:15 crc kubenswrapper[4723]: I0309 14:17:15.837690 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bs5ch" podUID="6bec22b2-9167-4c92-837d-167bd5046273" containerName="registry-server" probeResult="failure" output=< Mar 09 14:17:15 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:17:15 crc kubenswrapper[4723]: > Mar 09 14:17:24 crc kubenswrapper[4723]: I0309 14:17:24.514982 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:17:24 crc kubenswrapper[4723]: I0309 14:17:24.567823 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:17:24 crc kubenswrapper[4723]: I0309 14:17:24.760340 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bs5ch"] Mar 09 14:17:25 crc kubenswrapper[4723]: I0309 14:17:25.881363 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:17:25 crc kubenswrapper[4723]: E0309 14:17:25.881958 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:17:26 crc kubenswrapper[4723]: I0309 14:17:26.256979 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bs5ch" podUID="6bec22b2-9167-4c92-837d-167bd5046273" containerName="registry-server" containerID="cri-o://45c23ea50ed7225004d35a861c0b70ea2d20b1fb7413ee8865eb00a580c8e526" gracePeriod=2 Mar 09 14:17:27 crc kubenswrapper[4723]: I0309 14:17:27.265178 4723 generic.go:334] "Generic (PLEG): container finished" 
podID="6bec22b2-9167-4c92-837d-167bd5046273" containerID="45c23ea50ed7225004d35a861c0b70ea2d20b1fb7413ee8865eb00a580c8e526" exitCode=0 Mar 09 14:17:27 crc kubenswrapper[4723]: I0309 14:17:27.265250 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs5ch" event={"ID":"6bec22b2-9167-4c92-837d-167bd5046273","Type":"ContainerDied","Data":"45c23ea50ed7225004d35a861c0b70ea2d20b1fb7413ee8865eb00a580c8e526"} Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.111111 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.172502 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxwm4\" (UniqueName: \"kubernetes.io/projected/6bec22b2-9167-4c92-837d-167bd5046273-kube-api-access-cxwm4\") pod \"6bec22b2-9167-4c92-837d-167bd5046273\" (UID: \"6bec22b2-9167-4c92-837d-167bd5046273\") " Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.172614 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bec22b2-9167-4c92-837d-167bd5046273-catalog-content\") pod \"6bec22b2-9167-4c92-837d-167bd5046273\" (UID: \"6bec22b2-9167-4c92-837d-167bd5046273\") " Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.172986 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bec22b2-9167-4c92-837d-167bd5046273-utilities\") pod \"6bec22b2-9167-4c92-837d-167bd5046273\" (UID: \"6bec22b2-9167-4c92-837d-167bd5046273\") " Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.178714 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bec22b2-9167-4c92-837d-167bd5046273-utilities" (OuterVolumeSpecName: "utilities") pod "6bec22b2-9167-4c92-837d-167bd5046273" (UID: "6bec22b2-9167-4c92-837d-167bd5046273"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.276996 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bec22b2-9167-4c92-837d-167bd5046273-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.295290 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs5ch" event={"ID":"6bec22b2-9167-4c92-837d-167bd5046273","Type":"ContainerDied","Data":"593eadccb5822500188e0a6d67d0a380d7401241fb186a68c53850caa6e83dab"} Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.295350 4723 scope.go:117] "RemoveContainer" containerID="45c23ea50ed7225004d35a861c0b70ea2d20b1fb7413ee8865eb00a580c8e526" Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.295586 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs5ch" Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.323610 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bec22b2-9167-4c92-837d-167bd5046273-kube-api-access-cxwm4" (OuterVolumeSpecName: "kube-api-access-cxwm4") pod "6bec22b2-9167-4c92-837d-167bd5046273" (UID: "6bec22b2-9167-4c92-837d-167bd5046273"). InnerVolumeSpecName "kube-api-access-cxwm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.381807 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxwm4\" (UniqueName: \"kubernetes.io/projected/6bec22b2-9167-4c92-837d-167bd5046273-kube-api-access-cxwm4\") on node \"crc\" DevicePath \"\"" Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.462563 4723 scope.go:117] "RemoveContainer" containerID="bcbfcf53e3ff3b61843eb983b6f95a7a8600b1ba099999aeddeb6e1ef8d8e9f0" Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.515354 4723 scope.go:117] "RemoveContainer" containerID="18b4d85dc2538df6f2fc12d82c445d244071aad058358d51786f21a032e68830" Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.547485 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bec22b2-9167-4c92-837d-167bd5046273-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bec22b2-9167-4c92-837d-167bd5046273" (UID: "6bec22b2-9167-4c92-837d-167bd5046273"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.586290 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bec22b2-9167-4c92-837d-167bd5046273-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.655602 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bs5ch"] Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.655659 4723 scope.go:117] "RemoveContainer" containerID="a37d2094007d5dc0206e5b2e394ad7dd8904b61f44210734f2e38eb32e30ed0d" Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.669557 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bs5ch"] Mar 09 14:17:28 crc kubenswrapper[4723]: I0309 14:17:28.905557 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bec22b2-9167-4c92-837d-167bd5046273" path="/var/lib/kubelet/pods/6bec22b2-9167-4c92-837d-167bd5046273/volumes" Mar 09 14:17:38 crc kubenswrapper[4723]: I0309 14:17:38.881472 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:17:38 crc kubenswrapper[4723]: E0309 14:17:38.882763 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:17:43 crc kubenswrapper[4723]: I0309 14:17:43.454063 4723 generic.go:334] "Generic (PLEG): container finished" podID="ef1f6085-70f7-44a1-bf7c-5b4c90284dda" containerID="48a8eb6f0e13d1ee2d14659873efe909710f769f9b738e0085c2cd703c964116" exitCode=1 Mar 09 14:17:43 crc kubenswrapper[4723]: I0309 14:17:43.454218 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ef1f6085-70f7-44a1-bf7c-5b4c90284dda","Type":"ContainerDied","Data":"48a8eb6f0e13d1ee2d14659873efe909710f769f9b738e0085c2cd703c964116"} Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.922678 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.980216 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-openstack-config\") pod \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.980285 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-test-operator-ephemeral-workdir\") pod \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.980470 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-ca-certs\") pod \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.980548 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.980572 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbgbr\" (UniqueName: \"kubernetes.io/projected/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-kube-api-access-zbgbr\") pod \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.980652 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-openstack-config-secret\") pod \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.980743 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-test-operator-ephemeral-temporary\") pod \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.980782 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-ssh-key\") pod \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.981423 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-config-data\") pod \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\" (UID: \"ef1f6085-70f7-44a1-bf7c-5b4c90284dda\") " Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.982142 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-config-data" (OuterVolumeSpecName: "config-data") pod 
"ef1f6085-70f7-44a1-bf7c-5b4c90284dda" (UID: "ef1f6085-70f7-44a1-bf7c-5b4c90284dda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.982596 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "ef1f6085-70f7-44a1-bf7c-5b4c90284dda" (UID: "ef1f6085-70f7-44a1-bf7c-5b4c90284dda"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.985990 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "ef1f6085-70f7-44a1-bf7c-5b4c90284dda" (UID: "ef1f6085-70f7-44a1-bf7c-5b4c90284dda"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.986706 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "ef1f6085-70f7-44a1-bf7c-5b4c90284dda" (UID: "ef1f6085-70f7-44a1-bf7c-5b4c90284dda"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 14:17:44 crc kubenswrapper[4723]: I0309 14:17:44.987482 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-kube-api-access-zbgbr" (OuterVolumeSpecName: "kube-api-access-zbgbr") pod "ef1f6085-70f7-44a1-bf7c-5b4c90284dda" (UID: "ef1f6085-70f7-44a1-bf7c-5b4c90284dda"). InnerVolumeSpecName "kube-api-access-zbgbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.025059 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "ef1f6085-70f7-44a1-bf7c-5b4c90284dda" (UID: "ef1f6085-70f7-44a1-bf7c-5b4c90284dda"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.026878 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ef1f6085-70f7-44a1-bf7c-5b4c90284dda" (UID: "ef1f6085-70f7-44a1-bf7c-5b4c90284dda"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.049221 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ef1f6085-70f7-44a1-bf7c-5b4c90284dda" (UID: "ef1f6085-70f7-44a1-bf7c-5b4c90284dda"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.056500 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ef1f6085-70f7-44a1-bf7c-5b4c90284dda" (UID: "ef1f6085-70f7-44a1-bf7c-5b4c90284dda"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.083801 4723 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.083869 4723 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.083886 4723 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.083899 4723 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.083911 4723 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.083922 4723 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.086014 4723 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.086038 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbgbr\" (UniqueName: \"kubernetes.io/projected/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-kube-api-access-zbgbr\") on node \"crc\" DevicePath \"\"" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.086051 4723 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ef1f6085-70f7-44a1-bf7c-5b4c90284dda-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.123469 4723 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.188615 4723 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.475210 4723 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"ef1f6085-70f7-44a1-bf7c-5b4c90284dda","Type":"ContainerDied","Data":"6e0084f9fa42ee4719ab6ccf98202cf5d53b860860419b6840ce7ff569491cf1"} Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.475262 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 14:17:45 crc kubenswrapper[4723]: I0309 14:17:45.475353 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e0084f9fa42ee4719ab6ccf98202cf5d53b860860419b6840ce7ff569491cf1" Mar 09 14:17:52 crc kubenswrapper[4723]: I0309 14:17:52.882210 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:17:52 crc kubenswrapper[4723]: E0309 14:17:52.883537 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.420277 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 09 14:17:55 crc kubenswrapper[4723]: E0309 14:17:55.425395 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1f6085-70f7-44a1-bf7c-5b4c90284dda" containerName="tempest-tests-tempest-tests-runner" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.425588 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1f6085-70f7-44a1-bf7c-5b4c90284dda" containerName="tempest-tests-tempest-tests-runner" Mar 09 14:17:55 crc kubenswrapper[4723]: E0309 14:17:55.425676 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d18d891-99a0-4089-86df-166cfa4297a6" containerName="oc" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.425737 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d18d891-99a0-4089-86df-166cfa4297a6" containerName="oc" Mar 09 14:17:55 crc kubenswrapper[4723]: E0309 14:17:55.425807 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bec22b2-9167-4c92-837d-167bd5046273" containerName="registry-server" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.425890 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bec22b2-9167-4c92-837d-167bd5046273" containerName="registry-server" Mar 09 14:17:55 crc kubenswrapper[4723]: E0309 14:17:55.425990 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bec22b2-9167-4c92-837d-167bd5046273" containerName="extract-content" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.426069 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bec22b2-9167-4c92-837d-167bd5046273" containerName="extract-content" Mar 09 14:17:55 crc kubenswrapper[4723]: E0309 14:17:55.426234 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bec22b2-9167-4c92-837d-167bd5046273" containerName="extract-utilities" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.426357 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bec22b2-9167-4c92-837d-167bd5046273" containerName="extract-utilities" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.428018 4723 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8d18d891-99a0-4089-86df-166cfa4297a6" containerName="oc" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.428181 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bec22b2-9167-4c92-837d-167bd5046273" containerName="registry-server" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.428298 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1f6085-70f7-44a1-bf7c-5b4c90284dda" containerName="tempest-tests-tempest-tests-runner" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.434410 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.445412 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-7zlc4" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.512335 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.544275 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"818affa4-b183-47a4-9697-6151845f58d7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.544796 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8cjh\" (UniqueName: \"kubernetes.io/projected/818affa4-b183-47a4-9697-6151845f58d7-kube-api-access-r8cjh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"818affa4-b183-47a4-9697-6151845f58d7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.646678 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8cjh\" (UniqueName: \"kubernetes.io/projected/818affa4-b183-47a4-9697-6151845f58d7-kube-api-access-r8cjh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"818affa4-b183-47a4-9697-6151845f58d7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.647040 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"818affa4-b183-47a4-9697-6151845f58d7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.648649 4723 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"818affa4-b183-47a4-9697-6151845f58d7\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.670971 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8cjh\" (UniqueName: 
\"kubernetes.io/projected/818affa4-b183-47a4-9697-6151845f58d7-kube-api-access-r8cjh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"818affa4-b183-47a4-9697-6151845f58d7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.681908 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"818affa4-b183-47a4-9697-6151845f58d7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:17:55 crc kubenswrapper[4723]: I0309 14:17:55.765399 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:17:56 crc kubenswrapper[4723]: I0309 14:17:56.286489 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 09 14:17:56 crc kubenswrapper[4723]: W0309 14:17:56.302778 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod818affa4_b183_47a4_9697_6151845f58d7.slice/crio-d4356868d3b952e4c233a8c528c1805fa42b7c0e6bd7ac7aa3f1ca54a227fa69 WatchSource:0}: Error finding container d4356868d3b952e4c233a8c528c1805fa42b7c0e6bd7ac7aa3f1ca54a227fa69: Status 404 returned error can't find the container with id d4356868d3b952e4c233a8c528c1805fa42b7c0e6bd7ac7aa3f1ca54a227fa69 Mar 09 14:17:56 crc kubenswrapper[4723]: I0309 14:17:56.601064 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"818affa4-b183-47a4-9697-6151845f58d7","Type":"ContainerStarted","Data":"d4356868d3b952e4c233a8c528c1805fa42b7c0e6bd7ac7aa3f1ca54a227fa69"} Mar 09 14:17:58 crc kubenswrapper[4723]: I0309 14:17:58.628031 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"818affa4-b183-47a4-9697-6151845f58d7","Type":"ContainerStarted","Data":"f53b3911d984ee6bc7893c03f8c34a8f6ac0399adaf04ee643acafac1913ce89"} Mar 09 14:17:58 crc kubenswrapper[4723]: I0309 14:17:58.651491 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.447976176 podStartE2EDuration="3.651467821s" podCreationTimestamp="2026-03-09 14:17:55 +0000 UTC" firstStartedPulling="2026-03-09 14:17:56.310448689 +0000 UTC m=+4750.324916229" lastFinishedPulling="2026-03-09 14:17:57.513940334 +0000 UTC m=+4751.528407874" observedRunningTime="2026-03-09 14:17:58.646283923 +0000 UTC m=+4752.660751463" watchObservedRunningTime="2026-03-09 14:17:58.651467821 +0000 UTC m=+4752.665935371" Mar 09 14:18:00 crc kubenswrapper[4723]: I0309 14:18:00.161723 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551098-zgddk"] Mar 09 14:18:00 crc kubenswrapper[4723]: I0309 14:18:00.165441 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-zgddk" Mar 09 14:18:00 crc kubenswrapper[4723]: I0309 14:18:00.167770 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 14:18:00 crc kubenswrapper[4723]: I0309 14:18:00.167944 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:18:00 crc kubenswrapper[4723]: I0309 14:18:00.168283 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:18:00 crc kubenswrapper[4723]: I0309 14:18:00.177341 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-zgddk"] Mar 09 14:18:00 crc kubenswrapper[4723]: I0309 14:18:00.272052 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mspqp\" (UniqueName: \"kubernetes.io/projected/1696b106-9b5a-4120-8947-8e7fee59e6a3-kube-api-access-mspqp\") pod \"auto-csr-approver-29551098-zgddk\" (UID: \"1696b106-9b5a-4120-8947-8e7fee59e6a3\") " pod="openshift-infra/auto-csr-approver-29551098-zgddk" Mar 09 14:18:00 crc kubenswrapper[4723]: I0309 14:18:00.374375 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mspqp\" (UniqueName: \"kubernetes.io/projected/1696b106-9b5a-4120-8947-8e7fee59e6a3-kube-api-access-mspqp\") pod \"auto-csr-approver-29551098-zgddk\" (UID: \"1696b106-9b5a-4120-8947-8e7fee59e6a3\") " pod="openshift-infra/auto-csr-approver-29551098-zgddk" Mar 09 14:18:00 crc kubenswrapper[4723]: I0309 14:18:00.393230 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mspqp\" (UniqueName: \"kubernetes.io/projected/1696b106-9b5a-4120-8947-8e7fee59e6a3-kube-api-access-mspqp\") pod \"auto-csr-approver-29551098-zgddk\" (UID: \"1696b106-9b5a-4120-8947-8e7fee59e6a3\") " pod="openshift-infra/auto-csr-approver-29551098-zgddk" Mar 09 14:18:00 crc kubenswrapper[4723]: I0309 14:18:00.501832 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-zgddk" Mar 09 14:18:00 crc kubenswrapper[4723]: I0309 14:18:00.942557 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c6v82"] Mar 09 14:18:00 crc kubenswrapper[4723]: I0309 14:18:00.947458 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:00 crc kubenswrapper[4723]: I0309 14:18:00.956643 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c6v82"] Mar 09 14:18:01 crc kubenswrapper[4723]: I0309 14:18:01.015566 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-zgddk"] Mar 09 14:18:01 crc kubenswrapper[4723]: I0309 14:18:01.092319 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5122a512-d2ba-4950-996c-38bba6a01b41-catalog-content\") pod \"certified-operators-c6v82\" (UID: \"5122a512-d2ba-4950-996c-38bba6a01b41\") " pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:01 crc kubenswrapper[4723]: I0309 14:18:01.092504 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5122a512-d2ba-4950-996c-38bba6a01b41-utilities\") pod \"certified-operators-c6v82\" (UID: \"5122a512-d2ba-4950-996c-38bba6a01b41\") " pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:01 crc kubenswrapper[4723]: I0309 14:18:01.092565 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whhgh\" (UniqueName: \"kubernetes.io/projected/5122a512-d2ba-4950-996c-38bba6a01b41-kube-api-access-whhgh\") pod \"certified-operators-c6v82\" (UID: \"5122a512-d2ba-4950-996c-38bba6a01b41\") " pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:01 crc kubenswrapper[4723]: I0309 14:18:01.195534 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5122a512-d2ba-4950-996c-38bba6a01b41-catalog-content\") pod \"certified-operators-c6v82\" (UID: \"5122a512-d2ba-4950-996c-38bba6a01b41\") " pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:01 crc kubenswrapper[4723]: I0309 14:18:01.195696 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5122a512-d2ba-4950-996c-38bba6a01b41-utilities\") pod \"certified-operators-c6v82\" (UID: \"5122a512-d2ba-4950-996c-38bba6a01b41\") " pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:01 crc kubenswrapper[4723]: I0309 14:18:01.195742 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whhgh\" (UniqueName: \"kubernetes.io/projected/5122a512-d2ba-4950-996c-38bba6a01b41-kube-api-access-whhgh\") pod \"certified-operators-c6v82\" (UID: \"5122a512-d2ba-4950-996c-38bba6a01b41\") " pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:01 crc kubenswrapper[4723]: I0309 14:18:01.196047 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5122a512-d2ba-4950-996c-38bba6a01b41-catalog-content\") pod \"certified-operators-c6v82\" (UID: \"5122a512-d2ba-4950-996c-38bba6a01b41\") " pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:01 crc kubenswrapper[4723]: I0309 14:18:01.196056 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5122a512-d2ba-4950-996c-38bba6a01b41-utilities\") pod \"certified-operators-c6v82\" (UID: 
\"5122a512-d2ba-4950-996c-38bba6a01b41\") " pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:01 crc kubenswrapper[4723]: I0309 14:18:01.216068 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whhgh\" (UniqueName: \"kubernetes.io/projected/5122a512-d2ba-4950-996c-38bba6a01b41-kube-api-access-whhgh\") pod \"certified-operators-c6v82\" (UID: \"5122a512-d2ba-4950-996c-38bba6a01b41\") " pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:01 crc kubenswrapper[4723]: I0309 14:18:01.275968 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:01 crc kubenswrapper[4723]: I0309 14:18:01.663820 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551098-zgddk" event={"ID":"1696b106-9b5a-4120-8947-8e7fee59e6a3","Type":"ContainerStarted","Data":"fdc15ed6178159babf2eff99dbd00c5d24f02eaf419c4ee697cf538ed9d2d656"} Mar 09 14:18:01 crc kubenswrapper[4723]: I0309 14:18:01.782494 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c6v82"] Mar 09 14:18:02 crc kubenswrapper[4723]: I0309 14:18:02.678119 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6v82" event={"ID":"5122a512-d2ba-4950-996c-38bba6a01b41","Type":"ContainerStarted","Data":"0f14c3e52eefadbb6a2fb8ef1c58e60c2104ccb69f2f8dae4f77aa18f6b60b27"} Mar 09 14:18:03 crc kubenswrapper[4723]: I0309 14:18:03.692471 4723 generic.go:334] "Generic (PLEG): container finished" podID="5122a512-d2ba-4950-996c-38bba6a01b41" containerID="ca18a68d6c8b5a86478cb47dc9c47a4e69186b03115de9b13999f06750a64bbb" exitCode=0 Mar 09 14:18:03 crc kubenswrapper[4723]: I0309 14:18:03.692516 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6v82" event={"ID":"5122a512-d2ba-4950-996c-38bba6a01b41","Type":"ContainerDied","Data":"ca18a68d6c8b5a86478cb47dc9c47a4e69186b03115de9b13999f06750a64bbb"} Mar 09 14:18:04 crc kubenswrapper[4723]: I0309 14:18:04.705848 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551098-zgddk" event={"ID":"1696b106-9b5a-4120-8947-8e7fee59e6a3","Type":"ContainerStarted","Data":"430c64e6b99785ff06f4ea38f01c21433b055364f12a885f8041ae9be7a3f00c"} Mar 09 14:18:04 crc kubenswrapper[4723]: I0309 14:18:04.721397 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551098-zgddk" podStartSLOduration=3.566901035 podStartE2EDuration="4.721373014s" podCreationTimestamp="2026-03-09 14:18:00 +0000 UTC" firstStartedPulling="2026-03-09 14:18:01.023345196 +0000 UTC m=+4755.037812736" lastFinishedPulling="2026-03-09 14:18:02.177817165 +0000 UTC m=+4756.192284715" observedRunningTime="2026-03-09 14:18:04.719750301 +0000 UTC m=+4758.734217841" watchObservedRunningTime="2026-03-09 14:18:04.721373014 +0000 UTC m=+4758.735840554" Mar 09 14:18:04 crc kubenswrapper[4723]: I0309 14:18:04.881641 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:18:04 crc kubenswrapper[4723]: E0309 14:18:04.882339 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:18:05 crc kubenswrapper[4723]: I0309 14:18:05.717588 4723 generic.go:334] "Generic (PLEG): container finished" podID="1696b106-9b5a-4120-8947-8e7fee59e6a3" containerID="430c64e6b99785ff06f4ea38f01c21433b055364f12a885f8041ae9be7a3f00c" exitCode=0 Mar 09 14:18:05 crc kubenswrapper[4723]: I0309 14:18:05.717634 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551098-zgddk" event={"ID":"1696b106-9b5a-4120-8947-8e7fee59e6a3","Type":"ContainerDied","Data":"430c64e6b99785ff06f4ea38f01c21433b055364f12a885f8041ae9be7a3f00c"} Mar 09 14:18:05 crc kubenswrapper[4723]: I0309 14:18:05.720093 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6v82" event={"ID":"5122a512-d2ba-4950-996c-38bba6a01b41","Type":"ContainerStarted","Data":"540c60f05a86c15480b94448f1bdc8b8c6831852157e05460626bf77cce770ba"} Mar 09 14:18:07 crc kubenswrapper[4723]: I0309 14:18:07.153545 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-zgddk" Mar 09 14:18:07 crc kubenswrapper[4723]: I0309 14:18:07.253135 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mspqp\" (UniqueName: \"kubernetes.io/projected/1696b106-9b5a-4120-8947-8e7fee59e6a3-kube-api-access-mspqp\") pod \"1696b106-9b5a-4120-8947-8e7fee59e6a3\" (UID: \"1696b106-9b5a-4120-8947-8e7fee59e6a3\") " Mar 09 14:18:07 crc kubenswrapper[4723]: I0309 14:18:07.281093 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1696b106-9b5a-4120-8947-8e7fee59e6a3-kube-api-access-mspqp" (OuterVolumeSpecName: "kube-api-access-mspqp") pod "1696b106-9b5a-4120-8947-8e7fee59e6a3" (UID: "1696b106-9b5a-4120-8947-8e7fee59e6a3"). InnerVolumeSpecName "kube-api-access-mspqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:18:07 crc kubenswrapper[4723]: I0309 14:18:07.356522 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mspqp\" (UniqueName: \"kubernetes.io/projected/1696b106-9b5a-4120-8947-8e7fee59e6a3-kube-api-access-mspqp\") on node \"crc\" DevicePath \"\"" Mar 09 14:18:07 crc kubenswrapper[4723]: I0309 14:18:07.745123 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551098-zgddk" event={"ID":"1696b106-9b5a-4120-8947-8e7fee59e6a3","Type":"ContainerDied","Data":"fdc15ed6178159babf2eff99dbd00c5d24f02eaf419c4ee697cf538ed9d2d656"} Mar 09 14:18:07 crc kubenswrapper[4723]: I0309 14:18:07.745185 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdc15ed6178159babf2eff99dbd00c5d24f02eaf419c4ee697cf538ed9d2d656" Mar 09 14:18:07 crc kubenswrapper[4723]: I0309 14:18:07.746488 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-zgddk" Mar 09 14:18:07 crc kubenswrapper[4723]: I0309 14:18:07.809666 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-rr82b"] Mar 09 14:18:07 crc kubenswrapper[4723]: I0309 14:18:07.822473 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-rr82b"] Mar 09 14:18:08 crc kubenswrapper[4723]: I0309 14:18:08.758618 4723 generic.go:334] "Generic (PLEG): container finished" podID="5122a512-d2ba-4950-996c-38bba6a01b41" containerID="540c60f05a86c15480b94448f1bdc8b8c6831852157e05460626bf77cce770ba" exitCode=0 Mar 09 14:18:08 crc kubenswrapper[4723]: I0309 14:18:08.758943 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6v82" event={"ID":"5122a512-d2ba-4950-996c-38bba6a01b41","Type":"ContainerDied","Data":"540c60f05a86c15480b94448f1bdc8b8c6831852157e05460626bf77cce770ba"} Mar 09 14:18:08 crc kubenswrapper[4723]: I0309 14:18:08.898988 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248" path="/var/lib/kubelet/pods/c7ab0b7e-14e2-499f-a1d8-cdcdc0a0c248/volumes" Mar 09 14:18:09 crc kubenswrapper[4723]: I0309 14:18:09.771888 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6v82" event={"ID":"5122a512-d2ba-4950-996c-38bba6a01b41","Type":"ContainerStarted","Data":"ff3809c0aac1d1d5a4c8fbd82d0a88209e26a82584fb1a1733c2e203db6bf611"} Mar 09 14:18:09 crc kubenswrapper[4723]: I0309 14:18:09.812978 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c6v82" podStartSLOduration=4.365706278 podStartE2EDuration="9.8129556s" podCreationTimestamp="2026-03-09 14:18:00 +0000 UTC" firstStartedPulling="2026-03-09 14:18:03.694853524 +0000 UTC m=+4757.709321064" lastFinishedPulling="2026-03-09 14:18:09.142102846 +0000 UTC m=+4763.156570386" observedRunningTime="2026-03-09 14:18:09.792589238 +0000 UTC m=+4763.807056788" watchObservedRunningTime="2026-03-09 14:18:09.8129556 +0000 UTC m=+4763.827423140" Mar 09 14:18:11 crc kubenswrapper[4723]: I0309 14:18:11.277485 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:11 crc kubenswrapper[4723]: I0309 14:18:11.278062 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:12 crc kubenswrapper[4723]: I0309 14:18:12.328190 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-c6v82" podUID="5122a512-d2ba-4950-996c-38bba6a01b41" containerName="registry-server" probeResult="failure" output=< Mar 09 14:18:12 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:18:12 crc kubenswrapper[4723]: > Mar 09 14:18:16 crc kubenswrapper[4723]: I0309 14:18:16.889652 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:18:16 crc kubenswrapper[4723]: E0309 14:18:16.890475 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:18:21 crc kubenswrapper[4723]: I0309 14:18:21.336106 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:21 crc kubenswrapper[4723]: I0309 14:18:21.394546 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:21 crc kubenswrapper[4723]: I0309 14:18:21.578506 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c6v82"] Mar 09 14:18:22 crc kubenswrapper[4723]: I0309 14:18:22.925894 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c6v82" podUID="5122a512-d2ba-4950-996c-38bba6a01b41" containerName="registry-server" containerID="cri-o://ff3809c0aac1d1d5a4c8fbd82d0a88209e26a82584fb1a1733c2e203db6bf611" gracePeriod=2 Mar 09 14:18:23 crc kubenswrapper[4723]: I0309 14:18:23.942656 4723 generic.go:334] "Generic (PLEG): container finished" podID="5122a512-d2ba-4950-996c-38bba6a01b41" containerID="ff3809c0aac1d1d5a4c8fbd82d0a88209e26a82584fb1a1733c2e203db6bf611" exitCode=0 Mar 09 14:18:23 crc kubenswrapper[4723]: I0309 14:18:23.942716 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6v82" event={"ID":"5122a512-d2ba-4950-996c-38bba6a01b41","Type":"ContainerDied","Data":"ff3809c0aac1d1d5a4c8fbd82d0a88209e26a82584fb1a1733c2e203db6bf611"} Mar 09 14:18:23 crc kubenswrapper[4723]: I0309 14:18:23.943194 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6v82" event={"ID":"5122a512-d2ba-4950-996c-38bba6a01b41","Type":"ContainerDied","Data":"0f14c3e52eefadbb6a2fb8ef1c58e60c2104ccb69f2f8dae4f77aa18f6b60b27"} Mar 09 14:18:23 crc kubenswrapper[4723]: I0309 14:18:23.943211 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f14c3e52eefadbb6a2fb8ef1c58e60c2104ccb69f2f8dae4f77aa18f6b60b27" Mar 09 14:18:24 crc kubenswrapper[4723]: I0309 14:18:24.168831 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:24 crc kubenswrapper[4723]: I0309 14:18:24.213024 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whhgh\" (UniqueName: \"kubernetes.io/projected/5122a512-d2ba-4950-996c-38bba6a01b41-kube-api-access-whhgh\") pod \"5122a512-d2ba-4950-996c-38bba6a01b41\" (UID: \"5122a512-d2ba-4950-996c-38bba6a01b41\") " Mar 09 14:18:24 crc kubenswrapper[4723]: I0309 14:18:24.213229 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5122a512-d2ba-4950-996c-38bba6a01b41-utilities\") pod \"5122a512-d2ba-4950-996c-38bba6a01b41\" (UID: \"5122a512-d2ba-4950-996c-38bba6a01b41\") " Mar 09 14:18:24 crc kubenswrapper[4723]: I0309 14:18:24.213470 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5122a512-d2ba-4950-996c-38bba6a01b41-catalog-content\") pod \"5122a512-d2ba-4950-996c-38bba6a01b41\" (UID: \"5122a512-d2ba-4950-996c-38bba6a01b41\") " Mar 09 14:18:24 crc kubenswrapper[4723]: I0309 14:18:24.214395 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5122a512-d2ba-4950-996c-38bba6a01b41-utilities" (OuterVolumeSpecName: "utilities") pod "5122a512-d2ba-4950-996c-38bba6a01b41" (UID: "5122a512-d2ba-4950-996c-38bba6a01b41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:18:24 crc kubenswrapper[4723]: I0309 14:18:24.219428 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5122a512-d2ba-4950-996c-38bba6a01b41-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:18:24 crc kubenswrapper[4723]: I0309 14:18:24.271050 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5122a512-d2ba-4950-996c-38bba6a01b41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5122a512-d2ba-4950-996c-38bba6a01b41" (UID: "5122a512-d2ba-4950-996c-38bba6a01b41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:18:24 crc kubenswrapper[4723]: I0309 14:18:24.293966 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5122a512-d2ba-4950-996c-38bba6a01b41-kube-api-access-whhgh" (OuterVolumeSpecName: "kube-api-access-whhgh") pod "5122a512-d2ba-4950-996c-38bba6a01b41" (UID: "5122a512-d2ba-4950-996c-38bba6a01b41"). InnerVolumeSpecName "kube-api-access-whhgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:18:24 crc kubenswrapper[4723]: I0309 14:18:24.321977 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whhgh\" (UniqueName: \"kubernetes.io/projected/5122a512-d2ba-4950-996c-38bba6a01b41-kube-api-access-whhgh\") on node \"crc\" DevicePath \"\"" Mar 09 14:18:24 crc kubenswrapper[4723]: I0309 14:18:24.322028 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5122a512-d2ba-4950-996c-38bba6a01b41-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:18:24 crc kubenswrapper[4723]: I0309 14:18:24.954108 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6v82" Mar 09 14:18:24 crc kubenswrapper[4723]: I0309 14:18:24.985547 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c6v82"] Mar 09 14:18:24 crc kubenswrapper[4723]: I0309 14:18:24.995626 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c6v82"] Mar 09 14:18:26 crc kubenswrapper[4723]: I0309 14:18:26.893066 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5122a512-d2ba-4950-996c-38bba6a01b41" path="/var/lib/kubelet/pods/5122a512-d2ba-4950-996c-38bba6a01b41/volumes" Mar 09 14:18:28 crc kubenswrapper[4723]: I0309 14:18:28.899618 4723 scope.go:117] "RemoveContainer" containerID="2635494cbf65a6a16a563448d3ec1a8c8b08ef035226bf16e28a69b6ee49f9c7" Mar 09 14:18:29 crc kubenswrapper[4723]: I0309 14:18:29.881385 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:18:29 crc kubenswrapper[4723]: E0309 14:18:29.882039 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:18:42 crc kubenswrapper[4723]: I0309 14:18:42.881579 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:18:42 crc kubenswrapper[4723]: E0309 14:18:42.882323 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.291738 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hv968/must-gather-57slw"] Mar 09 14:18:47 crc kubenswrapper[4723]: E0309 14:18:47.292972 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5122a512-d2ba-4950-996c-38bba6a01b41" containerName="registry-server" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.292991 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="5122a512-d2ba-4950-996c-38bba6a01b41" containerName="registry-server" Mar 09 14:18:47 crc kubenswrapper[4723]: E0309 14:18:47.293016 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1696b106-9b5a-4120-8947-8e7fee59e6a3" containerName="oc" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.293024 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="1696b106-9b5a-4120-8947-8e7fee59e6a3" containerName="oc" Mar 09 14:18:47 crc kubenswrapper[4723]: E0309 14:18:47.293050 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5122a512-d2ba-4950-996c-38bba6a01b41" containerName="extract-content" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.293058 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="5122a512-d2ba-4950-996c-38bba6a01b41" containerName="extract-content" Mar 09 14:18:47 crc 
kubenswrapper[4723]: E0309 14:18:47.293078 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5122a512-d2ba-4950-996c-38bba6a01b41" containerName="extract-utilities" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.293087 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="5122a512-d2ba-4950-996c-38bba6a01b41" containerName="extract-utilities" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.293357 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="5122a512-d2ba-4950-996c-38bba6a01b41" containerName="registry-server" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.293376 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="1696b106-9b5a-4120-8947-8e7fee59e6a3" containerName="oc" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.294948 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hv968/must-gather-57slw" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.296880 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hv968"/"default-dockercfg-h6m9w" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.297111 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hv968"/"kube-root-ca.crt" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.297756 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hv968"/"openshift-service-ca.crt" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.318357 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hv968/must-gather-57slw"] Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.400759 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d7fd055-fcc2-4221-9372-be1ffefd23da-must-gather-output\") pod \"must-gather-57slw\" (UID: \"9d7fd055-fcc2-4221-9372-be1ffefd23da\") " pod="openshift-must-gather-hv968/must-gather-57slw" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.400930 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26bvl\" (UniqueName: \"kubernetes.io/projected/9d7fd055-fcc2-4221-9372-be1ffefd23da-kube-api-access-26bvl\") pod \"must-gather-57slw\" (UID: \"9d7fd055-fcc2-4221-9372-be1ffefd23da\") " pod="openshift-must-gather-hv968/must-gather-57slw" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.502919 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d7fd055-fcc2-4221-9372-be1ffefd23da-must-gather-output\") pod \"must-gather-57slw\" (UID: \"9d7fd055-fcc2-4221-9372-be1ffefd23da\") " pod="openshift-must-gather-hv968/must-gather-57slw" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.503039 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26bvl\" (UniqueName: \"kubernetes.io/projected/9d7fd055-fcc2-4221-9372-be1ffefd23da-kube-api-access-26bvl\") pod \"must-gather-57slw\" (UID: \"9d7fd055-fcc2-4221-9372-be1ffefd23da\") " pod="openshift-must-gather-hv968/must-gather-57slw" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.503366 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/9d7fd055-fcc2-4221-9372-be1ffefd23da-must-gather-output\") pod \"must-gather-57slw\" (UID: \"9d7fd055-fcc2-4221-9372-be1ffefd23da\") " pod="openshift-must-gather-hv968/must-gather-57slw" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.536875 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26bvl\" (UniqueName: \"kubernetes.io/projected/9d7fd055-fcc2-4221-9372-be1ffefd23da-kube-api-access-26bvl\") pod \"must-gather-57slw\" (UID: \"9d7fd055-fcc2-4221-9372-be1ffefd23da\") " pod="openshift-must-gather-hv968/must-gather-57slw" Mar 09 14:18:47 crc kubenswrapper[4723]: I0309 14:18:47.619791 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hv968/must-gather-57slw" Mar 09 14:18:48 crc kubenswrapper[4723]: I0309 14:18:48.317953 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hv968/must-gather-57slw"] Mar 09 14:18:48 crc kubenswrapper[4723]: I0309 14:18:48.983490 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-smfn8"] Mar 09 14:18:48 crc kubenswrapper[4723]: I0309 14:18:48.986265 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:18:48 crc kubenswrapper[4723]: I0309 14:18:48.996283 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smfn8"] Mar 09 14:18:49 crc kubenswrapper[4723]: I0309 14:18:49.149228 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjk2m\" (UniqueName: \"kubernetes.io/projected/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-kube-api-access-bjk2m\") pod \"community-operators-smfn8\" (UID: \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\") " pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:18:49 crc kubenswrapper[4723]: I0309 14:18:49.149402 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-catalog-content\") pod \"community-operators-smfn8\" (UID: \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\") " pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:18:49 crc kubenswrapper[4723]: I0309 14:18:49.149499 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-utilities\") pod \"community-operators-smfn8\" (UID: \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\") " pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:18:49 crc kubenswrapper[4723]: I0309 14:18:49.225825 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hv968/must-gather-57slw" event={"ID":"9d7fd055-fcc2-4221-9372-be1ffefd23da","Type":"ContainerStarted","Data":"99c54cced60b6b889c87d84af0c4dac4dd3a27e5cfb18c0de846fdace386adb8"} Mar 09 14:18:49 crc kubenswrapper[4723]: I0309 14:18:49.256441 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjk2m\" (UniqueName: \"kubernetes.io/projected/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-kube-api-access-bjk2m\") pod \"community-operators-smfn8\" (UID: \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\") " pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:18:49 crc kubenswrapper[4723]: I0309 14:18:49.256566 4723 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-catalog-content\") pod \"community-operators-smfn8\" (UID: \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\") " pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:18:49 crc kubenswrapper[4723]: I0309 14:18:49.256594 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-utilities\") pod \"community-operators-smfn8\" (UID: \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\") " pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:18:49 crc kubenswrapper[4723]: I0309 14:18:49.257395 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-catalog-content\") pod \"community-operators-smfn8\" (UID: \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\") " pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:18:49 crc kubenswrapper[4723]: I0309 14:18:49.259078 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-utilities\") pod \"community-operators-smfn8\" (UID: \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\") " pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:18:49 crc kubenswrapper[4723]: I0309 14:18:49.277133 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjk2m\" (UniqueName: \"kubernetes.io/projected/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-kube-api-access-bjk2m\") pod \"community-operators-smfn8\" (UID: \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\") " pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:18:49 crc kubenswrapper[4723]: I0309 14:18:49.323298 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:18:49 crc kubenswrapper[4723]: I0309 14:18:49.911041 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smfn8"] Mar 09 14:18:49 crc kubenswrapper[4723]: W0309 14:18:49.918387 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf61cb3f8_4165_4e3c_8290_6b85fee8a9f7.slice/crio-2380ae1d160f004f000ea5f7c524aa5d75dbb120d4ed302ddcc95bdabddda09b WatchSource:0}: Error finding container 2380ae1d160f004f000ea5f7c524aa5d75dbb120d4ed302ddcc95bdabddda09b: Status 404 returned error can't find the container with id 2380ae1d160f004f000ea5f7c524aa5d75dbb120d4ed302ddcc95bdabddda09b Mar 09 14:18:50 crc kubenswrapper[4723]: I0309 14:18:50.247118 4723 generic.go:334] "Generic (PLEG): container finished" podID="f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" containerID="5e3d0530878e62ce7ce5903029ff9097fc1b62b39089aae52e42f29288ac91d3" exitCode=0 Mar 09 14:18:50 crc kubenswrapper[4723]: I0309 14:18:50.247158 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smfn8" event={"ID":"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7","Type":"ContainerDied","Data":"5e3d0530878e62ce7ce5903029ff9097fc1b62b39089aae52e42f29288ac91d3"} Mar 09 14:18:50 crc kubenswrapper[4723]: I0309 14:18:50.247183 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smfn8" event={"ID":"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7","Type":"ContainerStarted","Data":"2380ae1d160f004f000ea5f7c524aa5d75dbb120d4ed302ddcc95bdabddda09b"} Mar 09 14:18:52 crc kubenswrapper[4723]: I0309 14:18:52.268912 4723 generic.go:334] "Generic (PLEG): container finished" podID="9a4a344f-6f96-422b-9468-56c8e988ad3f" containerID="1cf976ee2556a62cbe7223703cbe5dc18a2e0cef64591475186da4c9dd8de172" exitCode=0 Mar 09 14:18:52 crc kubenswrapper[4723]: I0309 14:18:52.269022 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" event={"ID":"9a4a344f-6f96-422b-9468-56c8e988ad3f","Type":"ContainerDied","Data":"1cf976ee2556a62cbe7223703cbe5dc18a2e0cef64591475186da4c9dd8de172"} Mar 09 14:18:56 crc kubenswrapper[4723]: I0309 14:18:56.896879 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:18:56 crc kubenswrapper[4723]: E0309 14:18:56.897647 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:18:57 crc kubenswrapper[4723]: I0309 14:18:57.324616 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hv968/must-gather-57slw" event={"ID":"9d7fd055-fcc2-4221-9372-be1ffefd23da","Type":"ContainerStarted","Data":"49b28102f51eebe4de9ef7ee1adef9902c1b031bf6ea59e78f4e26b08750ef23"} Mar 09 14:18:57 crc kubenswrapper[4723]: I0309 14:18:57.324978 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hv968/must-gather-57slw" 
event={"ID":"9d7fd055-fcc2-4221-9372-be1ffefd23da","Type":"ContainerStarted","Data":"f11a46a1d258172a43663da9f42ca6c9a46aaa8b4ce2076967865765cf8a987d"} Mar 09 14:18:57 crc kubenswrapper[4723]: I0309 14:18:57.327125 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" event={"ID":"9a4a344f-6f96-422b-9468-56c8e988ad3f","Type":"ContainerStarted","Data":"59c47a88dbc768c4aaa86061907eb8d96ce0a3afdb7acd1d5bdf27b1a621f274"} Mar 09 14:18:57 crc kubenswrapper[4723]: I0309 14:18:57.330333 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smfn8" event={"ID":"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7","Type":"ContainerStarted","Data":"5ee55f0f9d44cbddcb0cd6e6419be4e0e71e1ae6ca4e9f80f8accaa2ffe04669"} Mar 09 14:18:57 crc kubenswrapper[4723]: I0309 14:18:57.352145 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hv968/must-gather-57slw" podStartSLOduration=1.921092511 podStartE2EDuration="10.352123052s" podCreationTimestamp="2026-03-09 14:18:47 +0000 UTC" firstStartedPulling="2026-03-09 14:18:48.313959015 +0000 UTC m=+4802.328426555" lastFinishedPulling="2026-03-09 14:18:56.744989556 +0000 UTC m=+4810.759457096" observedRunningTime="2026-03-09 14:18:57.340633046 +0000 UTC m=+4811.355100606" watchObservedRunningTime="2026-03-09 14:18:57.352123052 +0000 UTC m=+4811.366590582" Mar 09 14:18:59 crc kubenswrapper[4723]: I0309 14:18:59.353174 4723 generic.go:334] "Generic (PLEG): container finished" podID="f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" containerID="5ee55f0f9d44cbddcb0cd6e6419be4e0e71e1ae6ca4e9f80f8accaa2ffe04669" exitCode=0 Mar 09 14:18:59 crc kubenswrapper[4723]: I0309 14:18:59.353279 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smfn8" event={"ID":"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7","Type":"ContainerDied","Data":"5ee55f0f9d44cbddcb0cd6e6419be4e0e71e1ae6ca4e9f80f8accaa2ffe04669"} Mar 09 14:19:00 crc kubenswrapper[4723]: I0309 14:19:00.367675 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smfn8" event={"ID":"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7","Type":"ContainerStarted","Data":"e8813c8bb55a605d30bbd153af04572a18b25daa6f01436d62d119ff65963043"} Mar 09 14:19:00 crc kubenswrapper[4723]: I0309 14:19:00.406846 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-smfn8" podStartSLOduration=2.895146638 podStartE2EDuration="12.406823209s" podCreationTimestamp="2026-03-09 14:18:48 +0000 UTC" firstStartedPulling="2026-03-09 14:18:50.250595684 +0000 UTC m=+4804.265063224" lastFinishedPulling="2026-03-09 14:18:59.762272255 +0000 UTC m=+4813.776739795" observedRunningTime="2026-03-09 14:19:00.391324736 +0000 UTC m=+4814.405792276" watchObservedRunningTime="2026-03-09 14:19:00.406823209 +0000 UTC m=+4814.421290749" Mar 09 14:19:03 crc kubenswrapper[4723]: I0309 14:19:03.715299 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hv968/crc-debug-bdnkn"] Mar 09 14:19:03 crc kubenswrapper[4723]: I0309 14:19:03.717735 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hv968/crc-debug-bdnkn" Mar 09 14:19:03 crc kubenswrapper[4723]: I0309 14:19:03.878405 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70240430-8cf3-4260-b1e5-c611041088ca-host\") pod \"crc-debug-bdnkn\" (UID: \"70240430-8cf3-4260-b1e5-c611041088ca\") " pod="openshift-must-gather-hv968/crc-debug-bdnkn" Mar 09 14:19:03 crc kubenswrapper[4723]: I0309 14:19:03.878586 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm76z\" (UniqueName: \"kubernetes.io/projected/70240430-8cf3-4260-b1e5-c611041088ca-kube-api-access-pm76z\") pod \"crc-debug-bdnkn\" (UID: \"70240430-8cf3-4260-b1e5-c611041088ca\") " pod="openshift-must-gather-hv968/crc-debug-bdnkn" Mar 09 14:19:03 crc kubenswrapper[4723]: I0309 14:19:03.980684 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70240430-8cf3-4260-b1e5-c611041088ca-host\") pod \"crc-debug-bdnkn\" (UID: \"70240430-8cf3-4260-b1e5-c611041088ca\") " pod="openshift-must-gather-hv968/crc-debug-bdnkn" Mar 09 14:19:03 crc kubenswrapper[4723]: I0309 14:19:03.980909 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm76z\" (UniqueName: \"kubernetes.io/projected/70240430-8cf3-4260-b1e5-c611041088ca-kube-api-access-pm76z\") pod \"crc-debug-bdnkn\" (UID: \"70240430-8cf3-4260-b1e5-c611041088ca\") " pod="openshift-must-gather-hv968/crc-debug-bdnkn" Mar 09 14:19:03 crc kubenswrapper[4723]: I0309 14:19:03.981766 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70240430-8cf3-4260-b1e5-c611041088ca-host\") pod \"crc-debug-bdnkn\" (UID: \"70240430-8cf3-4260-b1e5-c611041088ca\") " pod="openshift-must-gather-hv968/crc-debug-bdnkn" Mar 09 14:19:04 crc kubenswrapper[4723]: I0309 14:19:04.000976 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm76z\" (UniqueName: \"kubernetes.io/projected/70240430-8cf3-4260-b1e5-c611041088ca-kube-api-access-pm76z\") pod \"crc-debug-bdnkn\" (UID: \"70240430-8cf3-4260-b1e5-c611041088ca\") " pod="openshift-must-gather-hv968/crc-debug-bdnkn" Mar 09 14:19:04 crc kubenswrapper[4723]: I0309 14:19:04.044115 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hv968/crc-debug-bdnkn" Mar 09 14:19:04 crc kubenswrapper[4723]: W0309 14:19:04.089788 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70240430_8cf3_4260_b1e5_c611041088ca.slice/crio-cb46fecb5dc4e2b43610a065d6986b7f78826e85fe82caaadd0903f55e4b0a7d WatchSource:0}: Error finding container cb46fecb5dc4e2b43610a065d6986b7f78826e85fe82caaadd0903f55e4b0a7d: Status 404 returned error can't find the container with id cb46fecb5dc4e2b43610a065d6986b7f78826e85fe82caaadd0903f55e4b0a7d Mar 09 14:19:04 crc kubenswrapper[4723]: I0309 14:19:04.432464 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hv968/crc-debug-bdnkn" event={"ID":"70240430-8cf3-4260-b1e5-c611041088ca","Type":"ContainerStarted","Data":"cb46fecb5dc4e2b43610a065d6986b7f78826e85fe82caaadd0903f55e4b0a7d"} Mar 09 14:19:08 crc kubenswrapper[4723]: I0309 14:19:08.881729 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:19:08 crc kubenswrapper[4723]: E0309 14:19:08.882710 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:19:09 crc kubenswrapper[4723]: I0309 14:19:09.324119 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:19:09 crc kubenswrapper[4723]: I0309 14:19:09.324574 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:19:09 crc kubenswrapper[4723]: I0309 14:19:09.393225 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:19:09 crc kubenswrapper[4723]: I0309 14:19:09.555484 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:19:09 crc kubenswrapper[4723]: I0309 14:19:09.640882 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-smfn8"] Mar 09 14:19:10 crc kubenswrapper[4723]: I0309 14:19:10.373769 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 14:19:10 crc kubenswrapper[4723]: I0309 14:19:10.374176 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 14:19:11 crc kubenswrapper[4723]: I0309 14:19:11.521898 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-smfn8" podUID="f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" containerName="registry-server" containerID="cri-o://e8813c8bb55a605d30bbd153af04572a18b25daa6f01436d62d119ff65963043" gracePeriod=2 Mar 09 14:19:12 crc kubenswrapper[4723]: I0309 14:19:12.545306 4723 generic.go:334] "Generic (PLEG): container finished" podID="f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" containerID="e8813c8bb55a605d30bbd153af04572a18b25daa6f01436d62d119ff65963043" 
exitCode=0 Mar 09 14:19:12 crc kubenswrapper[4723]: I0309 14:19:12.545491 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smfn8" event={"ID":"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7","Type":"ContainerDied","Data":"e8813c8bb55a605d30bbd153af04572a18b25daa6f01436d62d119ff65963043"} Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.472351 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.487226 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjk2m\" (UniqueName: \"kubernetes.io/projected/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-kube-api-access-bjk2m\") pod \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\" (UID: \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\") " Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.487331 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-catalog-content\") pod \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\" (UID: \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\") " Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.487525 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-utilities\") pod \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\" (UID: \"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7\") " Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.488035 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-utilities" (OuterVolumeSpecName: "utilities") pod "f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" (UID: "f61cb3f8-4165-4e3c-8290-6b85fee8a9f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.488997 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.495323 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-kube-api-access-bjk2m" (OuterVolumeSpecName: "kube-api-access-bjk2m") pod "f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" (UID: "f61cb3f8-4165-4e3c-8290-6b85fee8a9f7"). InnerVolumeSpecName "kube-api-access-bjk2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.544703 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" (UID: "f61cb3f8-4165-4e3c-8290-6b85fee8a9f7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.592469 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjk2m\" (UniqueName: \"kubernetes.io/projected/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-kube-api-access-bjk2m\") on node \"crc\" DevicePath \"\"" Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.592503 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.614762 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smfn8" event={"ID":"f61cb3f8-4165-4e3c-8290-6b85fee8a9f7","Type":"ContainerDied","Data":"2380ae1d160f004f000ea5f7c524aa5d75dbb120d4ed302ddcc95bdabddda09b"} Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.614823 4723 scope.go:117] "RemoveContainer" containerID="e8813c8bb55a605d30bbd153af04572a18b25daa6f01436d62d119ff65963043" Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.615012 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smfn8" Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.630400 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hv968/crc-debug-bdnkn" event={"ID":"70240430-8cf3-4260-b1e5-c611041088ca","Type":"ContainerStarted","Data":"795541e761fac29b933322c3bbef5c92e0addb31b91d534e1c430afd4dda5cbb"} Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.686750 4723 scope.go:117] "RemoveContainer" containerID="5ee55f0f9d44cbddcb0cd6e6419be4e0e71e1ae6ca4e9f80f8accaa2ffe04669" Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.715151 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hv968/crc-debug-bdnkn" podStartSLOduration=1.763107124 podStartE2EDuration="12.715133532s" podCreationTimestamp="2026-03-09 14:19:03 +0000 UTC" firstStartedPulling="2026-03-09 14:19:04.092442195 +0000 UTC m=+4818.106909735" lastFinishedPulling="2026-03-09 14:19:15.044468603 +0000 UTC m=+4829.058936143" observedRunningTime="2026-03-09 14:19:15.672116166 +0000 UTC m=+4829.686583706" watchObservedRunningTime="2026-03-09 14:19:15.715133532 +0000 UTC m=+4829.729601072" Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.742144 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-smfn8"] Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.760637 4723 scope.go:117] "RemoveContainer" containerID="5e3d0530878e62ce7ce5903029ff9097fc1b62b39089aae52e42f29288ac91d3" Mar 09 14:19:15 crc kubenswrapper[4723]: I0309 14:19:15.770468 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-smfn8"] Mar 09 14:19:16 crc kubenswrapper[4723]: I0309 14:19:16.895989 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" path="/var/lib/kubelet/pods/f61cb3f8-4165-4e3c-8290-6b85fee8a9f7/volumes" Mar 09 14:19:23 crc kubenswrapper[4723]: I0309 14:19:23.882442 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:19:23 crc kubenswrapper[4723]: E0309 14:19:23.883411 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:19:30 crc kubenswrapper[4723]: I0309 14:19:30.380457 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 14:19:30 crc kubenswrapper[4723]: I0309 14:19:30.386322 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7f69f56458-z9f7c" Mar 09 14:19:38 crc kubenswrapper[4723]: I0309 14:19:38.882131 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:19:38 crc kubenswrapper[4723]: E0309 14:19:38.883009 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:19:49 crc kubenswrapper[4723]: I0309 14:19:49.881280 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:19:49 crc kubenswrapper[4723]: E0309 14:19:49.882175 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:20:00 crc kubenswrapper[4723]: I0309 14:20:00.180203 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551100-kdnwn"] Mar 09 14:20:00 crc kubenswrapper[4723]: E0309 14:20:00.181114 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" containerName="registry-server" Mar 09 14:20:00 crc kubenswrapper[4723]: I0309 14:20:00.181127 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" containerName="registry-server" Mar 09 14:20:00 crc kubenswrapper[4723]: E0309 14:20:00.181150 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" containerName="extract-content" Mar 09 14:20:00 crc kubenswrapper[4723]: I0309 14:20:00.181156 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" containerName="extract-content" Mar 09 14:20:00 crc kubenswrapper[4723]: E0309 14:20:00.181188 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" containerName="extract-utilities" Mar 09 14:20:00 crc kubenswrapper[4723]: I0309 14:20:00.181194 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" containerName="extract-utilities" Mar 09 14:20:00 crc kubenswrapper[4723]: I0309 14:20:00.181410 4723 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f61cb3f8-4165-4e3c-8290-6b85fee8a9f7" containerName="registry-server" Mar 09 14:20:00 crc kubenswrapper[4723]: I0309 14:20:00.182173 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-kdnwn" Mar 09 14:20:00 crc kubenswrapper[4723]: I0309 14:20:00.189662 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:20:00 crc kubenswrapper[4723]: I0309 14:20:00.189943 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:20:00 crc kubenswrapper[4723]: I0309 14:20:00.190336 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 14:20:00 crc kubenswrapper[4723]: I0309 14:20:00.208168 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-kdnwn"] Mar 09 14:20:00 crc kubenswrapper[4723]: I0309 14:20:00.253775 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25fwx\" (UniqueName: \"kubernetes.io/projected/d85affa1-1885-445d-8a82-9f92136ae7f5-kube-api-access-25fwx\") pod \"auto-csr-approver-29551100-kdnwn\" (UID: \"d85affa1-1885-445d-8a82-9f92136ae7f5\") " pod="openshift-infra/auto-csr-approver-29551100-kdnwn" Mar 09 14:20:00 crc kubenswrapper[4723]: I0309 14:20:00.355511 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25fwx\" (UniqueName: \"kubernetes.io/projected/d85affa1-1885-445d-8a82-9f92136ae7f5-kube-api-access-25fwx\") pod \"auto-csr-approver-29551100-kdnwn\" (UID: \"d85affa1-1885-445d-8a82-9f92136ae7f5\") " pod="openshift-infra/auto-csr-approver-29551100-kdnwn" Mar 09 14:20:00 crc kubenswrapper[4723]: I0309 14:20:00.379324 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25fwx\" (UniqueName: \"kubernetes.io/projected/d85affa1-1885-445d-8a82-9f92136ae7f5-kube-api-access-25fwx\") pod \"auto-csr-approver-29551100-kdnwn\" (UID: \"d85affa1-1885-445d-8a82-9f92136ae7f5\") " pod="openshift-infra/auto-csr-approver-29551100-kdnwn" Mar 09 14:20:00 crc kubenswrapper[4723]: I0309 14:20:00.512567 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-kdnwn" Mar 09 14:20:01 crc kubenswrapper[4723]: I0309 14:20:01.047802 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-kdnwn"] Mar 09 14:20:01 crc kubenswrapper[4723]: I0309 14:20:01.114602 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551100-kdnwn" event={"ID":"d85affa1-1885-445d-8a82-9f92136ae7f5","Type":"ContainerStarted","Data":"32311bce863d657a7d3fc9f4ad93a98e7ae2a96f6cce4e1962441b48a7c3e839"} Mar 09 14:20:03 crc kubenswrapper[4723]: I0309 14:20:03.154634 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551100-kdnwn" event={"ID":"d85affa1-1885-445d-8a82-9f92136ae7f5","Type":"ContainerStarted","Data":"c9961d6fad91d1e5bbdb39fc69602c78a076f16d96afd54dd67e4d34b4823f47"} Mar 09 14:20:03 crc kubenswrapper[4723]: I0309 14:20:03.176730 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551100-kdnwn" podStartSLOduration=1.652697212 podStartE2EDuration="3.176712127s" podCreationTimestamp="2026-03-09 14:20:00 +0000 UTC" firstStartedPulling="2026-03-09 14:20:01.047451116 +0000 UTC m=+4875.061918656" lastFinishedPulling="2026-03-09 14:20:02.571466031 +0000 UTC m=+4876.585933571" observedRunningTime="2026-03-09 14:20:03.174305733 +0000 UTC m=+4877.188773273" watchObservedRunningTime="2026-03-09 14:20:03.176712127 +0000 UTC m=+4877.191179667" Mar 09 14:20:04 crc kubenswrapper[4723]: I0309 14:20:04.166408 4723 generic.go:334] "Generic (PLEG): container finished" podID="70240430-8cf3-4260-b1e5-c611041088ca" containerID="795541e761fac29b933322c3bbef5c92e0addb31b91d534e1c430afd4dda5cbb" exitCode=0 Mar 09 14:20:04 crc kubenswrapper[4723]: I0309 14:20:04.166464 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hv968/crc-debug-bdnkn" event={"ID":"70240430-8cf3-4260-b1e5-c611041088ca","Type":"ContainerDied","Data":"795541e761fac29b933322c3bbef5c92e0addb31b91d534e1c430afd4dda5cbb"} Mar 09 14:20:04 crc kubenswrapper[4723]: I0309 14:20:04.882537 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:20:04 crc kubenswrapper[4723]: E0309 14:20:04.882962 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:20:05 crc kubenswrapper[4723]: I0309 14:20:05.177626 4723 generic.go:334] "Generic (PLEG): container finished" podID="d85affa1-1885-445d-8a82-9f92136ae7f5" containerID="c9961d6fad91d1e5bbdb39fc69602c78a076f16d96afd54dd67e4d34b4823f47" exitCode=0 Mar 09 14:20:05 crc kubenswrapper[4723]: I0309 14:20:05.177715 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551100-kdnwn" event={"ID":"d85affa1-1885-445d-8a82-9f92136ae7f5","Type":"ContainerDied","Data":"c9961d6fad91d1e5bbdb39fc69602c78a076f16d96afd54dd67e4d34b4823f47"} Mar 09 14:20:05 crc kubenswrapper[4723]: I0309 14:20:05.327235 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hv968/crc-debug-bdnkn" Mar 09 14:20:05 crc kubenswrapper[4723]: I0309 14:20:05.372799 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hv968/crc-debug-bdnkn"] Mar 09 14:20:05 crc kubenswrapper[4723]: I0309 14:20:05.385338 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hv968/crc-debug-bdnkn"] Mar 09 14:20:05 crc kubenswrapper[4723]: I0309 14:20:05.491190 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm76z\" (UniqueName: \"kubernetes.io/projected/70240430-8cf3-4260-b1e5-c611041088ca-kube-api-access-pm76z\") pod \"70240430-8cf3-4260-b1e5-c611041088ca\" (UID: \"70240430-8cf3-4260-b1e5-c611041088ca\") " Mar 09 14:20:05 crc kubenswrapper[4723]: I0309 14:20:05.492288 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70240430-8cf3-4260-b1e5-c611041088ca-host\") pod \"70240430-8cf3-4260-b1e5-c611041088ca\" (UID: \"70240430-8cf3-4260-b1e5-c611041088ca\") " Mar 09 14:20:05 crc kubenswrapper[4723]: I0309 14:20:05.492381 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70240430-8cf3-4260-b1e5-c611041088ca-host" (OuterVolumeSpecName: "host") pod "70240430-8cf3-4260-b1e5-c611041088ca" (UID: "70240430-8cf3-4260-b1e5-c611041088ca"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:20:05 crc kubenswrapper[4723]: I0309 14:20:05.494785 4723 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70240430-8cf3-4260-b1e5-c611041088ca-host\") on node \"crc\" DevicePath \"\"" Mar 09 14:20:05 crc kubenswrapper[4723]: I0309 14:20:05.496605 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70240430-8cf3-4260-b1e5-c611041088ca-kube-api-access-pm76z" (OuterVolumeSpecName: "kube-api-access-pm76z") pod "70240430-8cf3-4260-b1e5-c611041088ca" (UID: "70240430-8cf3-4260-b1e5-c611041088ca"). InnerVolumeSpecName "kube-api-access-pm76z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:20:05 crc kubenswrapper[4723]: I0309 14:20:05.599181 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm76z\" (UniqueName: \"kubernetes.io/projected/70240430-8cf3-4260-b1e5-c611041088ca-kube-api-access-pm76z\") on node \"crc\" DevicePath \"\"" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.189744 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb46fecb5dc4e2b43610a065d6986b7f78826e85fe82caaadd0903f55e4b0a7d" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.189773 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hv968/crc-debug-bdnkn" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.657775 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hv968/crc-debug-8xxq6"] Mar 09 14:20:06 crc kubenswrapper[4723]: E0309 14:20:06.659013 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70240430-8cf3-4260-b1e5-c611041088ca" containerName="container-00" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.659035 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="70240430-8cf3-4260-b1e5-c611041088ca" containerName="container-00" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.659324 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="70240430-8cf3-4260-b1e5-c611041088ca" containerName="container-00" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.660261 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hv968/crc-debug-8xxq6" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.671441 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-kdnwn" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.837359 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25fwx\" (UniqueName: \"kubernetes.io/projected/d85affa1-1885-445d-8a82-9f92136ae7f5-kube-api-access-25fwx\") pod \"d85affa1-1885-445d-8a82-9f92136ae7f5\" (UID: \"d85affa1-1885-445d-8a82-9f92136ae7f5\") " Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.838044 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74d9b1fa-f4d3-4b9d-8fef-e433d694a891-host\") pod \"crc-debug-8xxq6\" (UID: \"74d9b1fa-f4d3-4b9d-8fef-e433d694a891\") " pod="openshift-must-gather-hv968/crc-debug-8xxq6" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.838099 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czf72\" (UniqueName: \"kubernetes.io/projected/74d9b1fa-f4d3-4b9d-8fef-e433d694a891-kube-api-access-czf72\") pod \"crc-debug-8xxq6\" (UID: \"74d9b1fa-f4d3-4b9d-8fef-e433d694a891\") " pod="openshift-must-gather-hv968/crc-debug-8xxq6" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.843782 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d85affa1-1885-445d-8a82-9f92136ae7f5-kube-api-access-25fwx" (OuterVolumeSpecName: "kube-api-access-25fwx") pod "d85affa1-1885-445d-8a82-9f92136ae7f5" (UID: "d85affa1-1885-445d-8a82-9f92136ae7f5"). InnerVolumeSpecName "kube-api-access-25fwx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.894527 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70240430-8cf3-4260-b1e5-c611041088ca" path="/var/lib/kubelet/pods/70240430-8cf3-4260-b1e5-c611041088ca/volumes" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.941164 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74d9b1fa-f4d3-4b9d-8fef-e433d694a891-host\") pod \"crc-debug-8xxq6\" (UID: \"74d9b1fa-f4d3-4b9d-8fef-e433d694a891\") " pod="openshift-must-gather-hv968/crc-debug-8xxq6" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.941219 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czf72\" (UniqueName: \"kubernetes.io/projected/74d9b1fa-f4d3-4b9d-8fef-e433d694a891-kube-api-access-czf72\") pod \"crc-debug-8xxq6\" (UID: \"74d9b1fa-f4d3-4b9d-8fef-e433d694a891\") " pod="openshift-must-gather-hv968/crc-debug-8xxq6" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.941484 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25fwx\" (UniqueName: \"kubernetes.io/projected/d85affa1-1885-445d-8a82-9f92136ae7f5-kube-api-access-25fwx\") on node \"crc\" DevicePath \"\"" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.941758 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74d9b1fa-f4d3-4b9d-8fef-e433d694a891-host\") pod \"crc-debug-8xxq6\" (UID: \"74d9b1fa-f4d3-4b9d-8fef-e433d694a891\") " pod="openshift-must-gather-hv968/crc-debug-8xxq6" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.959367 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czf72\" (UniqueName: \"kubernetes.io/projected/74d9b1fa-f4d3-4b9d-8fef-e433d694a891-kube-api-access-czf72\") pod \"crc-debug-8xxq6\" (UID: \"74d9b1fa-f4d3-4b9d-8fef-e433d694a891\") " pod="openshift-must-gather-hv968/crc-debug-8xxq6" Mar 09 14:20:06 crc kubenswrapper[4723]: I0309 14:20:06.991986 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hv968/crc-debug-8xxq6" Mar 09 14:20:07 crc kubenswrapper[4723]: W0309 14:20:07.022662 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74d9b1fa_f4d3_4b9d_8fef_e433d694a891.slice/crio-600802e8ec13ef470dfafe38ae409eee60ff7b9b39341920725470bbcdeb26b5 WatchSource:0}: Error finding container 600802e8ec13ef470dfafe38ae409eee60ff7b9b39341920725470bbcdeb26b5: Status 404 returned error can't find the container with id 600802e8ec13ef470dfafe38ae409eee60ff7b9b39341920725470bbcdeb26b5 Mar 09 14:20:07 crc kubenswrapper[4723]: I0309 14:20:07.201097 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551100-kdnwn" event={"ID":"d85affa1-1885-445d-8a82-9f92136ae7f5","Type":"ContainerDied","Data":"32311bce863d657a7d3fc9f4ad93a98e7ae2a96f6cce4e1962441b48a7c3e839"} Mar 09 14:20:07 crc kubenswrapper[4723]: I0309 14:20:07.201431 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32311bce863d657a7d3fc9f4ad93a98e7ae2a96f6cce4e1962441b48a7c3e839" Mar 09 14:20:07 crc kubenswrapper[4723]: I0309 14:20:07.201138 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-kdnwn" Mar 09 14:20:07 crc kubenswrapper[4723]: I0309 14:20:07.203564 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hv968/crc-debug-8xxq6" event={"ID":"74d9b1fa-f4d3-4b9d-8fef-e433d694a891","Type":"ContainerStarted","Data":"600802e8ec13ef470dfafe38ae409eee60ff7b9b39341920725470bbcdeb26b5"} Mar 09 14:20:07 crc kubenswrapper[4723]: I0309 14:20:07.264072 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-v7l4f"] Mar 09 14:20:07 crc kubenswrapper[4723]: I0309 14:20:07.278416 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-v7l4f"] Mar 09 14:20:08 crc kubenswrapper[4723]: I0309 14:20:08.213405 4723 generic.go:334] "Generic (PLEG): container finished" podID="74d9b1fa-f4d3-4b9d-8fef-e433d694a891" containerID="817078669c2a0e70fb168898c8884a4887147fd05fcc918f871e2c377bdcf9a1" exitCode=0 Mar 09 14:20:08 crc kubenswrapper[4723]: I0309 14:20:08.213512 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hv968/crc-debug-8xxq6" event={"ID":"74d9b1fa-f4d3-4b9d-8fef-e433d694a891","Type":"ContainerDied","Data":"817078669c2a0e70fb168898c8884a4887147fd05fcc918f871e2c377bdcf9a1"} Mar 09 14:20:08 crc kubenswrapper[4723]: I0309 14:20:08.899632 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e5b9c61-8285-4bfb-aa90-1dcb3130a55d" path="/var/lib/kubelet/pods/1e5b9c61-8285-4bfb-aa90-1dcb3130a55d/volumes" Mar 09 14:20:09 crc kubenswrapper[4723]: I0309 14:20:09.362145 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hv968/crc-debug-8xxq6"] Mar 09 14:20:09 crc kubenswrapper[4723]: I0309 14:20:09.371770 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hv968/crc-debug-8xxq6"] Mar 09 14:20:09 crc kubenswrapper[4723]: I0309 14:20:09.735291 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hv968/crc-debug-8xxq6" Mar 09 14:20:09 crc kubenswrapper[4723]: I0309 14:20:09.910474 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74d9b1fa-f4d3-4b9d-8fef-e433d694a891-host\") pod \"74d9b1fa-f4d3-4b9d-8fef-e433d694a891\" (UID: \"74d9b1fa-f4d3-4b9d-8fef-e433d694a891\") " Mar 09 14:20:09 crc kubenswrapper[4723]: I0309 14:20:09.910582 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d9b1fa-f4d3-4b9d-8fef-e433d694a891-host" (OuterVolumeSpecName: "host") pod "74d9b1fa-f4d3-4b9d-8fef-e433d694a891" (UID: "74d9b1fa-f4d3-4b9d-8fef-e433d694a891"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:20:09 crc kubenswrapper[4723]: I0309 14:20:09.910878 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czf72\" (UniqueName: \"kubernetes.io/projected/74d9b1fa-f4d3-4b9d-8fef-e433d694a891-kube-api-access-czf72\") pod \"74d9b1fa-f4d3-4b9d-8fef-e433d694a891\" (UID: \"74d9b1fa-f4d3-4b9d-8fef-e433d694a891\") " Mar 09 14:20:09 crc kubenswrapper[4723]: I0309 14:20:09.911452 4723 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74d9b1fa-f4d3-4b9d-8fef-e433d694a891-host\") on node \"crc\" DevicePath \"\"" Mar 09 14:20:09 crc kubenswrapper[4723]: I0309 14:20:09.924268 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d9b1fa-f4d3-4b9d-8fef-e433d694a891-kube-api-access-czf72" (OuterVolumeSpecName: "kube-api-access-czf72") pod "74d9b1fa-f4d3-4b9d-8fef-e433d694a891" (UID: "74d9b1fa-f4d3-4b9d-8fef-e433d694a891"). InnerVolumeSpecName "kube-api-access-czf72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:20:10 crc kubenswrapper[4723]: I0309 14:20:10.014329 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czf72\" (UniqueName: \"kubernetes.io/projected/74d9b1fa-f4d3-4b9d-8fef-e433d694a891-kube-api-access-czf72\") on node \"crc\" DevicePath \"\"" Mar 09 14:20:10 crc kubenswrapper[4723]: I0309 14:20:10.243990 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="600802e8ec13ef470dfafe38ae409eee60ff7b9b39341920725470bbcdeb26b5" Mar 09 14:20:10 crc kubenswrapper[4723]: I0309 14:20:10.244076 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hv968/crc-debug-8xxq6" Mar 09 14:20:10 crc kubenswrapper[4723]: I0309 14:20:10.796886 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hv968/crc-debug-sr59l"] Mar 09 14:20:10 crc kubenswrapper[4723]: E0309 14:20:10.798143 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d9b1fa-f4d3-4b9d-8fef-e433d694a891" containerName="container-00" Mar 09 14:20:10 crc kubenswrapper[4723]: I0309 14:20:10.798188 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d9b1fa-f4d3-4b9d-8fef-e433d694a891" containerName="container-00" Mar 09 14:20:10 crc kubenswrapper[4723]: E0309 14:20:10.798211 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d85affa1-1885-445d-8a82-9f92136ae7f5" containerName="oc" Mar 09 14:20:10 crc kubenswrapper[4723]: I0309 14:20:10.798223 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="d85affa1-1885-445d-8a82-9f92136ae7f5" containerName="oc" Mar 09 14:20:10 crc kubenswrapper[4723]: I0309 14:20:10.798617 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d9b1fa-f4d3-4b9d-8fef-e433d694a891" containerName="container-00" Mar 09 14:20:10 crc kubenswrapper[4723]: I0309 14:20:10.798665 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="d85affa1-1885-445d-8a82-9f92136ae7f5" containerName="oc" Mar 09 14:20:10 crc kubenswrapper[4723]: I0309 14:20:10.800273 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hv968/crc-debug-sr59l" Mar 09 14:20:10 crc kubenswrapper[4723]: I0309 14:20:10.900186 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d9b1fa-f4d3-4b9d-8fef-e433d694a891" path="/var/lib/kubelet/pods/74d9b1fa-f4d3-4b9d-8fef-e433d694a891/volumes" Mar 09 14:20:10 crc kubenswrapper[4723]: I0309 14:20:10.944531 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2dbaf07a-3d50-45a4-a848-f5f3c1636307-host\") pod \"crc-debug-sr59l\" (UID: \"2dbaf07a-3d50-45a4-a848-f5f3c1636307\") " pod="openshift-must-gather-hv968/crc-debug-sr59l" Mar 09 14:20:10 crc kubenswrapper[4723]: I0309 14:20:10.944874 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr7bp\" (UniqueName: \"kubernetes.io/projected/2dbaf07a-3d50-45a4-a848-f5f3c1636307-kube-api-access-wr7bp\") pod \"crc-debug-sr59l\" (UID: \"2dbaf07a-3d50-45a4-a848-f5f3c1636307\") " pod="openshift-must-gather-hv968/crc-debug-sr59l" Mar 09 14:20:11 crc kubenswrapper[4723]: I0309 14:20:11.046610 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2dbaf07a-3d50-45a4-a848-f5f3c1636307-host\") pod \"crc-debug-sr59l\" (UID: \"2dbaf07a-3d50-45a4-a848-f5f3c1636307\") " pod="openshift-must-gather-hv968/crc-debug-sr59l" Mar 09 14:20:11 crc kubenswrapper[4723]: I0309 14:20:11.047002 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2dbaf07a-3d50-45a4-a848-f5f3c1636307-host\") pod \"crc-debug-sr59l\" (UID: \"2dbaf07a-3d50-45a4-a848-f5f3c1636307\") " pod="openshift-must-gather-hv968/crc-debug-sr59l" Mar 09 14:20:11 crc kubenswrapper[4723]: I0309 14:20:11.047947 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr7bp\" (UniqueName: \"kubernetes.io/projected/2dbaf07a-3d50-45a4-a848-f5f3c1636307-kube-api-access-wr7bp\") pod \"crc-debug-sr59l\" (UID: \"2dbaf07a-3d50-45a4-a848-f5f3c1636307\") " pod="openshift-must-gather-hv968/crc-debug-sr59l" Mar 09 14:20:11 crc kubenswrapper[4723]: I0309 14:20:11.085073 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr7bp\" (UniqueName: \"kubernetes.io/projected/2dbaf07a-3d50-45a4-a848-f5f3c1636307-kube-api-access-wr7bp\") pod \"crc-debug-sr59l\" (UID: \"2dbaf07a-3d50-45a4-a848-f5f3c1636307\") " pod="openshift-must-gather-hv968/crc-debug-sr59l" Mar 09 14:20:11 crc kubenswrapper[4723]: I0309 14:20:11.135393 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hv968/crc-debug-sr59l" Mar 09 14:20:11 crc kubenswrapper[4723]: W0309 14:20:11.182196 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dbaf07a_3d50_45a4_a848_f5f3c1636307.slice/crio-dd72850d3b4cbdc9bd3ffb97f2b6f5adb1d5dc3617cd98a06ed7363a2f508e97 WatchSource:0}: Error finding container dd72850d3b4cbdc9bd3ffb97f2b6f5adb1d5dc3617cd98a06ed7363a2f508e97: Status 404 returned error can't find the container with id dd72850d3b4cbdc9bd3ffb97f2b6f5adb1d5dc3617cd98a06ed7363a2f508e97 Mar 09 14:20:11 crc kubenswrapper[4723]: I0309 14:20:11.256900 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hv968/crc-debug-sr59l" event={"ID":"2dbaf07a-3d50-45a4-a848-f5f3c1636307","Type":"ContainerStarted","Data":"dd72850d3b4cbdc9bd3ffb97f2b6f5adb1d5dc3617cd98a06ed7363a2f508e97"} Mar 09 14:20:13 crc kubenswrapper[4723]: I0309 14:20:13.306221 4723 generic.go:334] "Generic (PLEG): container finished" podID="2dbaf07a-3d50-45a4-a848-f5f3c1636307" containerID="1bf4e09c48f5b1ff8dabc7fb66b6cbc7cdb25335f46212187b008606a89977f7" exitCode=0 Mar 09 14:20:13 crc kubenswrapper[4723]: I0309 14:20:13.306308 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hv968/crc-debug-sr59l" event={"ID":"2dbaf07a-3d50-45a4-a848-f5f3c1636307","Type":"ContainerDied","Data":"1bf4e09c48f5b1ff8dabc7fb66b6cbc7cdb25335f46212187b008606a89977f7"} Mar 09 14:20:13 crc kubenswrapper[4723]: I0309 14:20:13.357897 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hv968/crc-debug-sr59l"] Mar 09 14:20:13 crc kubenswrapper[4723]: I0309 14:20:13.368562 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hv968/crc-debug-sr59l"] Mar 09 14:20:14 crc kubenswrapper[4723]: I0309 14:20:14.462672 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hv968/crc-debug-sr59l" Mar 09 14:20:14 crc kubenswrapper[4723]: I0309 14:20:14.647743 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2dbaf07a-3d50-45a4-a848-f5f3c1636307-host\") pod \"2dbaf07a-3d50-45a4-a848-f5f3c1636307\" (UID: \"2dbaf07a-3d50-45a4-a848-f5f3c1636307\") " Mar 09 14:20:14 crc kubenswrapper[4723]: I0309 14:20:14.647825 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dbaf07a-3d50-45a4-a848-f5f3c1636307-host" (OuterVolumeSpecName: "host") pod "2dbaf07a-3d50-45a4-a848-f5f3c1636307" (UID: "2dbaf07a-3d50-45a4-a848-f5f3c1636307"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:20:14 crc kubenswrapper[4723]: I0309 14:20:14.647930 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr7bp\" (UniqueName: \"kubernetes.io/projected/2dbaf07a-3d50-45a4-a848-f5f3c1636307-kube-api-access-wr7bp\") pod \"2dbaf07a-3d50-45a4-a848-f5f3c1636307\" (UID: \"2dbaf07a-3d50-45a4-a848-f5f3c1636307\") " Mar 09 14:20:14 crc kubenswrapper[4723]: I0309 14:20:14.648685 4723 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2dbaf07a-3d50-45a4-a848-f5f3c1636307-host\") on node \"crc\" DevicePath \"\"" Mar 09 14:20:14 crc kubenswrapper[4723]: I0309 14:20:14.654685 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dbaf07a-3d50-45a4-a848-f5f3c1636307-kube-api-access-wr7bp" (OuterVolumeSpecName: "kube-api-access-wr7bp") pod "2dbaf07a-3d50-45a4-a848-f5f3c1636307" (UID: "2dbaf07a-3d50-45a4-a848-f5f3c1636307"). InnerVolumeSpecName "kube-api-access-wr7bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:20:14 crc kubenswrapper[4723]: I0309 14:20:14.749556 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr7bp\" (UniqueName: \"kubernetes.io/projected/2dbaf07a-3d50-45a4-a848-f5f3c1636307-kube-api-access-wr7bp\") on node \"crc\" DevicePath \"\"" Mar 09 14:20:14 crc kubenswrapper[4723]: I0309 14:20:14.895311 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dbaf07a-3d50-45a4-a848-f5f3c1636307" path="/var/lib/kubelet/pods/2dbaf07a-3d50-45a4-a848-f5f3c1636307/volumes" Mar 09 14:20:15 crc kubenswrapper[4723]: I0309 14:20:15.328963 4723 scope.go:117] "RemoveContainer" containerID="1bf4e09c48f5b1ff8dabc7fb66b6cbc7cdb25335f46212187b008606a89977f7" Mar 09 14:20:15 crc kubenswrapper[4723]: I0309 14:20:15.329034 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hv968/crc-debug-sr59l" Mar 09 14:20:15 crc kubenswrapper[4723]: I0309 14:20:15.881537 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:20:15 crc kubenswrapper[4723]: E0309 14:20:15.882357 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:20:28 crc kubenswrapper[4723]: I0309 14:20:28.881545 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:20:28 crc kubenswrapper[4723]: E0309 14:20:28.882391 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:20:29 crc kubenswrapper[4723]: I0309 14:20:29.072583 4723 scope.go:117] "RemoveContainer" containerID="9ba5e1c7e54a9750af3ad8343e7a78621063b1a6c1b4b940b126177930680dfa" Mar 09 14:20:42 crc kubenswrapper[4723]: I0309 14:20:42.881633 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:20:42 crc kubenswrapper[4723]: E0309 14:20:42.883239 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:20:52 crc kubenswrapper[4723]: I0309 14:20:52.484412 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_abe6e5e7-570d-40e4-995c-2921d3bd93ca/aodh-api/0.log" Mar 09 14:20:52 crc kubenswrapper[4723]: I0309 14:20:52.490158 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_abe6e5e7-570d-40e4-995c-2921d3bd93ca/aodh-evaluator/0.log" Mar 09 14:20:52 crc kubenswrapper[4723]: I0309 14:20:52.585409 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_abe6e5e7-570d-40e4-995c-2921d3bd93ca/aodh-listener/0.log" Mar 09 14:20:52 crc kubenswrapper[4723]: I0309 14:20:52.720500 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-695767fcfd-4rv9j_6a33c0e4-cc6f-49f1-bdb3-dab35255fd07/barbican-api/0.log" Mar 09 14:20:52 crc kubenswrapper[4723]: I0309 14:20:52.729698 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_abe6e5e7-570d-40e4-995c-2921d3bd93ca/aodh-notifier/0.log" Mar 09 14:20:52 crc kubenswrapper[4723]: I0309 14:20:52.819377 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-695767fcfd-4rv9j_6a33c0e4-cc6f-49f1-bdb3-dab35255fd07/barbican-api-log/0.log" Mar 09 14:20:53 crc 
kubenswrapper[4723]: I0309 14:20:53.035483 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64c6fb6dbb-56jkv_cbf24888-ed0f-4900-963d-c61c23af5bfe/barbican-keystone-listener-log/0.log" Mar 09 14:20:53 crc kubenswrapper[4723]: I0309 14:20:53.052798 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64c6fb6dbb-56jkv_cbf24888-ed0f-4900-963d-c61c23af5bfe/barbican-keystone-listener/0.log" Mar 09 14:20:53 crc kubenswrapper[4723]: I0309 14:20:53.239440 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6b7867888c-cmz44_f6048045-801d-4833-87ed-b47076a02338/barbican-worker-log/0.log" Mar 09 14:20:53 crc kubenswrapper[4723]: I0309 14:20:53.239521 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6b7867888c-cmz44_f6048045-801d-4833-87ed-b47076a02338/barbican-worker/0.log" Mar 09 14:20:53 crc kubenswrapper[4723]: I0309 14:20:53.441881 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jxhhr_421e85c4-9862-4601-ab03-d2b602ba68f4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:20:53 crc kubenswrapper[4723]: I0309 14:20:53.518587 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_df18bf19-d23a-471f-8074-2eaaa7c4aead/ceilometer-central-agent/1.log" Mar 09 14:20:53 crc kubenswrapper[4723]: I0309 14:20:53.602816 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_df18bf19-d23a-471f-8074-2eaaa7c4aead/ceilometer-central-agent/0.log" Mar 09 14:20:53 crc kubenswrapper[4723]: I0309 14:20:53.686781 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_df18bf19-d23a-471f-8074-2eaaa7c4aead/ceilometer-notification-agent/0.log" Mar 09 14:20:53 crc kubenswrapper[4723]: I0309 14:20:53.735981 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_df18bf19-d23a-471f-8074-2eaaa7c4aead/proxy-httpd/0.log" Mar 09 14:20:53 crc kubenswrapper[4723]: I0309 14:20:53.787088 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_df18bf19-d23a-471f-8074-2eaaa7c4aead/sg-core/0.log" Mar 09 14:20:53 crc kubenswrapper[4723]: I0309 14:20:53.949702 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_08390bdf-47f0-43a2-89ac-1a7240dcd2dd/cinder-api/0.log" Mar 09 14:20:53 crc kubenswrapper[4723]: I0309 14:20:53.977620 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_08390bdf-47f0-43a2-89ac-1a7240dcd2dd/cinder-api-log/0.log" Mar 09 14:20:54 crc kubenswrapper[4723]: I0309 14:20:54.172175 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cb802a89-59e3-4b45-bb49-20b980e06a57/cinder-scheduler/1.log" Mar 09 14:20:54 crc kubenswrapper[4723]: I0309 14:20:54.189958 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cb802a89-59e3-4b45-bb49-20b980e06a57/cinder-scheduler/0.log" Mar 09 14:20:54 crc kubenswrapper[4723]: I0309 14:20:54.229124 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cb802a89-59e3-4b45-bb49-20b980e06a57/probe/0.log" Mar 09 14:20:54 crc kubenswrapper[4723]: I0309 14:20:54.399270 4723 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-p96wk_5c63771a-b480-44bf-98f4-68ac62c7189a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:20:54 crc kubenswrapper[4723]: I0309 14:20:54.456502 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-b5dvf_18f88a77-e904-4382-84ac-64567a2ef585/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:20:54 crc kubenswrapper[4723]: I0309 14:20:54.678772 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bb85b8995-q4x27_73fbad24-6f18-4227-bc88-968635e92584/init/0.log" Mar 09 14:20:55 crc kubenswrapper[4723]: I0309 14:20:55.072594 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bb85b8995-q4x27_73fbad24-6f18-4227-bc88-968635e92584/init/0.log" Mar 09 14:20:55 crc kubenswrapper[4723]: I0309 14:20:55.170888 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-vrckm_e9c9b511-2ead-4e25-a076-846d1723510b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:20:55 crc kubenswrapper[4723]: I0309 14:20:55.195613 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bb85b8995-q4x27_73fbad24-6f18-4227-bc88-968635e92584/dnsmasq-dns/0.log" Mar 09 14:20:55 crc kubenswrapper[4723]: I0309 14:20:55.409118 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a40a3caa-3f07-4139-9ca7-0daa1fbc2806/glance-httpd/0.log" Mar 09 14:20:55 crc kubenswrapper[4723]: I0309 14:20:55.419438 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a40a3caa-3f07-4139-9ca7-0daa1fbc2806/glance-log/0.log" Mar 09 14:20:55 crc kubenswrapper[4723]: I0309 14:20:55.612570 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_27c3ccb2-5302-4c07-98d7-2e4a9267f423/glance-httpd/0.log" Mar 09 14:20:55 crc kubenswrapper[4723]: I0309 14:20:55.656055 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_27c3ccb2-5302-4c07-98d7-2e4a9267f423/glance-log/0.log" Mar 09 14:20:55 crc kubenswrapper[4723]: I0309 14:20:55.881295 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:20:55 crc kubenswrapper[4723]: E0309 14:20:55.881624 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:20:56 crc kubenswrapper[4723]: I0309 14:20:56.310823 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-6c4d4cb58-tqwpm_e28e2823-9dea-4b47-9489-47442b2cfa08/heat-api/0.log" Mar 09 14:20:56 crc kubenswrapper[4723]: I0309 14:20:56.470117 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-85fdddbb7f-wkltg_3bc1a36b-8585-4de5-aca1-32edc3ef3b8d/heat-cfnapi/0.log" Mar 09 14:20:56 crc kubenswrapper[4723]: I0309 14:20:56.499190 4723 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_heat-engine-cd9b85f6c-jhcds_655446a7-da12-4146-bf48-9a7a74765c5c/heat-engine/0.log" Mar 09 14:20:56 crc kubenswrapper[4723]: I0309 14:20:56.571820 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-qg5wh_746fd331-118e-4649-bd60-d93fa3cb0f12/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:20:56 crc kubenswrapper[4723]: I0309 14:20:56.801208 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-pwndd_8459660b-c673-46c5-81de-8081c3545a15/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:20:57 crc kubenswrapper[4723]: I0309 14:20:57.098932 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29551081-tq9wf_db975a46-d705-4df6-a5d6-1a598088dfd7/keystone-cron/0.log" Mar 09 14:20:57 crc kubenswrapper[4723]: I0309 14:20:57.268519 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_7d7e1dc5-cfb6-423a-9098-24e0ca23e7b9/kube-state-metrics/0.log" Mar 09 14:20:57 crc kubenswrapper[4723]: I0309 14:20:57.433813 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-mlxv7_e7260959-2422-4f47-b153-63bba9a58875/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:20:57 crc kubenswrapper[4723]: I0309 14:20:57.518754 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-db7kb_b024d678-4342-4754-86de-89af19e8153a/logging-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:20:57 crc kubenswrapper[4723]: I0309 14:20:57.765965 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-644bd545d4-m82n9_8d4bcf87-b989-4476-b780-23cfb3da4504/keystone-api/0.log" Mar 09 14:20:57 crc kubenswrapper[4723]: I0309 14:20:57.780522 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_add32e59-a256-4ac8-9e96-8f340b9119de/mysqld-exporter/0.log" Mar 09 14:20:58 crc kubenswrapper[4723]: I0309 14:20:58.196595 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6556fbf64c-254qf_57d35c3a-e724-4070-b5b4-e6a979b0c09a/neutron-httpd/0.log" Mar 09 14:20:58 crc kubenswrapper[4723]: I0309 14:20:58.297775 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6556fbf64c-254qf_57d35c3a-e724-4070-b5b4-e6a979b0c09a/neutron-api/0.log" Mar 09 14:20:58 crc kubenswrapper[4723]: I0309 14:20:58.436720 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jqf7q_eccdafec-6101-40db-8d3a-a7141546b0b5/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:20:58 crc kubenswrapper[4723]: I0309 14:20:58.955368 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f73e368c-e54c-4f8f-9e50-857e5e72f8ce/nova-api-log/0.log" Mar 09 14:20:58 crc kubenswrapper[4723]: I0309 14:20:58.963987 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_2d33e56a-fbb0-4ac1-a383-9ad42906d556/nova-cell0-conductor-conductor/0.log" Mar 09 14:20:59 crc kubenswrapper[4723]: I0309 14:20:59.279084 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3884df2d-0e53-4bdd-b661-b809a2791240/nova-cell1-conductor-conductor/0.log" Mar 09 14:20:59 crc 
kubenswrapper[4723]: I0309 14:20:59.332603 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_44bd7e16-39c2-4847-8f6f-523ade24e8cb/nova-cell1-novncproxy-novncproxy/0.log" Mar 09 14:20:59 crc kubenswrapper[4723]: I0309 14:20:59.338783 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f73e368c-e54c-4f8f-9e50-857e5e72f8ce/nova-api-api/0.log" Mar 09 14:20:59 crc kubenswrapper[4723]: I0309 14:20:59.587433 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-sxtxv_bd750bc2-9d58-4d3d-9d73-644c1bce9804/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:20:59 crc kubenswrapper[4723]: I0309 14:20:59.718233 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_fdae6e1e-9f23-495b-aeef-2a457377db3a/nova-metadata-log/0.log" Mar 09 14:21:00 crc kubenswrapper[4723]: I0309 14:21:00.020719 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c19eac35-9f03-4dea-b1a2-72276e8a1074/nova-scheduler-scheduler/0.log" Mar 09 14:21:00 crc kubenswrapper[4723]: I0309 14:21:00.122085 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f01dc50c-55d6-4f99-92f8-d3adfcf8d71b/mysql-bootstrap/0.log" Mar 09 14:21:00 crc kubenswrapper[4723]: I0309 14:21:00.482280 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f01dc50c-55d6-4f99-92f8-d3adfcf8d71b/mysql-bootstrap/0.log" Mar 09 14:21:00 crc kubenswrapper[4723]: I0309 14:21:00.490331 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f01dc50c-55d6-4f99-92f8-d3adfcf8d71b/galera/0.log" Mar 09 14:21:00 crc kubenswrapper[4723]: I0309 14:21:00.745559 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17/mysql-bootstrap/0.log" Mar 09 14:21:00 crc kubenswrapper[4723]: I0309 14:21:00.882235 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17/mysql-bootstrap/0.log" Mar 09 14:21:00 crc kubenswrapper[4723]: I0309 14:21:00.967968 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17/galera/1.log" Mar 09 14:21:00 crc kubenswrapper[4723]: I0309 14:21:00.980261 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5e6e9b36-677a-46ad-8fbf-a5d5b6eadc17/galera/0.log" Mar 09 14:21:01 crc kubenswrapper[4723]: I0309 14:21:01.191531 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2c7e526b-cb25-4469-a6bd-b19fa44ca499/openstackclient/0.log" Mar 09 14:21:01 crc kubenswrapper[4723]: I0309 14:21:01.448717 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-5n52p_ea8d3865-305b-4ab6-833c-f8b227b6bae4/ovn-controller/0.log" Mar 09 14:21:01 crc kubenswrapper[4723]: I0309 14:21:01.541066 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_fdae6e1e-9f23-495b-aeef-2a457377db3a/nova-metadata-metadata/0.log" Mar 09 14:21:01 crc kubenswrapper[4723]: I0309 14:21:01.541551 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-vxlcg_ae48b5ff-caaa-4d4c-81c5-201afe2220ef/openstack-network-exporter/0.log" Mar 09 14:21:01 crc 
kubenswrapper[4723]: I0309 14:21:01.685628 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-65x7z_afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa/ovsdb-server-init/0.log" Mar 09 14:21:01 crc kubenswrapper[4723]: I0309 14:21:01.945260 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-65x7z_afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa/ovsdb-server-init/0.log" Mar 09 14:21:01 crc kubenswrapper[4723]: I0309 14:21:01.961097 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-65x7z_afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa/ovsdb-server/0.log" Mar 09 14:21:01 crc kubenswrapper[4723]: I0309 14:21:01.963820 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-65x7z_afd41bfc-ea4f-4ac5-aa58-896d8fd76cfa/ovs-vswitchd/0.log" Mar 09 14:21:02 crc kubenswrapper[4723]: I0309 14:21:02.185890 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_80aecab8-a10c-48aa-9cba-a35bd822cc09/openstack-network-exporter/0.log" Mar 09 14:21:02 crc kubenswrapper[4723]: I0309 14:21:02.273264 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-b8zjw_940da565-ce26-42f3-a35d-dbaa3efd8521/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:21:02 crc kubenswrapper[4723]: I0309 14:21:02.293156 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_80aecab8-a10c-48aa-9cba-a35bd822cc09/ovn-northd/0.log" Mar 09 14:21:02 crc kubenswrapper[4723]: I0309 14:21:02.715142 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a66d0873-8ac9-4410-92d9-aaf8efdaa527/openstack-network-exporter/0.log" Mar 09 14:21:02 crc kubenswrapper[4723]: I0309 14:21:02.788090 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a66d0873-8ac9-4410-92d9-aaf8efdaa527/ovsdbserver-nb/0.log" Mar 09 14:21:03 crc kubenswrapper[4723]: I0309 14:21:03.032627 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3da5bb20-bbc3-4c40-9b5c-f1cb12074c23/openstack-network-exporter/0.log" Mar 09 14:21:03 crc kubenswrapper[4723]: I0309 14:21:03.119055 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3da5bb20-bbc3-4c40-9b5c-f1cb12074c23/ovsdbserver-sb/0.log" Mar 09 14:21:03 crc kubenswrapper[4723]: I0309 14:21:03.343286 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7d4b578b96-kksd4_764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf/placement-api/0.log" Mar 09 14:21:03 crc kubenswrapper[4723]: I0309 14:21:03.419325 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3e47df78-6587-4f83-a1c9-dcaf0aa9b73c/init-config-reloader/0.log" Mar 09 14:21:03 crc kubenswrapper[4723]: I0309 14:21:03.534051 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7d4b578b96-kksd4_764f6f4a-4b34-41d3-b8b2-e7cd8cc754bf/placement-log/0.log" Mar 09 14:21:03 crc kubenswrapper[4723]: I0309 14:21:03.691698 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3e47df78-6587-4f83-a1c9-dcaf0aa9b73c/init-config-reloader/0.log" Mar 09 14:21:03 crc kubenswrapper[4723]: I0309 14:21:03.740285 4723 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_3e47df78-6587-4f83-a1c9-dcaf0aa9b73c/config-reloader/0.log" Mar 09 14:21:03 crc kubenswrapper[4723]: I0309 14:21:03.773004 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3e47df78-6587-4f83-a1c9-dcaf0aa9b73c/thanos-sidecar/0.log" Mar 09 14:21:03 crc kubenswrapper[4723]: I0309 14:21:03.897434 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3e47df78-6587-4f83-a1c9-dcaf0aa9b73c/prometheus/0.log" Mar 09 14:21:04 crc kubenswrapper[4723]: I0309 14:21:04.029436 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d/setup-container/0.log" Mar 09 14:21:04 crc kubenswrapper[4723]: I0309 14:21:04.262129 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d/rabbitmq/0.log" Mar 09 14:21:04 crc kubenswrapper[4723]: I0309 14:21:04.297400 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fc5e8b3b-e65f-427a-81ef-83cb5edf3b3d/setup-container/0.log" Mar 09 14:21:04 crc kubenswrapper[4723]: I0309 14:21:04.325621 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b3c6ecc2-5b7f-43c3-adfc-d741cb3be077/setup-container/0.log" Mar 09 14:21:04 crc kubenswrapper[4723]: I0309 14:21:04.601071 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_447c3a5d-33c3-40ad-a5e4-dce2a0533b8f/setup-container/0.log" Mar 09 14:21:04 crc kubenswrapper[4723]: I0309 14:21:04.636022 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b3c6ecc2-5b7f-43c3-adfc-d741cb3be077/setup-container/0.log" Mar 09 14:21:04 crc kubenswrapper[4723]: I0309 14:21:04.667654 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b3c6ecc2-5b7f-43c3-adfc-d741cb3be077/rabbitmq/0.log" Mar 09 14:21:04 crc kubenswrapper[4723]: I0309 14:21:04.953610 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_447c3a5d-33c3-40ad-a5e4-dce2a0533b8f/setup-container/0.log" Mar 09 14:21:04 crc kubenswrapper[4723]: I0309 14:21:04.984425 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_2e967475-660d-4ada-b409-bae77e4f6905/setup-container/0.log" Mar 09 14:21:05 crc kubenswrapper[4723]: I0309 14:21:05.002962 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_447c3a5d-33c3-40ad-a5e4-dce2a0533b8f/rabbitmq/0.log" Mar 09 14:21:05 crc kubenswrapper[4723]: I0309 14:21:05.316547 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-flc4d_2b7aa25e-6eb5-4559-a5e9-ae08b27093ef/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:21:05 crc kubenswrapper[4723]: I0309 14:21:05.332275 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_2e967475-660d-4ada-b409-bae77e4f6905/rabbitmq/0.log" Mar 09 14:21:05 crc kubenswrapper[4723]: I0309 14:21:05.361261 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_2e967475-660d-4ada-b409-bae77e4f6905/setup-container/0.log" Mar 09 14:21:05 crc kubenswrapper[4723]: I0309 14:21:05.613953 4723 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-fcdn6_c97cec8e-5cb2-455b-8b57-8179ced146c3/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:21:05 crc kubenswrapper[4723]: I0309 14:21:05.650245 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-fxqdh_6a2886a0-218f-4284-aacd-19614f6f602f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:21:05 crc kubenswrapper[4723]: I0309 14:21:05.880637 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-tjglg_59ee4cf2-3f2e-4b78-8060-1e8f405ab1c8/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:21:06 crc kubenswrapper[4723]: I0309 14:21:06.015734 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-r6qpm_ac5a7be9-9a6b-4595-9cd9-ddec8641c533/ssh-known-hosts-edpm-deployment/0.log" Mar 09 14:21:06 crc kubenswrapper[4723]: I0309 14:21:06.285404 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-575fd88557-l2fxr_818af82f-16fa-47eb-a868-2272b915b99c/proxy-server/0.log" Mar 09 14:21:06 crc kubenswrapper[4723]: I0309 14:21:06.355793 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-575fd88557-l2fxr_818af82f-16fa-47eb-a868-2272b915b99c/proxy-httpd/0.log" Mar 09 14:21:06 crc kubenswrapper[4723]: I0309 14:21:06.738972 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-v4zmc_10b17a94-81eb-4e72-bd49-97f590e26aec/swift-ring-rebalance/0.log" Mar 09 14:21:06 crc kubenswrapper[4723]: I0309 14:21:06.917462 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/account-auditor/0.log" Mar 09 14:21:06 crc kubenswrapper[4723]: I0309 14:21:06.949488 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/account-reaper/0.log" Mar 09 14:21:07 crc kubenswrapper[4723]: I0309 14:21:07.029677 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/account-replicator/0.log" Mar 09 14:21:07 crc kubenswrapper[4723]: I0309 14:21:07.104483 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/account-server/0.log" Mar 09 14:21:07 crc kubenswrapper[4723]: I0309 14:21:07.209254 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/container-auditor/0.log" Mar 09 14:21:07 crc kubenswrapper[4723]: I0309 14:21:07.210118 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/container-replicator/0.log" Mar 09 14:21:07 crc kubenswrapper[4723]: I0309 14:21:07.254222 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/container-server/0.log" Mar 09 14:21:07 crc kubenswrapper[4723]: I0309 14:21:07.367890 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/container-updater/0.log" Mar 09 14:21:07 crc kubenswrapper[4723]: I0309 14:21:07.882767 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:21:07 
crc kubenswrapper[4723]: E0309 14:21:07.883061 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:21:08 crc kubenswrapper[4723]: I0309 14:21:08.019656 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/object-expirer/0.log" Mar 09 14:21:08 crc kubenswrapper[4723]: I0309 14:21:08.094855 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/object-server/0.log" Mar 09 14:21:08 crc kubenswrapper[4723]: I0309 14:21:08.104194 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/object-replicator/0.log" Mar 09 14:21:08 crc kubenswrapper[4723]: I0309 14:21:08.112537 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/object-auditor/0.log" Mar 09 14:21:08 crc kubenswrapper[4723]: I0309 14:21:08.260936 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/object-updater/0.log" Mar 09 14:21:08 crc kubenswrapper[4723]: I0309 14:21:08.387970 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/rsync/0.log" Mar 09 14:21:08 crc kubenswrapper[4723]: I0309 14:21:08.403636 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d924133b-d3c9-4b71-bbf4-a894a618e6c4/swift-recon-cron/0.log" Mar 09 14:21:08 crc kubenswrapper[4723]: I0309 14:21:08.634572 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4k4tb_76b26f74-a654-4507-a416-617f8fec3d89/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:21:08 crc kubenswrapper[4723]: I0309 14:21:08.770049 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-2jhmw_2f686f3c-fee2-4853-8ab5-459d64696efc/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:21:09 crc kubenswrapper[4723]: I0309 14:21:09.031916 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_818affa4-b183-47a4-9697-6151845f58d7/test-operator-logs-container/0.log" Mar 09 14:21:09 crc kubenswrapper[4723]: I0309 14:21:09.165484 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ef1f6085-70f7-44a1-bf7c-5b4c90284dda/tempest-tests-tempest-tests-runner/0.log" Mar 09 14:21:09 crc kubenswrapper[4723]: I0309 14:21:09.271407 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-7mq48_107e9055-2e58-49af-98bd-478922559642/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 14:21:14 crc kubenswrapper[4723]: I0309 14:21:14.896074 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_b01063ef-9ba6-4f2b-8298-46acf5a50e80/memcached/0.log" Mar 09 
14:21:18 crc kubenswrapper[4723]: I0309 14:21:18.881488 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:21:18 crc kubenswrapper[4723]: E0309 14:21:18.882471 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:21:32 crc kubenswrapper[4723]: I0309 14:21:32.881490 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:21:32 crc kubenswrapper[4723]: E0309 14:21:32.882529 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:21:43 crc kubenswrapper[4723]: I0309 14:21:43.881514 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186" Mar 09 14:21:44 crc kubenswrapper[4723]: I0309 14:21:44.658389 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"a1d1cb15a27887ef68b3d06baaa73779b78a555763876a2da46127c36e07be0e"} Mar 09 14:21:48 crc kubenswrapper[4723]: I0309 14:21:48.101083 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9_8f2b5701-dca6-45fa-8962-1d72ae98fe97/util/0.log" Mar 09 14:21:48 crc kubenswrapper[4723]: I0309 14:21:48.456695 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9_8f2b5701-dca6-45fa-8962-1d72ae98fe97/util/0.log" Mar 09 14:21:48 crc kubenswrapper[4723]: I0309 14:21:48.457741 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9_8f2b5701-dca6-45fa-8962-1d72ae98fe97/pull/0.log" Mar 09 14:21:48 crc kubenswrapper[4723]: I0309 14:21:48.515306 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9_8f2b5701-dca6-45fa-8962-1d72ae98fe97/pull/0.log" Mar 09 14:21:48 crc kubenswrapper[4723]: I0309 14:21:48.751265 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9_8f2b5701-dca6-45fa-8962-1d72ae98fe97/pull/0.log" Mar 09 14:21:48 crc kubenswrapper[4723]: I0309 14:21:48.796707 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9_8f2b5701-dca6-45fa-8962-1d72ae98fe97/util/0.log" Mar 09 14:21:48 crc kubenswrapper[4723]: I0309 14:21:48.868253 4723 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_21dcd4e21a427937548aca6e984807babb52619359c299a213cedf1d364pmf9_8f2b5701-dca6-45fa-8962-1d72ae98fe97/extract/0.log" Mar 09 14:21:49 crc kubenswrapper[4723]: I0309 14:21:49.445485 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-dwlzx_6bb6b3ee-7923-42ce-b36d-dabdaa42f829/manager/0.log" Mar 09 14:21:49 crc kubenswrapper[4723]: I0309 14:21:49.856298 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-89rtm_9646c273-606f-4551-82dd-39e09007dc17/manager/0.log" Mar 09 14:21:50 crc kubenswrapper[4723]: I0309 14:21:50.693853 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-9btqb_6bf9afff-37d5-41e4-9270-8994fc65deda/manager/0.log" Mar 09 14:21:50 crc kubenswrapper[4723]: I0309 14:21:50.949985 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-5wrr9_2fc3d688-53db-4d4e-9555-54c047570ae5/manager/0.log" Mar 09 14:21:51 crc kubenswrapper[4723]: I0309 14:21:51.833369 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-rwzzl_76830983-65b6-495a-8283-c9e2df80562b/manager/0.log" Mar 09 14:21:51 crc kubenswrapper[4723]: I0309 14:21:51.932268 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-v6s9h_5ea4f771-5b0c-410d-8a6c-a45b039edb6a/manager/0.log" Mar 09 14:21:52 crc kubenswrapper[4723]: I0309 14:21:52.595400 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-4zwkg_01b1451d-b917-4176-abf6-fd84021ba30d/manager/0.log" Mar 09 14:21:53 crc kubenswrapper[4723]: I0309 14:21:53.074449 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-96b5g_afe8d0e8-415a-4f80-8b5a-c3eb45e585cd/manager/0.log" Mar 09 14:21:53 crc kubenswrapper[4723]: I0309 14:21:53.079945 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-wrqqw_02c2f97c-15b6-4c33-8be5-c61cc982e989/manager/0.log" Mar 09 14:21:53 crc kubenswrapper[4723]: I0309 14:21:53.454037 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-b2fx2_49f841ea-0808-406e-a0d0-671f5db13f93/manager/0.log" Mar 09 14:21:53 crc kubenswrapper[4723]: I0309 14:21:53.474680 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-czjxc_c4d1a44c-121a-4326-9920-af7e6f87a031/manager/0.log" Mar 09 14:21:53 crc kubenswrapper[4723]: I0309 14:21:53.763065 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-6npjh_eb08b38d-0624-4bd5-a3ba-9447cdbc80fb/manager/0.log" Mar 09 14:21:53 crc kubenswrapper[4723]: I0309 14:21:53.798512 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-lbqm8_a8a23c57-bff5-4820-955c-441521c1e8f2/manager/0.log" Mar 09 14:21:54 crc kubenswrapper[4723]: I0309 14:21:54.161407 4723 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cz5w4f_c3f17509-7e0b-452d-b3ca-0a3210159f17/manager/0.log" Mar 09 14:21:54 crc kubenswrapper[4723]: I0309 14:21:54.410460 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7d644d7fb7-r7swc_b223943e-1394-48af-8f5c-78a9d370b602/operator/0.log" Mar 09 14:21:54 crc kubenswrapper[4723]: I0309 14:21:54.535127 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-9kl8t_194a48e1-f053-4aa1-bdfe-07aa2a8a208e/registry-server/0.log" Mar 09 14:21:54 crc kubenswrapper[4723]: I0309 14:21:54.827766 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-5wrg7_1e62b006-449e-440b-b425-d56fbb171cd5/manager/0.log" Mar 09 14:21:55 crc kubenswrapper[4723]: I0309 14:21:55.147804 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-mt7hb_f1620e57-58ba-4313-bba4-f5ece039f9f7/manager/0.log" Mar 09 14:21:55 crc kubenswrapper[4723]: I0309 14:21:55.198836 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-k56p4_d748e45b-6515-4e15-a776-61bbe83179c0/operator/0.log" Mar 09 14:21:55 crc kubenswrapper[4723]: I0309 14:21:55.412191 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-8w2sj_57b972b8-b38f-4bc5-8cb5-cb2d949ff3b8/manager/0.log" Mar 09 14:21:55 crc kubenswrapper[4723]: I0309 14:21:55.876667 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-6qstb_8554b7c9-0bd7-4326-b906-fe07dcdce9da/manager/0.log" Mar 09 14:21:56 crc kubenswrapper[4723]: I0309 14:21:56.170229 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-4czcc_e93b778c-c10f-4da5-a3c2-91010b4b3aab/manager/0.log" Mar 09 14:21:56 crc kubenswrapper[4723]: I0309 14:21:56.226710 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5b9fbd87f-ssps2_36e23b55-a129-4c5f-8938-26f58742541b/manager/0.log" Mar 09 14:21:56 crc kubenswrapper[4723]: I0309 14:21:56.761452 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-76577b8ddd-8748r_b9b75469-0c5d-47b4-b75c-28cdf8316167/manager/0.log" Mar 09 14:22:00 crc kubenswrapper[4723]: I0309 14:22:00.152481 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551102-h6b6r"] Mar 09 14:22:00 crc kubenswrapper[4723]: E0309 14:22:00.153316 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dbaf07a-3d50-45a4-a848-f5f3c1636307" containerName="container-00" Mar 09 14:22:00 crc kubenswrapper[4723]: I0309 14:22:00.153329 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dbaf07a-3d50-45a4-a848-f5f3c1636307" containerName="container-00" Mar 09 14:22:00 crc kubenswrapper[4723]: I0309 14:22:00.153589 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dbaf07a-3d50-45a4-a848-f5f3c1636307" containerName="container-00" Mar 09 14:22:00 crc kubenswrapper[4723]: I0309 14:22:00.154438 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-h6b6r" Mar 09 14:22:00 crc kubenswrapper[4723]: I0309 14:22:00.161616 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-h6b6r"] Mar 09 14:22:00 crc kubenswrapper[4723]: I0309 14:22:00.185141 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 14:22:00 crc kubenswrapper[4723]: I0309 14:22:00.185726 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:22:00 crc kubenswrapper[4723]: I0309 14:22:00.185903 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:22:00 crc kubenswrapper[4723]: I0309 14:22:00.259379 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx6pv\" (UniqueName: \"kubernetes.io/projected/52f56ac0-e521-4734-bbc5-94a35f0c1ae6-kube-api-access-lx6pv\") pod \"auto-csr-approver-29551102-h6b6r\" (UID: \"52f56ac0-e521-4734-bbc5-94a35f0c1ae6\") " pod="openshift-infra/auto-csr-approver-29551102-h6b6r" Mar 09 14:22:00 crc kubenswrapper[4723]: I0309 14:22:00.363751 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx6pv\" (UniqueName: \"kubernetes.io/projected/52f56ac0-e521-4734-bbc5-94a35f0c1ae6-kube-api-access-lx6pv\") pod \"auto-csr-approver-29551102-h6b6r\" (UID: \"52f56ac0-e521-4734-bbc5-94a35f0c1ae6\") " pod="openshift-infra/auto-csr-approver-29551102-h6b6r" Mar 09 14:22:00 crc kubenswrapper[4723]: I0309 14:22:00.413670 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx6pv\" (UniqueName: \"kubernetes.io/projected/52f56ac0-e521-4734-bbc5-94a35f0c1ae6-kube-api-access-lx6pv\") pod \"auto-csr-approver-29551102-h6b6r\" (UID: \"52f56ac0-e521-4734-bbc5-94a35f0c1ae6\") " pod="openshift-infra/auto-csr-approver-29551102-h6b6r" Mar 09 14:22:00 crc kubenswrapper[4723]: I0309 14:22:00.528170 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-h6b6r" Mar 09 14:22:01 crc kubenswrapper[4723]: I0309 14:22:01.802813 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-h6b6r"] Mar 09 14:22:01 crc kubenswrapper[4723]: I0309 14:22:01.847446 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:22:01 crc kubenswrapper[4723]: I0309 14:22:01.885455 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551102-h6b6r" event={"ID":"52f56ac0-e521-4734-bbc5-94a35f0c1ae6","Type":"ContainerStarted","Data":"7f2a5a024a08d16f0bed0699363bac28d56957499b41b2223ab826d913a1d95d"} Mar 09 14:22:02 crc kubenswrapper[4723]: I0309 14:22:02.353443 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-mtmcb_6e192922-8050-41f1-bf25-33a12ace409b/manager/0.log" Mar 09 14:22:03 crc kubenswrapper[4723]: I0309 14:22:03.945160 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551102-h6b6r" event={"ID":"52f56ac0-e521-4734-bbc5-94a35f0c1ae6","Type":"ContainerStarted","Data":"e0886828b34405d9c7162c8038432aa9836221d16b776c29c826a57817ed860f"} Mar 09 14:22:03 crc kubenswrapper[4723]: I0309 14:22:03.971599 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551102-h6b6r" podStartSLOduration=3.092643684 podStartE2EDuration="3.971572932s" podCreationTimestamp="2026-03-09 14:22:00 +0000 UTC" firstStartedPulling="2026-03-09 14:22:01.843156007 +0000 UTC m=+4995.857623547" lastFinishedPulling="2026-03-09 14:22:02.722085255 +0000 UTC m=+4996.736552795" observedRunningTime="2026-03-09 14:22:03.960762084 +0000 UTC m=+4997.975229644" watchObservedRunningTime="2026-03-09 14:22:03.971572932 +0000 UTC m=+4997.986040472" Mar 09 14:22:05 crc kubenswrapper[4723]: I0309 14:22:05.967732 4723 generic.go:334] "Generic (PLEG): container finished" podID="52f56ac0-e521-4734-bbc5-94a35f0c1ae6" containerID="e0886828b34405d9c7162c8038432aa9836221d16b776c29c826a57817ed860f" exitCode=0 Mar 09 14:22:05 crc kubenswrapper[4723]: I0309 14:22:05.967913 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551102-h6b6r" event={"ID":"52f56ac0-e521-4734-bbc5-94a35f0c1ae6","Type":"ContainerDied","Data":"e0886828b34405d9c7162c8038432aa9836221d16b776c29c826a57817ed860f"} Mar 09 14:22:07 crc kubenswrapper[4723]: I0309 14:22:07.483529 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-h6b6r" Mar 09 14:22:07 crc kubenswrapper[4723]: I0309 14:22:07.571234 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx6pv\" (UniqueName: \"kubernetes.io/projected/52f56ac0-e521-4734-bbc5-94a35f0c1ae6-kube-api-access-lx6pv\") pod \"52f56ac0-e521-4734-bbc5-94a35f0c1ae6\" (UID: \"52f56ac0-e521-4734-bbc5-94a35f0c1ae6\") " Mar 09 14:22:07 crc kubenswrapper[4723]: I0309 14:22:07.613353 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f56ac0-e521-4734-bbc5-94a35f0c1ae6-kube-api-access-lx6pv" (OuterVolumeSpecName: "kube-api-access-lx6pv") pod "52f56ac0-e521-4734-bbc5-94a35f0c1ae6" (UID: "52f56ac0-e521-4734-bbc5-94a35f0c1ae6"). InnerVolumeSpecName "kube-api-access-lx6pv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:22:07 crc kubenswrapper[4723]: I0309 14:22:07.673915 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx6pv\" (UniqueName: \"kubernetes.io/projected/52f56ac0-e521-4734-bbc5-94a35f0c1ae6-kube-api-access-lx6pv\") on node \"crc\" DevicePath \"\"" Mar 09 14:22:08 crc kubenswrapper[4723]: I0309 14:22:08.000427 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551102-h6b6r" event={"ID":"52f56ac0-e521-4734-bbc5-94a35f0c1ae6","Type":"ContainerDied","Data":"7f2a5a024a08d16f0bed0699363bac28d56957499b41b2223ab826d913a1d95d"} Mar 09 14:22:08 crc kubenswrapper[4723]: I0309 14:22:08.000481 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f2a5a024a08d16f0bed0699363bac28d56957499b41b2223ab826d913a1d95d" Mar 09 14:22:08 crc kubenswrapper[4723]: I0309 14:22:08.000506 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-h6b6r" Mar 09 14:22:08 crc kubenswrapper[4723]: I0309 14:22:08.053695 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-jrfpb"] Mar 09 14:22:08 crc kubenswrapper[4723]: I0309 14:22:08.070476 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-jrfpb"] Mar 09 14:22:08 crc kubenswrapper[4723]: I0309 14:22:08.895695 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d18d891-99a0-4089-86df-166cfa4297a6" path="/var/lib/kubelet/pods/8d18d891-99a0-4089-86df-166cfa4297a6/volumes" Mar 09 14:22:24 crc kubenswrapper[4723]: I0309 14:22:24.729602 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-kdxpm_330d0ce9-3cc0-427c-acf4-7c14f36add18/control-plane-machine-set-operator/0.log" Mar 09 14:22:24 crc kubenswrapper[4723]: I0309 14:22:24.907988 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t6z8c_d1a69d09-d3e2-4af9-857a-3229bc05c992/kube-rbac-proxy/0.log" Mar 09 14:22:24 crc kubenswrapper[4723]: I0309 14:22:24.990878 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t6z8c_d1a69d09-d3e2-4af9-857a-3229bc05c992/machine-api-operator/0.log" Mar 09 14:22:40 crc kubenswrapper[4723]: I0309 14:22:40.493832 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-s7s5g_95099130-ff5e-4bed-9f9d-b53820c77540/cert-manager-controller/0.log" Mar 09 14:22:40 crc kubenswrapper[4723]: I0309 14:22:40.646111 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-b9zhr_80ed3d36-6e36-4f64-8c40-62b445173079/cert-manager-cainjector/0.log" Mar 09 14:22:40 crc kubenswrapper[4723]: I0309 14:22:40.821369 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-6wj4n_7b820c49-0780-4d8d-a069-6cecf6ee0f1e/cert-manager-webhook/0.log" Mar 09 14:22:55 crc kubenswrapper[4723]: I0309 14:22:55.548556 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-8nj2b_dbd14494-7dd0-4807-9980-024d38f263f8/nmstate-console-plugin/0.log" Mar 09 14:22:55 crc kubenswrapper[4723]: I0309 14:22:55.855005 4723 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-cfc82_f40e96ae-e190-4d90-bb91-ce0a50b528a0/nmstate-handler/0.log" Mar 09 14:22:55 crc kubenswrapper[4723]: I0309 14:22:55.920229 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-cr6kv_d467b1e5-db4c-4066-8686-9626d2fd19af/kube-rbac-proxy/0.log" Mar 09 14:22:56 crc kubenswrapper[4723]: I0309 14:22:56.077319 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-plh7h_3c8f1518-efa2-4f99-ad51-e3c754e2b244/nmstate-operator/0.log" Mar 09 14:22:56 crc kubenswrapper[4723]: I0309 14:22:56.106080 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-cr6kv_d467b1e5-db4c-4066-8686-9626d2fd19af/nmstate-metrics/0.log" Mar 09 14:22:56 crc kubenswrapper[4723]: I0309 14:22:56.290154 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-msbbv_3b47483e-69de-403b-ab71-5c6665c0a36d/nmstate-webhook/0.log" Mar 09 14:23:12 crc kubenswrapper[4723]: I0309 14:23:12.335309 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-856bf85654-nsk4x_a4427e9d-2cc9-4cec-acf7-7bbcc1c91582/kube-rbac-proxy/0.log" Mar 09 14:23:12 crc kubenswrapper[4723]: I0309 14:23:12.389545 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-856bf85654-nsk4x_a4427e9d-2cc9-4cec-acf7-7bbcc1c91582/manager/0.log" Mar 09 14:23:28 crc kubenswrapper[4723]: I0309 14:23:28.001030 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-26ngl_bcc48a4f-2d0e-4fb9-98d7-af5958403a01/prometheus-operator/0.log" Mar 09 14:23:28 crc kubenswrapper[4723]: I0309 14:23:28.385394 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j_441fc6d3-ed2e-44b6-9e0d-d1925412eb23/prometheus-operator-admission-webhook/0.log" Mar 09 14:23:28 crc kubenswrapper[4723]: I0309 14:23:28.443396 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x_b3774887-0abb-4692-a856-fb86baa11ba6/prometheus-operator-admission-webhook/0.log" Mar 09 14:23:28 crc kubenswrapper[4723]: I0309 14:23:28.627621 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-2nqwq_2bc0446d-1f37-4214-bd0a-0f7c64f844a8/operator/1.log" Mar 09 14:23:28 crc kubenswrapper[4723]: I0309 14:23:28.664206 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-2nqwq_2bc0446d-1f37-4214-bd0a-0f7c64f844a8/operator/0.log" Mar 09 14:23:28 crc kubenswrapper[4723]: I0309 14:23:28.840025 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-fwcs7_f19d74e0-826f-47c6-80bc-d82478a56657/observability-ui-dashboards/0.log" Mar 09 14:23:28 crc kubenswrapper[4723]: I0309 14:23:28.890903 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-25fp4_5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6/perses-operator/0.log" Mar 09 14:23:29 crc kubenswrapper[4723]: I0309 14:23:29.321367 4723 scope.go:117] "RemoveContainer" 
containerID="5e5f8e737f5804dbd716d4de3423a5591e0428c02cba590bfa21c25feb4c3ed8" Mar 09 14:23:46 crc kubenswrapper[4723]: I0309 14:23:46.390129 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-c769fd969-hr28t_f671edd2-126e-4037-b17c-0d707e2a01e3/cluster-logging-operator/0.log" Mar 09 14:23:46 crc kubenswrapper[4723]: I0309 14:23:46.637659 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-xkssk_fa952b22-aa73-49cf-b851-59e7c93de305/collector/0.log" Mar 09 14:23:46 crc kubenswrapper[4723]: I0309 14:23:46.753784 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_a3a44cb8-3d3a-4462-b7fb-ae571ad70a70/loki-compactor/0.log" Mar 09 14:23:46 crc kubenswrapper[4723]: I0309 14:23:46.939376 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-5d5548c9f5-zg9mx_04edbd9e-fd1b-4346-97ce-adfb011720a4/loki-distributor/0.log" Mar 09 14:23:47 crc kubenswrapper[4723]: I0309 14:23:47.028082 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-867fb59d66-2pwh2_0bd030fd-cf38-4403-971f-4170fdc71bb0/gateway/0.log" Mar 09 14:23:47 crc kubenswrapper[4723]: I0309 14:23:47.110686 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-867fb59d66-2pwh2_0bd030fd-cf38-4403-971f-4170fdc71bb0/opa/0.log" Mar 09 14:23:47 crc kubenswrapper[4723]: I0309 14:23:47.253663 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-867fb59d66-pxpr6_3dcae42d-f05a-41f1-9d6a-11ccb28eb379/gateway/0.log" Mar 09 14:23:47 crc kubenswrapper[4723]: I0309 14:23:47.295749 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-867fb59d66-pxpr6_3dcae42d-f05a-41f1-9d6a-11ccb28eb379/opa/0.log" Mar 09 14:23:47 crc kubenswrapper[4723]: I0309 14:23:47.434885 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_e2b63e12-eaaf-47df-93c5-cbd7effb4124/loki-index-gateway/0.log" Mar 09 14:23:47 crc kubenswrapper[4723]: I0309 14:23:47.604941 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_48261310-f664-41dd-9fbd-dd5a7bfc11e9/loki-ingester/0.log" Mar 09 14:23:47 crc kubenswrapper[4723]: I0309 14:23:47.655076 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-76bf7b6d45-2d54b_8b9cdd14-6347-4701-9825-1ced6362cd8c/loki-querier/0.log" Mar 09 14:23:47 crc kubenswrapper[4723]: I0309 14:23:47.819147 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-6d6859c548-4mdnv_9cd1997b-cced-41c1-8a27-77321ffc48ae/loki-query-frontend/0.log" Mar 09 14:24:00 crc kubenswrapper[4723]: I0309 14:24:00.178753 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551104-h5bhz"] Mar 09 14:24:00 crc kubenswrapper[4723]: E0309 14:24:00.179889 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f56ac0-e521-4734-bbc5-94a35f0c1ae6" containerName="oc" Mar 09 14:24:00 crc kubenswrapper[4723]: I0309 14:24:00.179910 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f56ac0-e521-4734-bbc5-94a35f0c1ae6" containerName="oc" Mar 09 14:24:00 crc kubenswrapper[4723]: I0309 14:24:00.180275 4723 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="52f56ac0-e521-4734-bbc5-94a35f0c1ae6" containerName="oc" Mar 09 14:24:00 crc kubenswrapper[4723]: I0309 14:24:00.181509 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-h5bhz" Mar 09 14:24:00 crc kubenswrapper[4723]: I0309 14:24:00.183746 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:24:00 crc kubenswrapper[4723]: I0309 14:24:00.183861 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 14:24:00 crc kubenswrapper[4723]: I0309 14:24:00.184684 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:24:00 crc kubenswrapper[4723]: I0309 14:24:00.198493 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-h5bhz"] Mar 09 14:24:00 crc kubenswrapper[4723]: I0309 14:24:00.279031 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmk6r\" (UniqueName: \"kubernetes.io/projected/6d01f5e3-e2bb-47ac-ae93-5588acc2c783-kube-api-access-vmk6r\") pod \"auto-csr-approver-29551104-h5bhz\" (UID: \"6d01f5e3-e2bb-47ac-ae93-5588acc2c783\") " pod="openshift-infra/auto-csr-approver-29551104-h5bhz" Mar 09 14:24:00 crc kubenswrapper[4723]: I0309 14:24:00.380885 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmk6r\" (UniqueName: \"kubernetes.io/projected/6d01f5e3-e2bb-47ac-ae93-5588acc2c783-kube-api-access-vmk6r\") pod \"auto-csr-approver-29551104-h5bhz\" (UID: \"6d01f5e3-e2bb-47ac-ae93-5588acc2c783\") " pod="openshift-infra/auto-csr-approver-29551104-h5bhz" Mar 09 14:24:00 crc kubenswrapper[4723]: I0309 14:24:00.697384 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmk6r\" (UniqueName: \"kubernetes.io/projected/6d01f5e3-e2bb-47ac-ae93-5588acc2c783-kube-api-access-vmk6r\") pod \"auto-csr-approver-29551104-h5bhz\" (UID: \"6d01f5e3-e2bb-47ac-ae93-5588acc2c783\") " pod="openshift-infra/auto-csr-approver-29551104-h5bhz" Mar 09 14:24:00 crc kubenswrapper[4723]: I0309 14:24:00.814063 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-h5bhz" Mar 09 14:24:01 crc kubenswrapper[4723]: I0309 14:24:01.332811 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-h5bhz"] Mar 09 14:24:01 crc kubenswrapper[4723]: I0309 14:24:01.758231 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551104-h5bhz" event={"ID":"6d01f5e3-e2bb-47ac-ae93-5588acc2c783","Type":"ContainerStarted","Data":"b6f76f75eb61069a01ff752ea6434d83943e8d2a3e7786dd9b388daf13cd7edf"} Mar 09 14:24:03 crc kubenswrapper[4723]: I0309 14:24:03.782141 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551104-h5bhz" event={"ID":"6d01f5e3-e2bb-47ac-ae93-5588acc2c783","Type":"ContainerStarted","Data":"540c229dfc51d5227ad146e94b46462c0af7277540b21fc180e4c43f62c21f77"} Mar 09 14:24:03 crc kubenswrapper[4723]: I0309 14:24:03.806272 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551104-h5bhz" podStartSLOduration=2.369961637 podStartE2EDuration="3.806245648s" podCreationTimestamp="2026-03-09 14:24:00 +0000 UTC" firstStartedPulling="2026-03-09 14:24:01.345498723 +0000 UTC m=+5115.359966263" lastFinishedPulling="2026-03-09 14:24:02.781782744 +0000 UTC m=+5116.796250274" observedRunningTime="2026-03-09 14:24:03.798138002 +0000 UTC m=+5117.812605542" watchObservedRunningTime="2026-03-09 14:24:03.806245648 +0000 UTC m=+5117.820713188" Mar 09 14:24:03 crc kubenswrapper[4723]: I0309 14:24:03.947781 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:24:03 crc kubenswrapper[4723]: I0309 14:24:03.947899 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:24:04 crc kubenswrapper[4723]: I0309 14:24:04.804017 4723 generic.go:334] "Generic (PLEG): container finished" podID="6d01f5e3-e2bb-47ac-ae93-5588acc2c783" containerID="540c229dfc51d5227ad146e94b46462c0af7277540b21fc180e4c43f62c21f77" exitCode=0 Mar 09 14:24:04 crc kubenswrapper[4723]: I0309 14:24:04.804137 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551104-h5bhz" event={"ID":"6d01f5e3-e2bb-47ac-ae93-5588acc2c783","Type":"ContainerDied","Data":"540c229dfc51d5227ad146e94b46462c0af7277540b21fc180e4c43f62c21f77"} Mar 09 14:24:06 crc kubenswrapper[4723]: I0309 14:24:06.254993 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-h5bhz" Mar 09 14:24:06 crc kubenswrapper[4723]: I0309 14:24:06.332552 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmk6r\" (UniqueName: \"kubernetes.io/projected/6d01f5e3-e2bb-47ac-ae93-5588acc2c783-kube-api-access-vmk6r\") pod \"6d01f5e3-e2bb-47ac-ae93-5588acc2c783\" (UID: \"6d01f5e3-e2bb-47ac-ae93-5588acc2c783\") " Mar 09 14:24:06 crc kubenswrapper[4723]: I0309 14:24:06.346315 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d01f5e3-e2bb-47ac-ae93-5588acc2c783-kube-api-access-vmk6r" (OuterVolumeSpecName: "kube-api-access-vmk6r") pod "6d01f5e3-e2bb-47ac-ae93-5588acc2c783" (UID: "6d01f5e3-e2bb-47ac-ae93-5588acc2c783"). InnerVolumeSpecName "kube-api-access-vmk6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:06 crc kubenswrapper[4723]: I0309 14:24:06.434857 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmk6r\" (UniqueName: \"kubernetes.io/projected/6d01f5e3-e2bb-47ac-ae93-5588acc2c783-kube-api-access-vmk6r\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:06 crc kubenswrapper[4723]: I0309 14:24:06.468360 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-7hbxv_351b987c-4b9a-4bf6-8832-a0504c9c16ed/kube-rbac-proxy/0.log" Mar 09 14:24:06 crc kubenswrapper[4723]: I0309 14:24:06.590371 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-7hbxv_351b987c-4b9a-4bf6-8832-a0504c9c16ed/controller/0.log" Mar 09 14:24:06 crc kubenswrapper[4723]: I0309 14:24:06.744089 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/cp-frr-files/0.log" Mar 09 14:24:06 crc kubenswrapper[4723]: I0309 14:24:06.825487 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551104-h5bhz" event={"ID":"6d01f5e3-e2bb-47ac-ae93-5588acc2c783","Type":"ContainerDied","Data":"b6f76f75eb61069a01ff752ea6434d83943e8d2a3e7786dd9b388daf13cd7edf"} Mar 09 14:24:06 crc kubenswrapper[4723]: I0309 14:24:06.825536 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6f76f75eb61069a01ff752ea6434d83943e8d2a3e7786dd9b388daf13cd7edf" Mar 09 14:24:06 crc kubenswrapper[4723]: I0309 14:24:06.825564 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-h5bhz"
Mar 09 14:24:06 crc kubenswrapper[4723]: I0309 14:24:06.937438 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-zgddk"]
Mar 09 14:24:06 crc kubenswrapper[4723]: I0309 14:24:06.952128 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-zgddk"]
Mar 09 14:24:06 crc kubenswrapper[4723]: I0309 14:24:06.994367 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/cp-frr-files/0.log"
Mar 09 14:24:07 crc kubenswrapper[4723]: I0309 14:24:07.048452 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/cp-metrics/0.log"
Mar 09 14:24:07 crc kubenswrapper[4723]: I0309 14:24:07.060344 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/cp-reloader/0.log"
Mar 09 14:24:07 crc kubenswrapper[4723]: I0309 14:24:07.077529 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/cp-reloader/0.log"
Mar 09 14:24:07 crc kubenswrapper[4723]: I0309 14:24:07.263496 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/cp-reloader/0.log"
Mar 09 14:24:07 crc kubenswrapper[4723]: I0309 14:24:07.296584 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/cp-frr-files/0.log"
Mar 09 14:24:07 crc kubenswrapper[4723]: I0309 14:24:07.313411 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/cp-metrics/0.log"
Mar 09 14:24:07 crc kubenswrapper[4723]: I0309 14:24:07.358369 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/cp-metrics/0.log"
Mar 09 14:24:07 crc kubenswrapper[4723]: I0309 14:24:07.493893 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/cp-frr-files/0.log"
Mar 09 14:24:07 crc kubenswrapper[4723]: I0309 14:24:07.540201 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/cp-reloader/0.log"
Mar 09 14:24:07 crc kubenswrapper[4723]: I0309 14:24:07.580427 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/cp-metrics/0.log"
Mar 09 14:24:07 crc kubenswrapper[4723]: I0309 14:24:07.609578 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/controller/0.log"
Mar 09 14:24:07 crc kubenswrapper[4723]: I0309 14:24:07.859461 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/frr-metrics/0.log"
Mar 09 14:24:07 crc kubenswrapper[4723]: I0309 14:24:07.868757 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/kube-rbac-proxy/0.log"
Mar 09 14:24:07 crc kubenswrapper[4723]: I0309 14:24:07.915291 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/kube-rbac-proxy-frr/0.log"
Mar 09 14:24:08 crc kubenswrapper[4723]: I0309 14:24:08.101122 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/reloader/0.log"
Mar 09 14:24:08 crc kubenswrapper[4723]: I0309 14:24:08.601698 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-t8zw4_3fab2f82-df4b-417b-8188-0c4f455df30c/frr-k8s-webhook-server/0.log"
Mar 09 14:24:08 crc kubenswrapper[4723]: I0309 14:24:08.886308 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6d8b459b8b-tm8sv_4fc91e18-da85-44c6-96c7-2c15123b9980/manager/0.log"
Mar 09 14:24:08 crc kubenswrapper[4723]: I0309 14:24:08.920743 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1696b106-9b5a-4120-8947-8e7fee59e6a3" path="/var/lib/kubelet/pods/1696b106-9b5a-4120-8947-8e7fee59e6a3/volumes"
Mar 09 14:24:09 crc kubenswrapper[4723]: I0309 14:24:09.112891 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-997cfb689-c8857_881230db-85c7-4159-b1dd-f537ed6baece/webhook-server/0.log"
Mar 09 14:24:09 crc kubenswrapper[4723]: I0309 14:24:09.154635 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-997cfb689-c8857_881230db-85c7-4159-b1dd-f537ed6baece/webhook-server/1.log"
Mar 09 14:24:09 crc kubenswrapper[4723]: I0309 14:24:09.386510 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zpfhd_b8287b9e-89fc-417d-b98b-564e6acdbb25/kube-rbac-proxy/0.log"
Mar 09 14:24:09 crc kubenswrapper[4723]: I0309 14:24:09.798025 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jqk9s_54ae40b9-1ce6-4b2a-b878-cbf15a49b1c4/frr/0.log"
Mar 09 14:24:10 crc kubenswrapper[4723]: I0309 14:24:10.136297 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zpfhd_b8287b9e-89fc-417d-b98b-564e6acdbb25/speaker/0.log"
Mar 09 14:24:26 crc kubenswrapper[4723]: I0309 14:24:26.061585 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p_f5db2d82-fe45-4c4a-a3b2-8addddad74fe/util/0.log"
Mar 09 14:24:26 crc kubenswrapper[4723]: I0309 14:24:26.189005 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p_f5db2d82-fe45-4c4a-a3b2-8addddad74fe/util/0.log"
Mar 09 14:24:26 crc kubenswrapper[4723]: I0309 14:24:26.189351 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p_f5db2d82-fe45-4c4a-a3b2-8addddad74fe/pull/0.log"
Mar 09 14:24:26 crc kubenswrapper[4723]: I0309 14:24:26.245122 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p_f5db2d82-fe45-4c4a-a3b2-8addddad74fe/pull/0.log"
Mar 09 14:24:26 crc kubenswrapper[4723]: I0309 14:24:26.493120 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p_f5db2d82-fe45-4c4a-a3b2-8addddad74fe/util/0.log"
Mar 09 14:24:26 crc kubenswrapper[4723]: I0309 14:24:26.493676 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p_f5db2d82-fe45-4c4a-a3b2-8addddad74fe/extract/0.log"
Mar 09 14:24:26 crc kubenswrapper[4723]: I0309 14:24:26.520858 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82pcl7p_f5db2d82-fe45-4c4a-a3b2-8addddad74fe/pull/0.log"
Mar 09 14:24:26 crc kubenswrapper[4723]: I0309 14:24:26.709625 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492_73775981-b81a-47d9-b93e-0ecf9ba86890/util/0.log"
Mar 09 14:24:26 crc kubenswrapper[4723]: I0309 14:24:26.941736 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492_73775981-b81a-47d9-b93e-0ecf9ba86890/pull/0.log"
Mar 09 14:24:27 crc kubenswrapper[4723]: I0309 14:24:27.001987 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492_73775981-b81a-47d9-b93e-0ecf9ba86890/util/0.log"
Mar 09 14:24:27 crc kubenswrapper[4723]: I0309 14:24:27.010541 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492_73775981-b81a-47d9-b93e-0ecf9ba86890/pull/0.log"
Mar 09 14:24:27 crc kubenswrapper[4723]: I0309 14:24:27.335327 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492_73775981-b81a-47d9-b93e-0ecf9ba86890/pull/0.log"
Mar 09 14:24:27 crc kubenswrapper[4723]: I0309 14:24:27.358142 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492_73775981-b81a-47d9-b93e-0ecf9ba86890/util/0.log"
Mar 09 14:24:27 crc kubenswrapper[4723]: I0309 14:24:27.366023 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19fs492_73775981-b81a-47d9-b93e-0ecf9ba86890/extract/0.log"
Mar 09 14:24:27 crc kubenswrapper[4723]: I0309 14:24:27.539149 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p_618feaa1-6349-4b7e-b344-f750770dc970/util/0.log"
Mar 09 14:24:27 crc kubenswrapper[4723]: I0309 14:24:27.770721 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p_618feaa1-6349-4b7e-b344-f750770dc970/pull/0.log"
Mar 09 14:24:27 crc kubenswrapper[4723]: I0309 14:24:27.798067 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p_618feaa1-6349-4b7e-b344-f750770dc970/pull/0.log"
Mar 09 14:24:27 crc kubenswrapper[4723]: I0309 14:24:27.864237 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p_618feaa1-6349-4b7e-b344-f750770dc970/util/0.log"
Mar 09 14:24:28 crc kubenswrapper[4723]: I0309 14:24:28.242236 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p_618feaa1-6349-4b7e-b344-f750770dc970/util/0.log"
Mar 09 14:24:28 crc kubenswrapper[4723]: I0309 14:24:28.465111 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p_618feaa1-6349-4b7e-b344-f750770dc970/extract/0.log"
Mar 09 14:24:28 crc kubenswrapper[4723]: I0309 14:24:28.549078 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z6m5p_618feaa1-6349-4b7e-b344-f750770dc970/pull/0.log"
Mar 09 14:24:28 crc kubenswrapper[4723]: I0309 14:24:28.779529 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wn4hg_901259b6-1c9d-49ca-9c13-4626d65c68fa/extract-utilities/0.log"
Mar 09 14:24:29 crc kubenswrapper[4723]: I0309 14:24:29.003364 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wn4hg_901259b6-1c9d-49ca-9c13-4626d65c68fa/extract-content/0.log"
Mar 09 14:24:29 crc kubenswrapper[4723]: I0309 14:24:29.066669 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wn4hg_901259b6-1c9d-49ca-9c13-4626d65c68fa/extract-utilities/0.log"
Mar 09 14:24:29 crc kubenswrapper[4723]: I0309 14:24:29.092848 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wn4hg_901259b6-1c9d-49ca-9c13-4626d65c68fa/extract-content/0.log"
Mar 09 14:24:29 crc kubenswrapper[4723]: I0309 14:24:29.282490 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wn4hg_901259b6-1c9d-49ca-9c13-4626d65c68fa/extract-utilities/0.log"
Mar 09 14:24:29 crc kubenswrapper[4723]: I0309 14:24:29.286474 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wn4hg_901259b6-1c9d-49ca-9c13-4626d65c68fa/extract-content/0.log"
Mar 09 14:24:29 crc kubenswrapper[4723]: I0309 14:24:29.445654 4723 scope.go:117] "RemoveContainer" containerID="540c60f05a86c15480b94448f1bdc8b8c6831852157e05460626bf77cce770ba"
Mar 09 14:24:29 crc kubenswrapper[4723]: I0309 14:24:29.484246 4723 scope.go:117] "RemoveContainer" containerID="ca18a68d6c8b5a86478cb47dc9c47a4e69186b03115de9b13999f06750a64bbb"
Mar 09 14:24:29 crc kubenswrapper[4723]: I0309 14:24:29.557219 4723 scope.go:117] "RemoveContainer" containerID="ff3809c0aac1d1d5a4c8fbd82d0a88209e26a82584fb1a1733c2e203db6bf611"
Mar 09 14:24:29 crc kubenswrapper[4723]: I0309 14:24:29.619629 4723 scope.go:117] "RemoveContainer" containerID="430c64e6b99785ff06f4ea38f01c21433b055364f12a885f8041ae9be7a3f00c"
Mar 09 14:24:29 crc kubenswrapper[4723]: I0309 14:24:29.632183 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h7rg4_ce90ce22-0632-4cec-bb6e-4c85b78b1833/extract-utilities/0.log"
Mar 09 14:24:29 crc kubenswrapper[4723]: I0309 14:24:29.990128 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h7rg4_ce90ce22-0632-4cec-bb6e-4c85b78b1833/extract-content/0.log"
Mar 09 14:24:29 crc kubenswrapper[4723]: I0309 14:24:29.990184 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h7rg4_ce90ce22-0632-4cec-bb6e-4c85b78b1833/extract-utilities/0.log"
Mar 09 14:24:30 crc kubenswrapper[4723]: I0309 14:24:30.023583 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h7rg4_ce90ce22-0632-4cec-bb6e-4c85b78b1833/extract-content/0.log"
Mar 09 14:24:30 crc kubenswrapper[4723]: I0309 14:24:30.220598 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h7rg4_ce90ce22-0632-4cec-bb6e-4c85b78b1833/extract-utilities/0.log"
Mar 09 14:24:30 crc kubenswrapper[4723]: I0309 14:24:30.257208 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wn4hg_901259b6-1c9d-49ca-9c13-4626d65c68fa/registry-server/0.log"
Mar 09 14:24:30 crc kubenswrapper[4723]: I0309 14:24:30.422068 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h7rg4_ce90ce22-0632-4cec-bb6e-4c85b78b1833/extract-content/0.log"
Mar 09 14:24:30 crc kubenswrapper[4723]: I0309 14:24:30.564298 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk_dec42794-de2a-4e6d-9e9d-fe400f4052f3/util/0.log"
Mar 09 14:24:30 crc kubenswrapper[4723]: I0309 14:24:30.758626 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk_dec42794-de2a-4e6d-9e9d-fe400f4052f3/pull/0.log"
Mar 09 14:24:30 crc kubenswrapper[4723]: I0309 14:24:30.810375 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk_dec42794-de2a-4e6d-9e9d-fe400f4052f3/util/0.log"
Mar 09 14:24:30 crc kubenswrapper[4723]: I0309 14:24:30.881142 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk_dec42794-de2a-4e6d-9e9d-fe400f4052f3/pull/0.log"
Mar 09 14:24:31 crc kubenswrapper[4723]: I0309 14:24:31.156355 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk_dec42794-de2a-4e6d-9e9d-fe400f4052f3/util/0.log"
Mar 09 14:24:31 crc kubenswrapper[4723]: I0309 14:24:31.203770 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-h7rg4_ce90ce22-0632-4cec-bb6e-4c85b78b1833/registry-server/0.log"
Mar 09 14:24:31 crc kubenswrapper[4723]: I0309 14:24:31.231806 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk_dec42794-de2a-4e6d-9e9d-fe400f4052f3/pull/0.log"
Mar 09 14:24:31 crc kubenswrapper[4723]: I0309 14:24:31.240582 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4xz7tk_dec42794-de2a-4e6d-9e9d-fe400f4052f3/extract/0.log"
Mar 09 14:24:31 crc kubenswrapper[4723]: I0309 14:24:31.364142 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb_995955a2-1d3d-4705-826b-d61bf24a1f2d/util/0.log"
Mar 09 14:24:31 crc kubenswrapper[4723]: I0309 14:24:31.569445 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb_995955a2-1d3d-4705-826b-d61bf24a1f2d/pull/0.log"
Mar 09 14:24:31 crc kubenswrapper[4723]: I0309 14:24:31.587484 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb_995955a2-1d3d-4705-826b-d61bf24a1f2d/util/0.log"
Mar 09 14:24:31 crc kubenswrapper[4723]: I0309 14:24:31.602300 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb_995955a2-1d3d-4705-826b-d61bf24a1f2d/pull/0.log"
Mar 09 14:24:31 crc kubenswrapper[4723]: I0309 14:24:31.746005 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb_995955a2-1d3d-4705-826b-d61bf24a1f2d/util/0.log"
Mar 09 14:24:31 crc kubenswrapper[4723]: I0309 14:24:31.780395 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb_995955a2-1d3d-4705-826b-d61bf24a1f2d/extract/0.log"
Mar 09 14:24:31 crc kubenswrapper[4723]: I0309 14:24:31.783830 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989nbkvb_995955a2-1d3d-4705-826b-d61bf24a1f2d/pull/0.log"
Mar 09 14:24:31 crc kubenswrapper[4723]: I0309 14:24:31.826994 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7nz4v_0c45ecd0-a916-4ef0-80aa-cfe88212d0ed/marketplace-operator/0.log"
Mar 09 14:24:32 crc kubenswrapper[4723]: I0309 14:24:32.020270 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vwn46_98195455-05c0-408c-b3e2-728b991eee12/extract-utilities/0.log"
Mar 09 14:24:32 crc kubenswrapper[4723]: I0309 14:24:32.201617 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vwn46_98195455-05c0-408c-b3e2-728b991eee12/extract-content/0.log"
Mar 09 14:24:32 crc kubenswrapper[4723]: I0309 14:24:32.215711 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vwn46_98195455-05c0-408c-b3e2-728b991eee12/extract-utilities/0.log"
Mar 09 14:24:32 crc kubenswrapper[4723]: I0309 14:24:32.235577 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vwn46_98195455-05c0-408c-b3e2-728b991eee12/extract-content/0.log"
Mar 09 14:24:32 crc kubenswrapper[4723]: I0309 14:24:32.468578 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vwn46_98195455-05c0-408c-b3e2-728b991eee12/extract-utilities/0.log"
Mar 09 14:24:32 crc kubenswrapper[4723]: I0309 14:24:32.490181 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vwn46_98195455-05c0-408c-b3e2-728b991eee12/extract-content/0.log"
Mar 09 14:24:32 crc kubenswrapper[4723]: I0309 14:24:32.674139 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vwn46_98195455-05c0-408c-b3e2-728b991eee12/registry-server/0.log"
Mar 09 14:24:32 crc kubenswrapper[4723]: I0309 14:24:32.695992 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9xzz_0b462749-ee4f-4661-8a3a-06e721ef51a8/extract-utilities/0.log"
Mar 09 14:24:32 crc kubenswrapper[4723]: I0309 14:24:32.791975 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9xzz_0b462749-ee4f-4661-8a3a-06e721ef51a8/extract-utilities/0.log"
Mar 09 14:24:32 crc kubenswrapper[4723]: I0309 14:24:32.835771 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9xzz_0b462749-ee4f-4661-8a3a-06e721ef51a8/extract-content/0.log"
Mar 09 14:24:32 crc kubenswrapper[4723]: I0309 14:24:32.874837 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9xzz_0b462749-ee4f-4661-8a3a-06e721ef51a8/extract-content/0.log"
Mar 09 14:24:33 crc kubenswrapper[4723]: I0309 14:24:33.058310 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9xzz_0b462749-ee4f-4661-8a3a-06e721ef51a8/extract-content/0.log"
Mar 09 14:24:33 crc kubenswrapper[4723]: I0309 14:24:33.062912 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9xzz_0b462749-ee4f-4661-8a3a-06e721ef51a8/extract-utilities/0.log"
Mar 09 14:24:33 crc kubenswrapper[4723]: I0309 14:24:33.946702 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:24:33 crc kubenswrapper[4723]: I0309 14:24:33.947011 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:24:34 crc kubenswrapper[4723]: I0309 14:24:34.329962 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9xzz_0b462749-ee4f-4661-8a3a-06e721ef51a8/registry-server/0.log"
Mar 09 14:24:48 crc kubenswrapper[4723]: I0309 14:24:48.088893 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77c6585f54-h6x8j_441fc6d3-ed2e-44b6-9e0d-d1925412eb23/prometheus-operator-admission-webhook/0.log"
Mar 09 14:24:48 crc kubenswrapper[4723]: I0309 14:24:48.090335 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-26ngl_bcc48a4f-2d0e-4fb9-98d7-af5958403a01/prometheus-operator/0.log"
Mar 09 14:24:48 crc kubenswrapper[4723]: I0309 14:24:48.100360 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77c6585f54-vwh7x_b3774887-0abb-4692-a856-fb86baa11ba6/prometheus-operator-admission-webhook/0.log"
Mar 09 14:24:48 crc kubenswrapper[4723]: I0309 14:24:48.361637 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-2nqwq_2bc0446d-1f37-4214-bd0a-0f7c64f844a8/operator/0.log"
Mar 09 14:24:48 crc kubenswrapper[4723]: I0309 14:24:48.389437 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-25fp4_5d6ca68e-2044-42ed-9ee9-41bcf5aaf8e6/perses-operator/0.log"
Mar 09 14:24:48 crc kubenswrapper[4723]: I0309 14:24:48.444573 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-2nqwq_2bc0446d-1f37-4214-bd0a-0f7c64f844a8/operator/1.log"
Mar 09 14:24:48 crc kubenswrapper[4723]: I0309 14:24:48.462088 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-fwcs7_f19d74e0-826f-47c6-80bc-d82478a56657/observability-ui-dashboards/0.log"
Mar 09 14:25:01 crc kubenswrapper[4723]: I0309 14:25:01.951068 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-856bf85654-nsk4x_a4427e9d-2cc9-4cec-acf7-7bbcc1c91582/kube-rbac-proxy/0.log"
Mar 09 14:25:02 crc kubenswrapper[4723]: I0309 14:25:02.090787 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-856bf85654-nsk4x_a4427e9d-2cc9-4cec-acf7-7bbcc1c91582/manager/0.log"
Mar 09 14:25:03 crc kubenswrapper[4723]: I0309 14:25:03.947056 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:25:03 crc kubenswrapper[4723]: I0309 14:25:03.947404 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:25:03 crc kubenswrapper[4723]: I0309 14:25:03.947458 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2"
Mar 09 14:25:03 crc kubenswrapper[4723]: I0309 14:25:03.949079 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a1d1cb15a27887ef68b3d06baaa73779b78a555763876a2da46127c36e07be0e"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 14:25:03 crc kubenswrapper[4723]: I0309 14:25:03.949166 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" containerID="cri-o://a1d1cb15a27887ef68b3d06baaa73779b78a555763876a2da46127c36e07be0e" gracePeriod=600
Mar 09 14:25:04 crc kubenswrapper[4723]: I0309 14:25:04.466661 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="a1d1cb15a27887ef68b3d06baaa73779b78a555763876a2da46127c36e07be0e" exitCode=0
Mar 09 14:25:04 crc kubenswrapper[4723]: I0309 14:25:04.466739 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"a1d1cb15a27887ef68b3d06baaa73779b78a555763876a2da46127c36e07be0e"}
Mar 09 14:25:04 crc kubenswrapper[4723]: I0309 14:25:04.467251 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b"}
Mar 09 14:25:04 crc kubenswrapper[4723]: I0309 14:25:04.467284 4723 scope.go:117] "RemoveContainer" containerID="b5af8cf4e6b647cc3921b22101db6b30f5e74eefccad33aef3904a6fa1114186"
Mar 09 14:25:29 crc kubenswrapper[4723]: I0309 14:25:29.804564 4723 scope.go:117] "RemoveContainer" containerID="795541e761fac29b933322c3bbef5c92e0addb31b91d534e1c430afd4dda5cbb"
Mar 09 14:25:32 crc kubenswrapper[4723]: E0309 14:25:32.802216 4723 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.129:56968->38.102.83.129:35705: write tcp 38.102.83.129:56968->38.102.83.129:35705: write: connection reset by peer
Mar 09 14:26:00 crc kubenswrapper[4723]: I0309 14:26:00.152464 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551106-bj2lm"]
Mar 09 14:26:00 crc kubenswrapper[4723]: E0309 14:26:00.153597 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d01f5e3-e2bb-47ac-ae93-5588acc2c783" containerName="oc"
Mar 09 14:26:00 crc kubenswrapper[4723]: I0309 14:26:00.153613 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d01f5e3-e2bb-47ac-ae93-5588acc2c783" containerName="oc"
Mar 09 14:26:00 crc kubenswrapper[4723]: I0309 14:26:00.153868 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d01f5e3-e2bb-47ac-ae93-5588acc2c783" containerName="oc"
Mar 09 14:26:00 crc kubenswrapper[4723]: I0309 14:26:00.155273 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-bj2lm"
Mar 09 14:26:00 crc kubenswrapper[4723]: I0309 14:26:00.164301 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-bj2lm"]
Mar 09 14:26:00 crc kubenswrapper[4723]: I0309 14:26:00.200383 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:26:00 crc kubenswrapper[4723]: I0309 14:26:00.200729 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:26:00 crc kubenswrapper[4723]: I0309 14:26:00.201103 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 14:26:00 crc kubenswrapper[4723]: I0309 14:26:00.203135 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcjqn\" (UniqueName: \"kubernetes.io/projected/8c00bf0d-d587-4583-b603-49e8fcccc58e-kube-api-access-rcjqn\") pod \"auto-csr-approver-29551106-bj2lm\" (UID: \"8c00bf0d-d587-4583-b603-49e8fcccc58e\") " pod="openshift-infra/auto-csr-approver-29551106-bj2lm"
Mar 09 14:26:00 crc kubenswrapper[4723]: I0309 14:26:00.305966 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcjqn\" (UniqueName: \"kubernetes.io/projected/8c00bf0d-d587-4583-b603-49e8fcccc58e-kube-api-access-rcjqn\") pod \"auto-csr-approver-29551106-bj2lm\" (UID: \"8c00bf0d-d587-4583-b603-49e8fcccc58e\") " pod="openshift-infra/auto-csr-approver-29551106-bj2lm"
Mar 09 14:26:00 crc kubenswrapper[4723]: I0309 14:26:00.348271 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcjqn\" (UniqueName: \"kubernetes.io/projected/8c00bf0d-d587-4583-b603-49e8fcccc58e-kube-api-access-rcjqn\") pod \"auto-csr-approver-29551106-bj2lm\" (UID: \"8c00bf0d-d587-4583-b603-49e8fcccc58e\") " pod="openshift-infra/auto-csr-approver-29551106-bj2lm"
Mar 09 14:26:00 crc kubenswrapper[4723]: I0309 14:26:00.542001 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-bj2lm"
Mar 09 14:26:01 crc kubenswrapper[4723]: I0309 14:26:01.025118 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-bj2lm"]
Mar 09 14:26:01 crc kubenswrapper[4723]: I0309 14:26:01.226695 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-bj2lm" event={"ID":"8c00bf0d-d587-4583-b603-49e8fcccc58e","Type":"ContainerStarted","Data":"570c8c688fb519d4d8b8e32c3bce11cc38de1a53b0dd93a49024be7eb1e95d08"}
Mar 09 14:26:02 crc kubenswrapper[4723]: I0309 14:26:02.240932 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-bj2lm" event={"ID":"8c00bf0d-d587-4583-b603-49e8fcccc58e","Type":"ContainerStarted","Data":"3ccfa0433835cc27e5f2ab2213b54943afe940b4eb8d74813aef950db3a4ac8c"}
Mar 09 14:26:02 crc kubenswrapper[4723]: I0309 14:26:02.260365 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551106-bj2lm" podStartSLOduration=1.39891299 podStartE2EDuration="2.260344751s" podCreationTimestamp="2026-03-09 14:26:00 +0000 UTC" firstStartedPulling="2026-03-09 14:26:01.032335587 +0000 UTC m=+5235.046803127" lastFinishedPulling="2026-03-09 14:26:01.893767348 +0000 UTC m=+5235.908234888" observedRunningTime="2026-03-09 14:26:02.252596475 +0000 UTC m=+5236.267064015" watchObservedRunningTime="2026-03-09 14:26:02.260344751 +0000 UTC m=+5236.274812291"
Mar 09 14:26:03 crc kubenswrapper[4723]: I0309 14:26:03.268620 4723 generic.go:334] "Generic (PLEG): container finished" podID="8c00bf0d-d587-4583-b603-49e8fcccc58e" containerID="3ccfa0433835cc27e5f2ab2213b54943afe940b4eb8d74813aef950db3a4ac8c" exitCode=0
Mar 09 14:26:03 crc kubenswrapper[4723]: I0309 14:26:03.268715 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-bj2lm" event={"ID":"8c00bf0d-d587-4583-b603-49e8fcccc58e","Type":"ContainerDied","Data":"3ccfa0433835cc27e5f2ab2213b54943afe940b4eb8d74813aef950db3a4ac8c"}
Mar 09 14:26:04 crc kubenswrapper[4723]: I0309 14:26:04.736113 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-bj2lm"
Mar 09 14:26:04 crc kubenswrapper[4723]: I0309 14:26:04.853421 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcjqn\" (UniqueName: \"kubernetes.io/projected/8c00bf0d-d587-4583-b603-49e8fcccc58e-kube-api-access-rcjqn\") pod \"8c00bf0d-d587-4583-b603-49e8fcccc58e\" (UID: \"8c00bf0d-d587-4583-b603-49e8fcccc58e\") "
Mar 09 14:26:04 crc kubenswrapper[4723]: I0309 14:26:04.867477 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c00bf0d-d587-4583-b603-49e8fcccc58e-kube-api-access-rcjqn" (OuterVolumeSpecName: "kube-api-access-rcjqn") pod "8c00bf0d-d587-4583-b603-49e8fcccc58e" (UID: "8c00bf0d-d587-4583-b603-49e8fcccc58e"). InnerVolumeSpecName "kube-api-access-rcjqn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:26:04 crc kubenswrapper[4723]: I0309 14:26:04.957309 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcjqn\" (UniqueName: \"kubernetes.io/projected/8c00bf0d-d587-4583-b603-49e8fcccc58e-kube-api-access-rcjqn\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:05 crc kubenswrapper[4723]: I0309 14:26:05.305308 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-bj2lm" event={"ID":"8c00bf0d-d587-4583-b603-49e8fcccc58e","Type":"ContainerDied","Data":"570c8c688fb519d4d8b8e32c3bce11cc38de1a53b0dd93a49024be7eb1e95d08"}
Mar 09 14:26:05 crc kubenswrapper[4723]: I0309 14:26:05.305574 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="570c8c688fb519d4d8b8e32c3bce11cc38de1a53b0dd93a49024be7eb1e95d08"
Mar 09 14:26:05 crc kubenswrapper[4723]: I0309 14:26:05.305349 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-bj2lm"
Mar 09 14:26:05 crc kubenswrapper[4723]: I0309 14:26:05.331894 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-kdnwn"]
Mar 09 14:26:05 crc kubenswrapper[4723]: I0309 14:26:05.348078 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-kdnwn"]
Mar 09 14:26:06 crc kubenswrapper[4723]: I0309 14:26:06.898315 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d85affa1-1885-445d-8a82-9f92136ae7f5" path="/var/lib/kubelet/pods/d85affa1-1885-445d-8a82-9f92136ae7f5/volumes"
Mar 09 14:26:29 crc kubenswrapper[4723]: I0309 14:26:29.983722 4723 scope.go:117] "RemoveContainer" containerID="c9961d6fad91d1e5bbdb39fc69602c78a076f16d96afd54dd67e4d34b4823f47"
Mar 09 14:26:30 crc kubenswrapper[4723]: I0309 14:26:30.052650 4723 scope.go:117] "RemoveContainer" containerID="817078669c2a0e70fb168898c8884a4887147fd05fcc918f871e2c377bdcf9a1"
Mar 09 14:27:09 crc kubenswrapper[4723]: I0309 14:27:09.053175 4723 generic.go:334] "Generic (PLEG): container finished" podID="9d7fd055-fcc2-4221-9372-be1ffefd23da" containerID="f11a46a1d258172a43663da9f42ca6c9a46aaa8b4ce2076967865765cf8a987d" exitCode=0
Mar 09 14:27:09 crc kubenswrapper[4723]: I0309 14:27:09.053253 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hv968/must-gather-57slw" event={"ID":"9d7fd055-fcc2-4221-9372-be1ffefd23da","Type":"ContainerDied","Data":"f11a46a1d258172a43663da9f42ca6c9a46aaa8b4ce2076967865765cf8a987d"}
Mar 09 14:27:09 crc kubenswrapper[4723]: I0309 14:27:09.054790 4723 scope.go:117] "RemoveContainer" containerID="f11a46a1d258172a43663da9f42ca6c9a46aaa8b4ce2076967865765cf8a987d"
Mar 09 14:27:09 crc kubenswrapper[4723]: I0309 14:27:09.740079 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hv968_must-gather-57slw_9d7fd055-fcc2-4221-9372-be1ffefd23da/gather/0.log"
Mar 09 14:27:17 crc kubenswrapper[4723]: I0309 14:27:17.432680 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hv968/must-gather-57slw"]
Mar 09 14:27:17 crc kubenswrapper[4723]: I0309 14:27:17.434191 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-hv968/must-gather-57slw" podUID="9d7fd055-fcc2-4221-9372-be1ffefd23da" containerName="copy" containerID="cri-o://49b28102f51eebe4de9ef7ee1adef9902c1b031bf6ea59e78f4e26b08750ef23" gracePeriod=2
Mar 09 14:27:17 crc kubenswrapper[4723]: I0309 14:27:17.490971 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hv968/must-gather-57slw"]
Mar 09 14:27:17 crc kubenswrapper[4723]: I0309 14:27:17.946302 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hv968_must-gather-57slw_9d7fd055-fcc2-4221-9372-be1ffefd23da/copy/0.log"
Mar 09 14:27:17 crc kubenswrapper[4723]: I0309 14:27:17.947119 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hv968/must-gather-57slw"
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.038344 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d7fd055-fcc2-4221-9372-be1ffefd23da-must-gather-output\") pod \"9d7fd055-fcc2-4221-9372-be1ffefd23da\" (UID: \"9d7fd055-fcc2-4221-9372-be1ffefd23da\") "
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.038442 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26bvl\" (UniqueName: \"kubernetes.io/projected/9d7fd055-fcc2-4221-9372-be1ffefd23da-kube-api-access-26bvl\") pod \"9d7fd055-fcc2-4221-9372-be1ffefd23da\" (UID: \"9d7fd055-fcc2-4221-9372-be1ffefd23da\") "
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.044510 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d7fd055-fcc2-4221-9372-be1ffefd23da-kube-api-access-26bvl" (OuterVolumeSpecName: "kube-api-access-26bvl") pod "9d7fd055-fcc2-4221-9372-be1ffefd23da" (UID: "9d7fd055-fcc2-4221-9372-be1ffefd23da"). InnerVolumeSpecName "kube-api-access-26bvl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.141205 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26bvl\" (UniqueName: \"kubernetes.io/projected/9d7fd055-fcc2-4221-9372-be1ffefd23da-kube-api-access-26bvl\") on node \"crc\" DevicePath \"\""
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.160763 4723 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hv968_must-gather-57slw_9d7fd055-fcc2-4221-9372-be1ffefd23da/copy/0.log"
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.161391 4723 generic.go:334] "Generic (PLEG): container finished" podID="9d7fd055-fcc2-4221-9372-be1ffefd23da" containerID="49b28102f51eebe4de9ef7ee1adef9902c1b031bf6ea59e78f4e26b08750ef23" exitCode=143
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.161457 4723 scope.go:117] "RemoveContainer" containerID="49b28102f51eebe4de9ef7ee1adef9902c1b031bf6ea59e78f4e26b08750ef23"
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.161600 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hv968/must-gather-57slw"
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.190043 4723 scope.go:117] "RemoveContainer" containerID="f11a46a1d258172a43663da9f42ca6c9a46aaa8b4ce2076967865765cf8a987d"
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.236542 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d7fd055-fcc2-4221-9372-be1ffefd23da-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9d7fd055-fcc2-4221-9372-be1ffefd23da" (UID: "9d7fd055-fcc2-4221-9372-be1ffefd23da"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.242587 4723 scope.go:117] "RemoveContainer" containerID="49b28102f51eebe4de9ef7ee1adef9902c1b031bf6ea59e78f4e26b08750ef23"
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.243509 4723 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d7fd055-fcc2-4221-9372-be1ffefd23da-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 09 14:27:18 crc kubenswrapper[4723]: E0309 14:27:18.245710 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49b28102f51eebe4de9ef7ee1adef9902c1b031bf6ea59e78f4e26b08750ef23\": container with ID starting with 49b28102f51eebe4de9ef7ee1adef9902c1b031bf6ea59e78f4e26b08750ef23 not found: ID does not exist" containerID="49b28102f51eebe4de9ef7ee1adef9902c1b031bf6ea59e78f4e26b08750ef23"
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.245789 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b28102f51eebe4de9ef7ee1adef9902c1b031bf6ea59e78f4e26b08750ef23"} err="failed to get container status \"49b28102f51eebe4de9ef7ee1adef9902c1b031bf6ea59e78f4e26b08750ef23\": rpc error: code = NotFound desc = could not find container \"49b28102f51eebe4de9ef7ee1adef9902c1b031bf6ea59e78f4e26b08750ef23\": container with ID starting with 49b28102f51eebe4de9ef7ee1adef9902c1b031bf6ea59e78f4e26b08750ef23 not found: ID does not exist"
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.245823 4723 scope.go:117] "RemoveContainer" containerID="f11a46a1d258172a43663da9f42ca6c9a46aaa8b4ce2076967865765cf8a987d"
Mar 09 14:27:18 crc kubenswrapper[4723]: E0309 14:27:18.246327 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f11a46a1d258172a43663da9f42ca6c9a46aaa8b4ce2076967865765cf8a987d\": container with ID starting with f11a46a1d258172a43663da9f42ca6c9a46aaa8b4ce2076967865765cf8a987d not found: ID does not exist" containerID="f11a46a1d258172a43663da9f42ca6c9a46aaa8b4ce2076967865765cf8a987d"
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.246364 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f11a46a1d258172a43663da9f42ca6c9a46aaa8b4ce2076967865765cf8a987d"} err="failed to get container status \"f11a46a1d258172a43663da9f42ca6c9a46aaa8b4ce2076967865765cf8a987d\": rpc error: code = NotFound desc = could not find container \"f11a46a1d258172a43663da9f42ca6c9a46aaa8b4ce2076967865765cf8a987d\": container with ID starting with f11a46a1d258172a43663da9f42ca6c9a46aaa8b4ce2076967865765cf8a987d not found: ID does not exist"
Mar 09 14:27:18 crc kubenswrapper[4723]: I0309 14:27:18.897559 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d7fd055-fcc2-4221-9372-be1ffefd23da" path="/var/lib/kubelet/pods/9d7fd055-fcc2-4221-9372-be1ffefd23da/volumes"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.668895 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nl475"]
Mar 09 14:27:28 crc kubenswrapper[4723]: E0309 14:27:28.670251 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7fd055-fcc2-4221-9372-be1ffefd23da" containerName="copy"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.670271 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7fd055-fcc2-4221-9372-be1ffefd23da" containerName="copy"
Mar 09 14:27:28 crc kubenswrapper[4723]: E0309 14:27:28.670313 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c00bf0d-d587-4583-b603-49e8fcccc58e" containerName="oc"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.670322 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c00bf0d-d587-4583-b603-49e8fcccc58e" containerName="oc"
Mar 09 14:27:28 crc kubenswrapper[4723]: E0309 14:27:28.670342 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7fd055-fcc2-4221-9372-be1ffefd23da" containerName="gather"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.670349 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7fd055-fcc2-4221-9372-be1ffefd23da" containerName="gather"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.670617 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7fd055-fcc2-4221-9372-be1ffefd23da" containerName="copy"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.670651 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c00bf0d-d587-4583-b603-49e8fcccc58e" containerName="oc"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.670674 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7fd055-fcc2-4221-9372-be1ffefd23da" containerName="gather"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.683068 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.689291 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nl475"]
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.797782 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-utilities\") pod \"redhat-operators-nl475\" (UID: \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\") " pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.798047 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-catalog-content\") pod \"redhat-operators-nl475\" (UID: \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\") " pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.798095 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm8sb\" (UniqueName: \"kubernetes.io/projected/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-kube-api-access-hm8sb\") pod \"redhat-operators-nl475\" (UID: \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\") " pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.900304 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-catalog-content\") pod \"redhat-operators-nl475\" (UID: \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\") " pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.900375 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm8sb\" (UniqueName: \"kubernetes.io/projected/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-kube-api-access-hm8sb\") pod \"redhat-operators-nl475\" (UID: \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\") " pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.900510 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-utilities\") pod \"redhat-operators-nl475\" (UID: \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\") " pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.900924 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-catalog-content\") pod \"redhat-operators-nl475\" (UID: \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\") " pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.901458 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-utilities\") pod \"redhat-operators-nl475\" (UID: \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\") " pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:28 crc kubenswrapper[4723]: I0309 14:27:28.918666 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm8sb\" (UniqueName: \"kubernetes.io/projected/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-kube-api-access-hm8sb\") pod \"redhat-operators-nl475\" (UID: \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\") " pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:29 crc kubenswrapper[4723]: I0309 14:27:29.032670 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:29 crc kubenswrapper[4723]: I0309 14:27:29.609556 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nl475"]
Mar 09 14:27:30 crc kubenswrapper[4723]: I0309 14:27:30.304389 4723 generic.go:334] "Generic (PLEG): container finished" podID="73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" containerID="1a483b0851cf5ae01f6f2827c966851d0a9a1d7f3de763cdf322a71ac3435eba" exitCode=0
Mar 09 14:27:30 crc kubenswrapper[4723]: I0309 14:27:30.304446 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl475" event={"ID":"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0","Type":"ContainerDied","Data":"1a483b0851cf5ae01f6f2827c966851d0a9a1d7f3de763cdf322a71ac3435eba"}
Mar 09 14:27:30 crc kubenswrapper[4723]: I0309 14:27:30.304706 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl475" event={"ID":"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0","Type":"ContainerStarted","Data":"48ced29faf2e8e0c5787937957db340b49a086e5739f4afb421934d6ac8a8d4a"}
Mar 09 14:27:30 crc kubenswrapper[4723]: I0309 14:27:30.306253 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 14:27:32 crc kubenswrapper[4723]: I0309 14:27:32.327004 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl475" event={"ID":"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0","Type":"ContainerStarted","Data":"ed9c862cd5beb8029bc8d24832efcfe9a59e03e7b0a8adfb00a033e46fb30132"}
Mar 09 14:27:33 crc kubenswrapper[4723]: I0309 14:27:33.947135 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:27:33 crc kubenswrapper[4723]: I0309 14:27:33.947402 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:27:37 crc kubenswrapper[4723]: I0309 14:27:37.394735 4723 generic.go:334] "Generic (PLEG): container finished" podID="73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" containerID="ed9c862cd5beb8029bc8d24832efcfe9a59e03e7b0a8adfb00a033e46fb30132" exitCode=0
Mar 09 14:27:37 crc kubenswrapper[4723]: I0309 14:27:37.394768 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl475" event={"ID":"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0","Type":"ContainerDied","Data":"ed9c862cd5beb8029bc8d24832efcfe9a59e03e7b0a8adfb00a033e46fb30132"}
Mar 09 14:27:38 crc kubenswrapper[4723]: I0309 14:27:38.409636 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl475" event={"ID":"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0","Type":"ContainerStarted","Data":"2de2ebfc1bb752c6db8333be4d70007bd3050fdc363ea557987f4173efe398c2"}
Mar 09 14:27:38 crc kubenswrapper[4723]: I0309 14:27:38.450732 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nl475" podStartSLOduration=2.85416581 podStartE2EDuration="10.450707432s" podCreationTimestamp="2026-03-09 14:27:28 +0000 UTC" firstStartedPulling="2026-03-09 14:27:30.306041573 +0000 UTC m=+5324.320509103" lastFinishedPulling="2026-03-09 14:27:37.902583185 +0000 UTC m=+5331.917050725" observedRunningTime="2026-03-09 14:27:38.434831809 +0000 UTC m=+5332.449299349" watchObservedRunningTime="2026-03-09 14:27:38.450707432 +0000 UTC m=+5332.465174972"
Mar 09 14:27:39 crc kubenswrapper[4723]: I0309 14:27:39.033522 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:39 crc kubenswrapper[4723]: I0309 14:27:39.033910 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:40 crc kubenswrapper[4723]: I0309 14:27:40.090330 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nl475" podUID="73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" containerName="registry-server" probeResult="failure" output=<
Mar 09 14:27:40 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s
Mar 09 14:27:40 crc kubenswrapper[4723]: >
Mar 09 14:27:50 crc kubenswrapper[4723]: I0309 14:27:50.100566 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nl475" podUID="73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" containerName="registry-server" probeResult="failure" output=<
Mar 09 14:27:50 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s
Mar 09 14:27:50 crc kubenswrapper[4723]: >
Mar 09 14:27:59 crc kubenswrapper[4723]: I0309 14:27:59.104232 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:59 crc kubenswrapper[4723]: I0309 14:27:59.156677 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:27:59 crc kubenswrapper[4723]: I0309 14:27:59.871213 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nl475"]
Mar 09 14:28:00 crc kubenswrapper[4723]: I0309 14:28:00.146614 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551108-r8k94"]
Mar 09 14:28:00 crc kubenswrapper[4723]: I0309 14:28:00.148225 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-r8k94"
Mar 09 14:28:00 crc kubenswrapper[4723]: I0309 14:28:00.150595 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:28:00 crc kubenswrapper[4723]: I0309 14:28:00.150693 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 14:28:00 crc kubenswrapper[4723]: I0309 14:28:00.153672 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:28:00 crc kubenswrapper[4723]: I0309 14:28:00.160704 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-r8k94"]
Mar 09 14:28:00 crc kubenswrapper[4723]: I0309 14:28:00.214382 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htz2r\" (UniqueName: \"kubernetes.io/projected/9736ceca-3335-4d7f-bbd8-279d52703c44-kube-api-access-htz2r\") pod \"auto-csr-approver-29551108-r8k94\" (UID: \"9736ceca-3335-4d7f-bbd8-279d52703c44\") " pod="openshift-infra/auto-csr-approver-29551108-r8k94"
Mar 09 14:28:00 crc kubenswrapper[4723]: I0309 14:28:00.317160 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htz2r\" (UniqueName: \"kubernetes.io/projected/9736ceca-3335-4d7f-bbd8-279d52703c44-kube-api-access-htz2r\") pod \"auto-csr-approver-29551108-r8k94\" (UID: \"9736ceca-3335-4d7f-bbd8-279d52703c44\") " pod="openshift-infra/auto-csr-approver-29551108-r8k94"
Mar 09 14:28:00 crc kubenswrapper[4723]: I0309 14:28:00.338518 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htz2r\" (UniqueName: \"kubernetes.io/projected/9736ceca-3335-4d7f-bbd8-279d52703c44-kube-api-access-htz2r\") pod \"auto-csr-approver-29551108-r8k94\" (UID: \"9736ceca-3335-4d7f-bbd8-279d52703c44\") " pod="openshift-infra/auto-csr-approver-29551108-r8k94"
Mar 09 14:28:00 crc kubenswrapper[4723]: I0309 14:28:00.471059 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-r8k94"
Mar 09 14:28:00 crc kubenswrapper[4723]: I0309 14:28:00.701930 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nl475" podUID="73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" containerName="registry-server" containerID="cri-o://2de2ebfc1bb752c6db8333be4d70007bd3050fdc363ea557987f4173efe398c2" gracePeriod=2
Mar 09 14:28:00 crc kubenswrapper[4723]: I0309 14:28:00.948576 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-r8k94"]
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.132424 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.241210 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm8sb\" (UniqueName: \"kubernetes.io/projected/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-kube-api-access-hm8sb\") pod \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\" (UID: \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\") "
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.241339 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-utilities\") pod \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\" (UID: \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\") "
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.241752 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-catalog-content\") pod \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\" (UID: \"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0\") "
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.242604 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-utilities" (OuterVolumeSpecName: "utilities") pod "73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" (UID: "73b32f53-ab5c-4b0c-94d6-d29ae22be1c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.243094 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.247282 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-kube-api-access-hm8sb" (OuterVolumeSpecName: "kube-api-access-hm8sb") pod "73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" (UID: "73b32f53-ab5c-4b0c-94d6-d29ae22be1c0"). InnerVolumeSpecName "kube-api-access-hm8sb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.345006 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm8sb\" (UniqueName: \"kubernetes.io/projected/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-kube-api-access-hm8sb\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.364740 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" (UID: "73b32f53-ab5c-4b0c-94d6-d29ae22be1c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.447311 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.713763 4723 generic.go:334] "Generic (PLEG): container finished" podID="73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" containerID="2de2ebfc1bb752c6db8333be4d70007bd3050fdc363ea557987f4173efe398c2" exitCode=0
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.713817 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nl475"
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.713883 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl475" event={"ID":"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0","Type":"ContainerDied","Data":"2de2ebfc1bb752c6db8333be4d70007bd3050fdc363ea557987f4173efe398c2"}
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.713935 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl475" event={"ID":"73b32f53-ab5c-4b0c-94d6-d29ae22be1c0","Type":"ContainerDied","Data":"48ced29faf2e8e0c5787937957db340b49a086e5739f4afb421934d6ac8a8d4a"}
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.713962 4723 scope.go:117] "RemoveContainer" containerID="2de2ebfc1bb752c6db8333be4d70007bd3050fdc363ea557987f4173efe398c2"
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.715606 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551108-r8k94" event={"ID":"9736ceca-3335-4d7f-bbd8-279d52703c44","Type":"ContainerStarted","Data":"cb24753290fe35a1251cc03b286c9a18aece6c3ae7c0c4db9640b6320de86807"}
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.748818 4723 scope.go:117] "RemoveContainer" containerID="ed9c862cd5beb8029bc8d24832efcfe9a59e03e7b0a8adfb00a033e46fb30132"
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.760966 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nl475"]
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.773491 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nl475"]
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.774690 4723 scope.go:117] "RemoveContainer" containerID="1a483b0851cf5ae01f6f2827c966851d0a9a1d7f3de763cdf322a71ac3435eba"
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.836069 4723 scope.go:117] "RemoveContainer" containerID="2de2ebfc1bb752c6db8333be4d70007bd3050fdc363ea557987f4173efe398c2"
Mar 09 14:28:01 crc kubenswrapper[4723]: E0309 14:28:01.836412 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2de2ebfc1bb752c6db8333be4d70007bd3050fdc363ea557987f4173efe398c2\": container with ID starting with 2de2ebfc1bb752c6db8333be4d70007bd3050fdc363ea557987f4173efe398c2 not found: ID does not exist" containerID="2de2ebfc1bb752c6db8333be4d70007bd3050fdc363ea557987f4173efe398c2"
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.836439 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de2ebfc1bb752c6db8333be4d70007bd3050fdc363ea557987f4173efe398c2"} err="failed to get container status \"2de2ebfc1bb752c6db8333be4d70007bd3050fdc363ea557987f4173efe398c2\": rpc error: code = NotFound desc = could not find container \"2de2ebfc1bb752c6db8333be4d70007bd3050fdc363ea557987f4173efe398c2\": container with ID starting with 2de2ebfc1bb752c6db8333be4d70007bd3050fdc363ea557987f4173efe398c2 not found: ID does not exist"
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.836459 4723 scope.go:117] "RemoveContainer" containerID="ed9c862cd5beb8029bc8d24832efcfe9a59e03e7b0a8adfb00a033e46fb30132"
Mar 09 14:28:01 crc kubenswrapper[4723]: E0309 14:28:01.836705 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed9c862cd5beb8029bc8d24832efcfe9a59e03e7b0a8adfb00a033e46fb30132\": container with ID starting with ed9c862cd5beb8029bc8d24832efcfe9a59e03e7b0a8adfb00a033e46fb30132 not found: ID does not exist" containerID="ed9c862cd5beb8029bc8d24832efcfe9a59e03e7b0a8adfb00a033e46fb30132"
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.836725 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed9c862cd5beb8029bc8d24832efcfe9a59e03e7b0a8adfb00a033e46fb30132"} err="failed to get container status \"ed9c862cd5beb8029bc8d24832efcfe9a59e03e7b0a8adfb00a033e46fb30132\": rpc error: code = NotFound desc = could not find container \"ed9c862cd5beb8029bc8d24832efcfe9a59e03e7b0a8adfb00a033e46fb30132\": container with ID starting with ed9c862cd5beb8029bc8d24832efcfe9a59e03e7b0a8adfb00a033e46fb30132 not found: ID does not exist"
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.836742 4723 scope.go:117] "RemoveContainer" containerID="1a483b0851cf5ae01f6f2827c966851d0a9a1d7f3de763cdf322a71ac3435eba"
Mar 09 14:28:01 crc kubenswrapper[4723]: E0309 14:28:01.837163 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a483b0851cf5ae01f6f2827c966851d0a9a1d7f3de763cdf322a71ac3435eba\": container with ID starting with 1a483b0851cf5ae01f6f2827c966851d0a9a1d7f3de763cdf322a71ac3435eba not found: ID does not exist" containerID="1a483b0851cf5ae01f6f2827c966851d0a9a1d7f3de763cdf322a71ac3435eba"
Mar 09 14:28:01 crc kubenswrapper[4723]: I0309 14:28:01.837181 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a483b0851cf5ae01f6f2827c966851d0a9a1d7f3de763cdf322a71ac3435eba"} err="failed to get container status \"1a483b0851cf5ae01f6f2827c966851d0a9a1d7f3de763cdf322a71ac3435eba\": rpc error: code = NotFound desc = could not find container \"1a483b0851cf5ae01f6f2827c966851d0a9a1d7f3de763cdf322a71ac3435eba\": container with ID starting with 1a483b0851cf5ae01f6f2827c966851d0a9a1d7f3de763cdf322a71ac3435eba not found: ID does not exist"
Mar 09 14:28:02 crc kubenswrapper[4723]: I0309 14:28:02.727966 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551108-r8k94" event={"ID":"9736ceca-3335-4d7f-bbd8-279d52703c44","Type":"ContainerStarted","Data":"afed7e6a473b8a359291b629cab83283f34f4a05f5093a3b8dd030edb43458bb"}
Mar 09 14:28:02 crc kubenswrapper[4723]: I0309 14:28:02.747448 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551108-r8k94" podStartSLOduration=1.591545205 podStartE2EDuration="2.747423279s" podCreationTimestamp="2026-03-09 14:28:00 +0000 UTC" firstStartedPulling="2026-03-09 14:28:00.964152716 +0000 UTC m=+5354.978620256" lastFinishedPulling="2026-03-09 14:28:02.12003079 +0000 UTC m=+5356.134498330" observedRunningTime="2026-03-09 14:28:02.73994574 +0000 UTC m=+5356.754413300" watchObservedRunningTime="2026-03-09 14:28:02.747423279 +0000 UTC m=+5356.761890819"
Mar 09 14:28:02 crc kubenswrapper[4723]: I0309 14:28:02.897376 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" path="/var/lib/kubelet/pods/73b32f53-ab5c-4b0c-94d6-d29ae22be1c0/volumes"
Mar 09 14:28:03 crc kubenswrapper[4723]: I0309 14:28:03.741092 4723 generic.go:334] "Generic (PLEG): container finished" podID="9736ceca-3335-4d7f-bbd8-279d52703c44" containerID="afed7e6a473b8a359291b629cab83283f34f4a05f5093a3b8dd030edb43458bb" exitCode=0
Mar 09 14:28:03 crc kubenswrapper[4723]: I0309 14:28:03.741167 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551108-r8k94" event={"ID":"9736ceca-3335-4d7f-bbd8-279d52703c44","Type":"ContainerDied","Data":"afed7e6a473b8a359291b629cab83283f34f4a05f5093a3b8dd030edb43458bb"}
Mar 09 14:28:03 crc kubenswrapper[4723]: I0309 14:28:03.946817 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:28:03 crc kubenswrapper[4723]: I0309 14:28:03.946888 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:28:05 crc kubenswrapper[4723]: I0309 14:28:05.177172 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-r8k94"
Mar 09 14:28:05 crc kubenswrapper[4723]: I0309 14:28:05.337931 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htz2r\" (UniqueName: \"kubernetes.io/projected/9736ceca-3335-4d7f-bbd8-279d52703c44-kube-api-access-htz2r\") pod \"9736ceca-3335-4d7f-bbd8-279d52703c44\" (UID: \"9736ceca-3335-4d7f-bbd8-279d52703c44\") "
Mar 09 14:28:05 crc kubenswrapper[4723]: I0309 14:28:05.345584 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9736ceca-3335-4d7f-bbd8-279d52703c44-kube-api-access-htz2r" (OuterVolumeSpecName: "kube-api-access-htz2r") pod "9736ceca-3335-4d7f-bbd8-279d52703c44" (UID: "9736ceca-3335-4d7f-bbd8-279d52703c44"). InnerVolumeSpecName "kube-api-access-htz2r".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:05 crc kubenswrapper[4723]: I0309 14:28:05.443233 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htz2r\" (UniqueName: \"kubernetes.io/projected/9736ceca-3335-4d7f-bbd8-279d52703c44-kube-api-access-htz2r\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:05 crc kubenswrapper[4723]: I0309 14:28:05.770282 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551108-r8k94" event={"ID":"9736ceca-3335-4d7f-bbd8-279d52703c44","Type":"ContainerDied","Data":"cb24753290fe35a1251cc03b286c9a18aece6c3ae7c0c4db9640b6320de86807"} Mar 09 14:28:05 crc kubenswrapper[4723]: I0309 14:28:05.770631 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb24753290fe35a1251cc03b286c9a18aece6c3ae7c0c4db9640b6320de86807" Mar 09 14:28:05 crc kubenswrapper[4723]: I0309 14:28:05.770369 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-r8k94" Mar 09 14:28:05 crc kubenswrapper[4723]: I0309 14:28:05.825536 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-h6b6r"] Mar 09 14:28:05 crc kubenswrapper[4723]: I0309 14:28:05.839981 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-h6b6r"] Mar 09 14:28:06 crc kubenswrapper[4723]: I0309 14:28:06.897542 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52f56ac0-e521-4734-bbc5-94a35f0c1ae6" path="/var/lib/kubelet/pods/52f56ac0-e521-4734-bbc5-94a35f0c1ae6/volumes" Mar 09 14:28:30 crc kubenswrapper[4723]: I0309 14:28:30.164429 4723 scope.go:117] "RemoveContainer" containerID="e0886828b34405d9c7162c8038432aa9836221d16b776c29c826a57817ed860f" Mar 09 14:28:33 crc kubenswrapper[4723]: I0309 14:28:33.947159 4723 patch_prober.go:28] interesting pod/machine-config-daemon-cfjq2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:28:33 crc kubenswrapper[4723]: I0309 14:28:33.947752 4723 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:28:33 crc kubenswrapper[4723]: I0309 14:28:33.947799 4723 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" Mar 09 14:28:33 crc kubenswrapper[4723]: I0309 14:28:33.948729 4723 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b"} pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:28:33 crc kubenswrapper[4723]: I0309 14:28:33.948804 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerName="machine-config-daemon" 
containerID="cri-o://2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" gracePeriod=600 Mar 09 14:28:34 crc kubenswrapper[4723]: E0309 14:28:34.073112 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:28:34 crc kubenswrapper[4723]: I0309 14:28:34.083841 4723 generic.go:334] "Generic (PLEG): container finished" podID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" exitCode=0 Mar 09 14:28:34 crc kubenswrapper[4723]: I0309 14:28:34.083903 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerDied","Data":"2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b"} Mar 09 14:28:34 crc kubenswrapper[4723]: I0309 14:28:34.083936 4723 scope.go:117] "RemoveContainer" containerID="a1d1cb15a27887ef68b3d06baaa73779b78a555763876a2da46127c36e07be0e" Mar 09 14:28:34 crc kubenswrapper[4723]: I0309 14:28:34.084544 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:28:34 crc kubenswrapper[4723]: E0309 14:28:34.084839 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:28:48 crc kubenswrapper[4723]: I0309 14:28:48.882563 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:28:48 crc kubenswrapper[4723]: E0309 14:28:48.883462 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:29:00 crc kubenswrapper[4723]: I0309 14:29:00.881003 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:29:00 crc kubenswrapper[4723]: E0309 14:29:00.881829 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:29:04 crc kubenswrapper[4723]: I0309 14:29:04.788270 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4npck"] 
Mar 09 14:29:04 crc kubenswrapper[4723]: E0309 14:29:04.789942 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" containerName="extract-content" Mar 09 14:29:04 crc kubenswrapper[4723]: I0309 14:29:04.790049 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" containerName="extract-content" Mar 09 14:29:04 crc kubenswrapper[4723]: E0309 14:29:04.790125 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" containerName="extract-utilities" Mar 09 14:29:04 crc kubenswrapper[4723]: I0309 14:29:04.790189 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" containerName="extract-utilities" Mar 09 14:29:04 crc kubenswrapper[4723]: E0309 14:29:04.790252 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" containerName="registry-server" Mar 09 14:29:04 crc kubenswrapper[4723]: I0309 14:29:04.790316 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" containerName="registry-server" Mar 09 14:29:04 crc kubenswrapper[4723]: E0309 14:29:04.790427 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9736ceca-3335-4d7f-bbd8-279d52703c44" containerName="oc" Mar 09 14:29:04 crc kubenswrapper[4723]: I0309 14:29:04.790491 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="9736ceca-3335-4d7f-bbd8-279d52703c44" containerName="oc" Mar 09 14:29:04 crc kubenswrapper[4723]: I0309 14:29:04.790804 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="73b32f53-ab5c-4b0c-94d6-d29ae22be1c0" containerName="registry-server" Mar 09 14:29:04 crc kubenswrapper[4723]: I0309 14:29:04.790977 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="9736ceca-3335-4d7f-bbd8-279d52703c44" containerName="oc" Mar 09 14:29:04 crc kubenswrapper[4723]: I0309 14:29:04.792826 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:04 crc kubenswrapper[4723]: I0309 14:29:04.819226 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4npck"] Mar 09 14:29:04 crc kubenswrapper[4723]: I0309 14:29:04.911025 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd83abdd-eaa5-4f0c-a068-72957f30efea-utilities\") pod \"community-operators-4npck\" (UID: \"fd83abdd-eaa5-4f0c-a068-72957f30efea\") " pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:04 crc kubenswrapper[4723]: I0309 14:29:04.911140 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbxw9\" (UniqueName: \"kubernetes.io/projected/fd83abdd-eaa5-4f0c-a068-72957f30efea-kube-api-access-hbxw9\") pod \"community-operators-4npck\" (UID: \"fd83abdd-eaa5-4f0c-a068-72957f30efea\") " pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:04 crc kubenswrapper[4723]: I0309 14:29:04.911413 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd83abdd-eaa5-4f0c-a068-72957f30efea-catalog-content\") pod \"community-operators-4npck\" (UID: \"fd83abdd-eaa5-4f0c-a068-72957f30efea\") " pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:05 crc kubenswrapper[4723]: I0309 14:29:05.012948 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd83abdd-eaa5-4f0c-a068-72957f30efea-utilities\") pod \"community-operators-4npck\" (UID: \"fd83abdd-eaa5-4f0c-a068-72957f30efea\") " pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:05 crc kubenswrapper[4723]: I0309 14:29:05.013033 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbxw9\" (UniqueName: \"kubernetes.io/projected/fd83abdd-eaa5-4f0c-a068-72957f30efea-kube-api-access-hbxw9\") pod \"community-operators-4npck\" (UID: \"fd83abdd-eaa5-4f0c-a068-72957f30efea\") " pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:05 crc kubenswrapper[4723]: I0309 14:29:05.013148 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd83abdd-eaa5-4f0c-a068-72957f30efea-catalog-content\") pod \"community-operators-4npck\" (UID: \"fd83abdd-eaa5-4f0c-a068-72957f30efea\") " pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:05 crc kubenswrapper[4723]: I0309 14:29:05.013700 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd83abdd-eaa5-4f0c-a068-72957f30efea-catalog-content\") pod \"community-operators-4npck\" (UID: \"fd83abdd-eaa5-4f0c-a068-72957f30efea\") " pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:05 crc kubenswrapper[4723]: I0309 14:29:05.013747 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd83abdd-eaa5-4f0c-a068-72957f30efea-utilities\") pod \"community-operators-4npck\" (UID: \"fd83abdd-eaa5-4f0c-a068-72957f30efea\") " pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:05 crc kubenswrapper[4723]: I0309 14:29:05.031743 4723 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hbxw9\" (UniqueName: \"kubernetes.io/projected/fd83abdd-eaa5-4f0c-a068-72957f30efea-kube-api-access-hbxw9\") pod \"community-operators-4npck\" (UID: \"fd83abdd-eaa5-4f0c-a068-72957f30efea\") " pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:05 crc kubenswrapper[4723]: I0309 14:29:05.119546 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:05 crc kubenswrapper[4723]: I0309 14:29:05.483816 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4npck"] Mar 09 14:29:06 crc kubenswrapper[4723]: I0309 14:29:06.503479 4723 generic.go:334] "Generic (PLEG): container finished" podID="fd83abdd-eaa5-4f0c-a068-72957f30efea" containerID="7985ebcbb05ae87c1f4c4b9b71990cec60f5ebb45361ffca25872a724848093a" exitCode=0 Mar 09 14:29:06 crc kubenswrapper[4723]: I0309 14:29:06.504167 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4npck" event={"ID":"fd83abdd-eaa5-4f0c-a068-72957f30efea","Type":"ContainerDied","Data":"7985ebcbb05ae87c1f4c4b9b71990cec60f5ebb45361ffca25872a724848093a"} Mar 09 14:29:06 crc kubenswrapper[4723]: I0309 14:29:06.505008 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4npck" event={"ID":"fd83abdd-eaa5-4f0c-a068-72957f30efea","Type":"ContainerStarted","Data":"0dd97965b59c4d260c99557ffb9505b50421bb6d02861d364a6509584a529fd8"} Mar 09 14:29:11 crc kubenswrapper[4723]: I0309 14:29:11.589655 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4npck" event={"ID":"fd83abdd-eaa5-4f0c-a068-72957f30efea","Type":"ContainerStarted","Data":"ff066b648584fff2f040038b8a78e7366462a7135de4e1a0d5167b5a3f9a1bca"} Mar 09 14:29:11 crc kubenswrapper[4723]: I0309 14:29:11.881447 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:29:11 crc kubenswrapper[4723]: E0309 14:29:11.881885 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:29:13 crc kubenswrapper[4723]: I0309 14:29:13.611738 4723 generic.go:334] "Generic (PLEG): container finished" podID="fd83abdd-eaa5-4f0c-a068-72957f30efea" containerID="ff066b648584fff2f040038b8a78e7366462a7135de4e1a0d5167b5a3f9a1bca" exitCode=0 Mar 09 14:29:13 crc kubenswrapper[4723]: I0309 14:29:13.611754 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4npck" event={"ID":"fd83abdd-eaa5-4f0c-a068-72957f30efea","Type":"ContainerDied","Data":"ff066b648584fff2f040038b8a78e7366462a7135de4e1a0d5167b5a3f9a1bca"} Mar 09 14:29:14 crc kubenswrapper[4723]: I0309 14:29:14.627378 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4npck" event={"ID":"fd83abdd-eaa5-4f0c-a068-72957f30efea","Type":"ContainerStarted","Data":"dc6e2c1bb518dc67c8b80f624de4a3e80663dd8b57fb5f891baa7e1d6bc8dfd0"} Mar 09 14:29:14 crc kubenswrapper[4723]: I0309 14:29:14.671722 4723 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4npck" podStartSLOduration=3.171238329 podStartE2EDuration="10.671699273s" podCreationTimestamp="2026-03-09 14:29:04 +0000 UTC" firstStartedPulling="2026-03-09 14:29:06.506123997 +0000 UTC m=+5420.520591537" lastFinishedPulling="2026-03-09 14:29:14.006584941 +0000 UTC m=+5428.021052481" observedRunningTime="2026-03-09 14:29:14.653966852 +0000 UTC m=+5428.668434392" watchObservedRunningTime="2026-03-09 14:29:14.671699273 +0000 UTC m=+5428.686166813" Mar 09 14:29:15 crc kubenswrapper[4723]: I0309 14:29:15.120173 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:15 crc kubenswrapper[4723]: I0309 14:29:15.120458 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:16 crc kubenswrapper[4723]: I0309 14:29:16.177809 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4npck" podUID="fd83abdd-eaa5-4f0c-a068-72957f30efea" containerName="registry-server" probeResult="failure" output=< Mar 09 14:29:16 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:29:16 crc kubenswrapper[4723]: > Mar 09 14:29:25 crc kubenswrapper[4723]: I0309 14:29:25.450725 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:25 crc kubenswrapper[4723]: I0309 14:29:25.516328 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4npck" Mar 09 14:29:25 crc kubenswrapper[4723]: I0309 14:29:25.645940 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4npck"] Mar 09 14:29:25 crc kubenswrapper[4723]: I0309 14:29:25.694696 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h7rg4"] Mar 09 14:29:25 crc kubenswrapper[4723]: I0309 14:29:25.694945 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h7rg4" podUID="ce90ce22-0632-4cec-bb6e-4c85b78b1833" containerName="registry-server" containerID="cri-o://0cdeed54b1694870ebcffa01f973a3ce5570f9e0c78750f958c981952b47dec2" gracePeriod=2 Mar 09 14:29:25 crc kubenswrapper[4723]: I0309 14:29:25.884877 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:29:25 crc kubenswrapper[4723]: E0309 14:29:25.885741 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.370621 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h7rg4" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.468032 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce90ce22-0632-4cec-bb6e-4c85b78b1833-catalog-content\") pod \"ce90ce22-0632-4cec-bb6e-4c85b78b1833\" (UID: \"ce90ce22-0632-4cec-bb6e-4c85b78b1833\") " Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.468236 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce90ce22-0632-4cec-bb6e-4c85b78b1833-utilities\") pod \"ce90ce22-0632-4cec-bb6e-4c85b78b1833\" (UID: \"ce90ce22-0632-4cec-bb6e-4c85b78b1833\") " Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.468347 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5j6w\" (UniqueName: \"kubernetes.io/projected/ce90ce22-0632-4cec-bb6e-4c85b78b1833-kube-api-access-h5j6w\") pod \"ce90ce22-0632-4cec-bb6e-4c85b78b1833\" (UID: \"ce90ce22-0632-4cec-bb6e-4c85b78b1833\") " Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.469479 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce90ce22-0632-4cec-bb6e-4c85b78b1833-utilities" (OuterVolumeSpecName: "utilities") pod "ce90ce22-0632-4cec-bb6e-4c85b78b1833" (UID: "ce90ce22-0632-4cec-bb6e-4c85b78b1833"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.487675 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce90ce22-0632-4cec-bb6e-4c85b78b1833-kube-api-access-h5j6w" (OuterVolumeSpecName: "kube-api-access-h5j6w") pod "ce90ce22-0632-4cec-bb6e-4c85b78b1833" (UID: "ce90ce22-0632-4cec-bb6e-4c85b78b1833"). InnerVolumeSpecName "kube-api-access-h5j6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.536810 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce90ce22-0632-4cec-bb6e-4c85b78b1833-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce90ce22-0632-4cec-bb6e-4c85b78b1833" (UID: "ce90ce22-0632-4cec-bb6e-4c85b78b1833"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.571225 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce90ce22-0632-4cec-bb6e-4c85b78b1833-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.571269 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5j6w\" (UniqueName: \"kubernetes.io/projected/ce90ce22-0632-4cec-bb6e-4c85b78b1833-kube-api-access-h5j6w\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.571285 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce90ce22-0632-4cec-bb6e-4c85b78b1833-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.763842 4723 generic.go:334] "Generic (PLEG): container finished" podID="ce90ce22-0632-4cec-bb6e-4c85b78b1833" containerID="0cdeed54b1694870ebcffa01f973a3ce5570f9e0c78750f958c981952b47dec2" exitCode=0 Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.764748 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h7rg4" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.769066 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7rg4" event={"ID":"ce90ce22-0632-4cec-bb6e-4c85b78b1833","Type":"ContainerDied","Data":"0cdeed54b1694870ebcffa01f973a3ce5570f9e0c78750f958c981952b47dec2"} Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.769132 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7rg4" event={"ID":"ce90ce22-0632-4cec-bb6e-4c85b78b1833","Type":"ContainerDied","Data":"29b49e3b3007ff9232ae99819133c65150714311c5d179c00f47701e79a8fd64"} Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.769158 4723 scope.go:117] "RemoveContainer" containerID="0cdeed54b1694870ebcffa01f973a3ce5570f9e0c78750f958c981952b47dec2" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.807733 4723 scope.go:117] "RemoveContainer" containerID="1c419abeee7b7b92e75d4fed67f72c0a0a3f91a80b58671fb1cdc5e79110264d" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.814773 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h7rg4"] Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.832268 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h7rg4"] Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.841459 4723 scope.go:117] "RemoveContainer" containerID="263726b0eac4cedf204b7a1163b19a2c29599ee5515969e81a400fdadea3c902" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.893807 4723 scope.go:117] "RemoveContainer" containerID="0cdeed54b1694870ebcffa01f973a3ce5570f9e0c78750f958c981952b47dec2" Mar 09 14:29:26 crc kubenswrapper[4723]: E0309 14:29:26.899101 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cdeed54b1694870ebcffa01f973a3ce5570f9e0c78750f958c981952b47dec2\": container with ID starting with 0cdeed54b1694870ebcffa01f973a3ce5570f9e0c78750f958c981952b47dec2 not found: ID does not exist" containerID="0cdeed54b1694870ebcffa01f973a3ce5570f9e0c78750f958c981952b47dec2" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.899146 
4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cdeed54b1694870ebcffa01f973a3ce5570f9e0c78750f958c981952b47dec2"} err="failed to get container status \"0cdeed54b1694870ebcffa01f973a3ce5570f9e0c78750f958c981952b47dec2\": rpc error: code = NotFound desc = could not find container \"0cdeed54b1694870ebcffa01f973a3ce5570f9e0c78750f958c981952b47dec2\": container with ID starting with 0cdeed54b1694870ebcffa01f973a3ce5570f9e0c78750f958c981952b47dec2 not found: ID does not exist" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.899173 4723 scope.go:117] "RemoveContainer" containerID="1c419abeee7b7b92e75d4fed67f72c0a0a3f91a80b58671fb1cdc5e79110264d" Mar 09 14:29:26 crc kubenswrapper[4723]: E0309 14:29:26.900238 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c419abeee7b7b92e75d4fed67f72c0a0a3f91a80b58671fb1cdc5e79110264d\": container with ID starting with 1c419abeee7b7b92e75d4fed67f72c0a0a3f91a80b58671fb1cdc5e79110264d not found: ID does not exist" containerID="1c419abeee7b7b92e75d4fed67f72c0a0a3f91a80b58671fb1cdc5e79110264d" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.900273 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c419abeee7b7b92e75d4fed67f72c0a0a3f91a80b58671fb1cdc5e79110264d"} err="failed to get container status \"1c419abeee7b7b92e75d4fed67f72c0a0a3f91a80b58671fb1cdc5e79110264d\": rpc error: code = NotFound desc = could not find container \"1c419abeee7b7b92e75d4fed67f72c0a0a3f91a80b58671fb1cdc5e79110264d\": container with ID starting with 1c419abeee7b7b92e75d4fed67f72c0a0a3f91a80b58671fb1cdc5e79110264d not found: ID does not exist" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.900293 4723 scope.go:117] "RemoveContainer" containerID="263726b0eac4cedf204b7a1163b19a2c29599ee5515969e81a400fdadea3c902" Mar 09 14:29:26 crc kubenswrapper[4723]: E0309 14:29:26.900642 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"263726b0eac4cedf204b7a1163b19a2c29599ee5515969e81a400fdadea3c902\": container with ID starting with 263726b0eac4cedf204b7a1163b19a2c29599ee5515969e81a400fdadea3c902 not found: ID does not exist" containerID="263726b0eac4cedf204b7a1163b19a2c29599ee5515969e81a400fdadea3c902" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.900695 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"263726b0eac4cedf204b7a1163b19a2c29599ee5515969e81a400fdadea3c902"} err="failed to get container status \"263726b0eac4cedf204b7a1163b19a2c29599ee5515969e81a400fdadea3c902\": rpc error: code = NotFound desc = could not find container \"263726b0eac4cedf204b7a1163b19a2c29599ee5515969e81a400fdadea3c902\": container with ID starting with 263726b0eac4cedf204b7a1163b19a2c29599ee5515969e81a400fdadea3c902 not found: ID does not exist" Mar 09 14:29:26 crc kubenswrapper[4723]: I0309 14:29:26.901828 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce90ce22-0632-4cec-bb6e-4c85b78b1833" path="/var/lib/kubelet/pods/ce90ce22-0632-4cec-bb6e-4c85b78b1833/volumes" Mar 09 14:29:37 crc kubenswrapper[4723]: I0309 14:29:37.881257 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:29:37 crc kubenswrapper[4723]: E0309 14:29:37.882067 4723 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:29:48 crc kubenswrapper[4723]: I0309 14:29:48.881663 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:29:48 crc kubenswrapper[4723]: E0309 14:29:48.882623 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.570882 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lvm2w"] Mar 09 14:29:57 crc kubenswrapper[4723]: E0309 14:29:57.572736 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce90ce22-0632-4cec-bb6e-4c85b78b1833" containerName="registry-server" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.572829 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce90ce22-0632-4cec-bb6e-4c85b78b1833" containerName="registry-server" Mar 09 14:29:57 crc kubenswrapper[4723]: E0309 14:29:57.572938 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce90ce22-0632-4cec-bb6e-4c85b78b1833" containerName="extract-utilities" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.573028 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce90ce22-0632-4cec-bb6e-4c85b78b1833" containerName="extract-utilities" Mar 09 14:29:57 crc kubenswrapper[4723]: E0309 14:29:57.573155 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce90ce22-0632-4cec-bb6e-4c85b78b1833" containerName="extract-content" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.573233 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce90ce22-0632-4cec-bb6e-4c85b78b1833" containerName="extract-content" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.573567 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce90ce22-0632-4cec-bb6e-4c85b78b1833" containerName="registry-server" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.575683 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.586338 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvm2w"] Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.743655 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e17b3c66-44be-43e5-9965-1f54658cf79b-utilities\") pod \"redhat-marketplace-lvm2w\" (UID: \"e17b3c66-44be-43e5-9965-1f54658cf79b\") " pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.744204 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx6q2\" (UniqueName: \"kubernetes.io/projected/e17b3c66-44be-43e5-9965-1f54658cf79b-kube-api-access-gx6q2\") pod \"redhat-marketplace-lvm2w\" (UID: \"e17b3c66-44be-43e5-9965-1f54658cf79b\") " pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.744358 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17b3c66-44be-43e5-9965-1f54658cf79b-catalog-content\") pod \"redhat-marketplace-lvm2w\" (UID: \"e17b3c66-44be-43e5-9965-1f54658cf79b\") " pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.846791 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx6q2\" (UniqueName: \"kubernetes.io/projected/e17b3c66-44be-43e5-9965-1f54658cf79b-kube-api-access-gx6q2\") pod \"redhat-marketplace-lvm2w\" (UID: \"e17b3c66-44be-43e5-9965-1f54658cf79b\") " pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.846966 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17b3c66-44be-43e5-9965-1f54658cf79b-catalog-content\") pod \"redhat-marketplace-lvm2w\" (UID: \"e17b3c66-44be-43e5-9965-1f54658cf79b\") " pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.847105 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e17b3c66-44be-43e5-9965-1f54658cf79b-utilities\") pod \"redhat-marketplace-lvm2w\" (UID: \"e17b3c66-44be-43e5-9965-1f54658cf79b\") " pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.847658 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e17b3c66-44be-43e5-9965-1f54658cf79b-utilities\") pod \"redhat-marketplace-lvm2w\" (UID: \"e17b3c66-44be-43e5-9965-1f54658cf79b\") " pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.847659 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17b3c66-44be-43e5-9965-1f54658cf79b-catalog-content\") pod \"redhat-marketplace-lvm2w\" (UID: \"e17b3c66-44be-43e5-9965-1f54658cf79b\") " pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.865763 4723 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-gx6q2\" (UniqueName: \"kubernetes.io/projected/e17b3c66-44be-43e5-9965-1f54658cf79b-kube-api-access-gx6q2\") pod \"redhat-marketplace-lvm2w\" (UID: \"e17b3c66-44be-43e5-9965-1f54658cf79b\") " pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:29:57 crc kubenswrapper[4723]: I0309 14:29:57.933442 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:29:58 crc kubenswrapper[4723]: I0309 14:29:58.517441 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvm2w"] Mar 09 14:29:59 crc kubenswrapper[4723]: I0309 14:29:59.165166 4723 generic.go:334] "Generic (PLEG): container finished" podID="e17b3c66-44be-43e5-9965-1f54658cf79b" containerID="ff6a3cc7ad5763a70542ee49497c02cbd7cb18ce02c8cfee6108db3d8d21e440" exitCode=0 Mar 09 14:29:59 crc kubenswrapper[4723]: I0309 14:29:59.165484 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvm2w" event={"ID":"e17b3c66-44be-43e5-9965-1f54658cf79b","Type":"ContainerDied","Data":"ff6a3cc7ad5763a70542ee49497c02cbd7cb18ce02c8cfee6108db3d8d21e440"} Mar 09 14:29:59 crc kubenswrapper[4723]: I0309 14:29:59.165516 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvm2w" event={"ID":"e17b3c66-44be-43e5-9965-1f54658cf79b","Type":"ContainerStarted","Data":"f5f9e4106c6d43f2b55cefe667e5268f55b04897394b2c553b06b19ae0fda990"} Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.153482 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551110-457qr"] Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.155907 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-457qr" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.157370 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.157605 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.161661 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.168271 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb"] Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.171898 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.175325 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.181760 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551110-457qr"] Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.185534 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvm2w" event={"ID":"e17b3c66-44be-43e5-9965-1f54658cf79b","Type":"ContainerStarted","Data":"8b253ca3866c7def88a2099e6392314e1fe14434a0e0134f65c2f92c052ddc64"} Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.186217 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.195075 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb"] Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.315371 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqqv9\" (UniqueName: \"kubernetes.io/projected/7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f-kube-api-access-kqqv9\") pod \"auto-csr-approver-29551110-457qr\" (UID: \"7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f\") " pod="openshift-infra/auto-csr-approver-29551110-457qr" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.315822 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-config-volume\") pod \"collect-profiles-29551110-hvwqb\" (UID: \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.315955 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqnz5\" (UniqueName: \"kubernetes.io/projected/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-kube-api-access-tqnz5\") pod \"collect-profiles-29551110-hvwqb\" (UID: \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.316039 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-secret-volume\") pod \"collect-profiles-29551110-hvwqb\" (UID: \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.418431 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqqv9\" (UniqueName: \"kubernetes.io/projected/7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f-kube-api-access-kqqv9\") pod \"auto-csr-approver-29551110-457qr\" (UID: \"7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f\") " pod="openshift-infra/auto-csr-approver-29551110-457qr" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.418616 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-config-volume\") pod \"collect-profiles-29551110-hvwqb\" (UID: \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.418665 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqnz5\" (UniqueName: \"kubernetes.io/projected/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-kube-api-access-tqnz5\") pod \"collect-profiles-29551110-hvwqb\" (UID: \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.418742 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-secret-volume\") pod \"collect-profiles-29551110-hvwqb\" (UID: \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.419627 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-config-volume\") pod \"collect-profiles-29551110-hvwqb\" (UID: \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.424813 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-secret-volume\") pod \"collect-profiles-29551110-hvwqb\" (UID: \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.436084 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqqv9\" (UniqueName: \"kubernetes.io/projected/7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f-kube-api-access-kqqv9\") pod \"auto-csr-approver-29551110-457qr\" (UID: \"7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f\") " pod="openshift-infra/auto-csr-approver-29551110-457qr" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.442367 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqnz5\" (UniqueName: \"kubernetes.io/projected/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-kube-api-access-tqnz5\") pod \"collect-profiles-29551110-hvwqb\" (UID: \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.482393 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-457qr" Mar 09 14:30:00 crc kubenswrapper[4723]: I0309 14:30:00.500852 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" Mar 09 14:30:01 crc kubenswrapper[4723]: W0309 14:30:01.097678 4723 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d9a7751_6aa3_4b5f_a74e_f3cf0e469f3f.slice/crio-542bd0e53a69cac389972aa1ef864dc556e91f3062b935c3aa4cb8361c8a0db3 WatchSource:0}: Error finding container 542bd0e53a69cac389972aa1ef864dc556e91f3062b935c3aa4cb8361c8a0db3: Status 404 returned error can't find the container with id 542bd0e53a69cac389972aa1ef864dc556e91f3062b935c3aa4cb8361c8a0db3 Mar 09 14:30:01 crc kubenswrapper[4723]: I0309 14:30:01.157876 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551110-457qr"] Mar 09 14:30:01 crc kubenswrapper[4723]: I0309 14:30:01.181756 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb"] Mar 09 14:30:01 crc kubenswrapper[4723]: I0309 14:30:01.209111 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551110-457qr" event={"ID":"7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f","Type":"ContainerStarted","Data":"542bd0e53a69cac389972aa1ef864dc556e91f3062b935c3aa4cb8361c8a0db3"} Mar 09 14:30:02 crc kubenswrapper[4723]: I0309 14:30:02.220991 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" event={"ID":"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7","Type":"ContainerStarted","Data":"cee420aca2590da452336ed1123ef3b03e78636c26909f3107f43a10b65dc493"} Mar 09 14:30:02 crc kubenswrapper[4723]: I0309 14:30:02.221970 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" event={"ID":"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7","Type":"ContainerStarted","Data":"dc6a9456ef396c49ffe170c481c0367f80ee0800c8ad935ac2de889bf70d9c22"} Mar 09 14:30:02 crc kubenswrapper[4723]: I0309 14:30:02.223525 4723 generic.go:334] "Generic (PLEG): container finished" podID="e17b3c66-44be-43e5-9965-1f54658cf79b" containerID="8b253ca3866c7def88a2099e6392314e1fe14434a0e0134f65c2f92c052ddc64" exitCode=0 Mar 09 14:30:02 crc kubenswrapper[4723]: I0309 14:30:02.223558 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvm2w" event={"ID":"e17b3c66-44be-43e5-9965-1f54658cf79b","Type":"ContainerDied","Data":"8b253ca3866c7def88a2099e6392314e1fe14434a0e0134f65c2f92c052ddc64"} Mar 09 14:30:02 crc kubenswrapper[4723]: I0309 14:30:02.244064 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" podStartSLOduration=2.24404219 podStartE2EDuration="2.24404219s" podCreationTimestamp="2026-03-09 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:30:02.239395476 +0000 UTC m=+5476.253863036" watchObservedRunningTime="2026-03-09 14:30:02.24404219 +0000 UTC m=+5476.258509730" Mar 09 14:30:02 crc kubenswrapper[4723]: I0309 14:30:02.882388 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:30:02 crc kubenswrapper[4723]: E0309 14:30:02.883076 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:30:03 crc kubenswrapper[4723]: I0309 14:30:03.237818 4723 generic.go:334] "Generic (PLEG): container finished" podID="7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7" containerID="cee420aca2590da452336ed1123ef3b03e78636c26909f3107f43a10b65dc493" exitCode=0 Mar 09 14:30:03 crc kubenswrapper[4723]: I0309 14:30:03.237924 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" event={"ID":"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7","Type":"ContainerDied","Data":"cee420aca2590da452336ed1123ef3b03e78636c26909f3107f43a10b65dc493"} Mar 09 14:30:04 crc kubenswrapper[4723]: I0309 14:30:04.732326 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" Mar 09 14:30:04 crc kubenswrapper[4723]: I0309 14:30:04.857919 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-config-volume\") pod \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\" (UID: \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\") " Mar 09 14:30:04 crc kubenswrapper[4723]: I0309 14:30:04.858080 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-secret-volume\") pod \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\" (UID: \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\") " Mar 09 14:30:04 crc kubenswrapper[4723]: I0309 14:30:04.858182 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqnz5\" (UniqueName: \"kubernetes.io/projected/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-kube-api-access-tqnz5\") pod \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\" (UID: \"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7\") " Mar 09 14:30:04 crc kubenswrapper[4723]: I0309 14:30:04.859735 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-config-volume" (OuterVolumeSpecName: "config-volume") pod "7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7" (UID: "7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:04 crc kubenswrapper[4723]: I0309 14:30:04.867415 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7" (UID: "7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:30:04 crc kubenswrapper[4723]: I0309 14:30:04.868085 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-kube-api-access-tqnz5" (OuterVolumeSpecName: "kube-api-access-tqnz5") pod "7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7" (UID: "7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7"). InnerVolumeSpecName "kube-api-access-tqnz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:04 crc kubenswrapper[4723]: I0309 14:30:04.961606 4723 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:04 crc kubenswrapper[4723]: I0309 14:30:04.961662 4723 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:04 crc kubenswrapper[4723]: I0309 14:30:04.961676 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqnz5\" (UniqueName: \"kubernetes.io/projected/7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7-kube-api-access-tqnz5\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:05 crc kubenswrapper[4723]: I0309 14:30:05.260307 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" Mar 09 14:30:05 crc kubenswrapper[4723]: I0309 14:30:05.260449 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-hvwqb" event={"ID":"7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7","Type":"ContainerDied","Data":"dc6a9456ef396c49ffe170c481c0367f80ee0800c8ad935ac2de889bf70d9c22"} Mar 09 14:30:05 crc kubenswrapper[4723]: I0309 14:30:05.260738 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc6a9456ef396c49ffe170c481c0367f80ee0800c8ad935ac2de889bf70d9c22" Mar 09 14:30:05 crc kubenswrapper[4723]: I0309 14:30:05.263615 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551110-457qr" event={"ID":"7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f","Type":"ContainerStarted","Data":"18aae67cce9201b9f67d9c541b78349b0c793e62744381c5c05530bb7c8eb8c4"} Mar 09 14:30:05 crc kubenswrapper[4723]: I0309 14:30:05.294963 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551110-457qr" podStartSLOduration=2.009116773 podStartE2EDuration="5.294935972s" podCreationTimestamp="2026-03-09 14:30:00 +0000 UTC" firstStartedPulling="2026-03-09 14:30:01.102005095 +0000 UTC m=+5475.116472635" lastFinishedPulling="2026-03-09 14:30:04.387824294 +0000 UTC m=+5478.402291834" observedRunningTime="2026-03-09 14:30:05.279728107 +0000 UTC m=+5479.294195647" watchObservedRunningTime="2026-03-09 14:30:05.294935972 +0000 UTC m=+5479.309403512" Mar 09 14:30:05 crc kubenswrapper[4723]: I0309 14:30:05.324541 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792"] Mar 09 14:30:05 crc kubenswrapper[4723]: I0309 14:30:05.336118 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-8m792"] Mar 09 14:30:06 crc kubenswrapper[4723]: I0309 14:30:06.295850 4723 generic.go:334] "Generic (PLEG): container finished" podID="7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f" containerID="18aae67cce9201b9f67d9c541b78349b0c793e62744381c5c05530bb7c8eb8c4" exitCode=0 Mar 09 14:30:06 crc kubenswrapper[4723]: I0309 14:30:06.295971 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551110-457qr" 
event={"ID":"7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f","Type":"ContainerDied","Data":"18aae67cce9201b9f67d9c541b78349b0c793e62744381c5c05530bb7c8eb8c4"} Mar 09 14:30:06 crc kubenswrapper[4723]: I0309 14:30:06.899096 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9125e69b-8a83-46bc-9f9b-d23390153693" path="/var/lib/kubelet/pods/9125e69b-8a83-46bc-9f9b-d23390153693/volumes" Mar 09 14:30:07 crc kubenswrapper[4723]: I0309 14:30:07.772723 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-457qr" Mar 09 14:30:07 crc kubenswrapper[4723]: I0309 14:30:07.930416 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqqv9\" (UniqueName: \"kubernetes.io/projected/7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f-kube-api-access-kqqv9\") pod \"7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f\" (UID: \"7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f\") " Mar 09 14:30:07 crc kubenswrapper[4723]: I0309 14:30:07.936670 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f-kube-api-access-kqqv9" (OuterVolumeSpecName: "kube-api-access-kqqv9") pod "7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f" (UID: "7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f"). InnerVolumeSpecName "kube-api-access-kqqv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:08 crc kubenswrapper[4723]: I0309 14:30:08.033892 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqqv9\" (UniqueName: \"kubernetes.io/projected/7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f-kube-api-access-kqqv9\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:08 crc kubenswrapper[4723]: I0309 14:30:08.321105 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvm2w" event={"ID":"e17b3c66-44be-43e5-9965-1f54658cf79b","Type":"ContainerStarted","Data":"a3e2c91c8a59cf3e4cdca3c7a54c31130d7d44740a618c6e4c4a1bf531d2d057"} Mar 09 14:30:08 crc kubenswrapper[4723]: I0309 14:30:08.322934 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-457qr" Mar 09 14:30:08 crc kubenswrapper[4723]: I0309 14:30:08.322969 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551110-457qr" event={"ID":"7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f","Type":"ContainerDied","Data":"542bd0e53a69cac389972aa1ef864dc556e91f3062b935c3aa4cb8361c8a0db3"} Mar 09 14:30:08 crc kubenswrapper[4723]: I0309 14:30:08.322997 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="542bd0e53a69cac389972aa1ef864dc556e91f3062b935c3aa4cb8361c8a0db3" Mar 09 14:30:08 crc kubenswrapper[4723]: I0309 14:30:08.355042 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-h5bhz"] Mar 09 14:30:08 crc kubenswrapper[4723]: I0309 14:30:08.366570 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-h5bhz"] Mar 09 14:30:08 crc kubenswrapper[4723]: I0309 14:30:08.373576 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lvm2w" podStartSLOduration=3.218621148 podStartE2EDuration="11.373555802s" podCreationTimestamp="2026-03-09 14:29:57 +0000 UTC" firstStartedPulling="2026-03-09 14:29:59.168142892 +0000 UTC m=+5473.182610422" lastFinishedPulling="2026-03-09 14:30:07.323077536 +0000 UTC m=+5481.337545076" observedRunningTime="2026-03-09 14:30:08.352516561 +0000 UTC m=+5482.366984101" watchObservedRunningTime="2026-03-09 14:30:08.373555802 +0000 UTC m=+5482.388023342" Mar 09 14:30:08 crc kubenswrapper[4723]: I0309 14:30:08.896742 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d01f5e3-e2bb-47ac-ae93-5588acc2c783" path="/var/lib/kubelet/pods/6d01f5e3-e2bb-47ac-ae93-5588acc2c783/volumes" Mar 09 14:30:15 crc kubenswrapper[4723]: I0309 14:30:15.881792 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:30:15 crc kubenswrapper[4723]: E0309 14:30:15.882696 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:30:17 crc kubenswrapper[4723]: I0309 14:30:17.934693 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:30:17 crc kubenswrapper[4723]: I0309 14:30:17.935008 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:30:17 crc kubenswrapper[4723]: I0309 14:30:17.996155 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:30:18 crc kubenswrapper[4723]: I0309 14:30:18.483017 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:30:18 crc kubenswrapper[4723]: I0309 14:30:18.531409 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvm2w"] Mar 09 14:30:20 crc kubenswrapper[4723]: I0309 14:30:20.450125 4723 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lvm2w" podUID="e17b3c66-44be-43e5-9965-1f54658cf79b" containerName="registry-server" containerID="cri-o://a3e2c91c8a59cf3e4cdca3c7a54c31130d7d44740a618c6e4c4a1bf531d2d057" gracePeriod=2 Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.053215 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.161006 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17b3c66-44be-43e5-9965-1f54658cf79b-catalog-content\") pod \"e17b3c66-44be-43e5-9965-1f54658cf79b\" (UID: \"e17b3c66-44be-43e5-9965-1f54658cf79b\") " Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.161176 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx6q2\" (UniqueName: \"kubernetes.io/projected/e17b3c66-44be-43e5-9965-1f54658cf79b-kube-api-access-gx6q2\") pod \"e17b3c66-44be-43e5-9965-1f54658cf79b\" (UID: \"e17b3c66-44be-43e5-9965-1f54658cf79b\") " Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.161284 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e17b3c66-44be-43e5-9965-1f54658cf79b-utilities\") pod \"e17b3c66-44be-43e5-9965-1f54658cf79b\" (UID: \"e17b3c66-44be-43e5-9965-1f54658cf79b\") " Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.163113 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e17b3c66-44be-43e5-9965-1f54658cf79b-utilities" (OuterVolumeSpecName: "utilities") pod "e17b3c66-44be-43e5-9965-1f54658cf79b" (UID: "e17b3c66-44be-43e5-9965-1f54658cf79b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.180203 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17b3c66-44be-43e5-9965-1f54658cf79b-kube-api-access-gx6q2" (OuterVolumeSpecName: "kube-api-access-gx6q2") pod "e17b3c66-44be-43e5-9965-1f54658cf79b" (UID: "e17b3c66-44be-43e5-9965-1f54658cf79b"). InnerVolumeSpecName "kube-api-access-gx6q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.194303 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e17b3c66-44be-43e5-9965-1f54658cf79b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e17b3c66-44be-43e5-9965-1f54658cf79b" (UID: "e17b3c66-44be-43e5-9965-1f54658cf79b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.264593 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17b3c66-44be-43e5-9965-1f54658cf79b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.264640 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx6q2\" (UniqueName: \"kubernetes.io/projected/e17b3c66-44be-43e5-9965-1f54658cf79b-kube-api-access-gx6q2\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.264655 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e17b3c66-44be-43e5-9965-1f54658cf79b-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.461591 4723 generic.go:334] "Generic (PLEG): container finished" podID="e17b3c66-44be-43e5-9965-1f54658cf79b" containerID="a3e2c91c8a59cf3e4cdca3c7a54c31130d7d44740a618c6e4c4a1bf531d2d057" exitCode=0 Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.461641 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvm2w" event={"ID":"e17b3c66-44be-43e5-9965-1f54658cf79b","Type":"ContainerDied","Data":"a3e2c91c8a59cf3e4cdca3c7a54c31130d7d44740a618c6e4c4a1bf531d2d057"} Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.461671 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvm2w" event={"ID":"e17b3c66-44be-43e5-9965-1f54658cf79b","Type":"ContainerDied","Data":"f5f9e4106c6d43f2b55cefe667e5268f55b04897394b2c553b06b19ae0fda990"} Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.461692 4723 scope.go:117] "RemoveContainer" containerID="a3e2c91c8a59cf3e4cdca3c7a54c31130d7d44740a618c6e4c4a1bf531d2d057" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.462847 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvm2w" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.483516 4723 scope.go:117] "RemoveContainer" containerID="8b253ca3866c7def88a2099e6392314e1fe14434a0e0134f65c2f92c052ddc64" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.513787 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvm2w"] Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.523582 4723 scope.go:117] "RemoveContainer" containerID="ff6a3cc7ad5763a70542ee49497c02cbd7cb18ce02c8cfee6108db3d8d21e440" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.525498 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvm2w"] Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.569217 4723 scope.go:117] "RemoveContainer" containerID="a3e2c91c8a59cf3e4cdca3c7a54c31130d7d44740a618c6e4c4a1bf531d2d057" Mar 09 14:30:21 crc kubenswrapper[4723]: E0309 14:30:21.569998 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3e2c91c8a59cf3e4cdca3c7a54c31130d7d44740a618c6e4c4a1bf531d2d057\": container with ID starting with a3e2c91c8a59cf3e4cdca3c7a54c31130d7d44740a618c6e4c4a1bf531d2d057 not found: ID does not exist" containerID="a3e2c91c8a59cf3e4cdca3c7a54c31130d7d44740a618c6e4c4a1bf531d2d057" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.570051 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e2c91c8a59cf3e4cdca3c7a54c31130d7d44740a618c6e4c4a1bf531d2d057"} err="failed to get container status \"a3e2c91c8a59cf3e4cdca3c7a54c31130d7d44740a618c6e4c4a1bf531d2d057\": rpc error: code = NotFound desc = could not find container \"a3e2c91c8a59cf3e4cdca3c7a54c31130d7d44740a618c6e4c4a1bf531d2d057\": container with ID starting with a3e2c91c8a59cf3e4cdca3c7a54c31130d7d44740a618c6e4c4a1bf531d2d057 not found: ID does not exist" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.570083 4723 scope.go:117] "RemoveContainer" containerID="8b253ca3866c7def88a2099e6392314e1fe14434a0e0134f65c2f92c052ddc64" Mar 09 14:30:21 crc kubenswrapper[4723]: E0309 14:30:21.572056 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b253ca3866c7def88a2099e6392314e1fe14434a0e0134f65c2f92c052ddc64\": container with ID starting with 8b253ca3866c7def88a2099e6392314e1fe14434a0e0134f65c2f92c052ddc64 not found: ID does not exist" containerID="8b253ca3866c7def88a2099e6392314e1fe14434a0e0134f65c2f92c052ddc64" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.572128 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b253ca3866c7def88a2099e6392314e1fe14434a0e0134f65c2f92c052ddc64"} err="failed to get container status \"8b253ca3866c7def88a2099e6392314e1fe14434a0e0134f65c2f92c052ddc64\": rpc error: code = NotFound desc = could not find container \"8b253ca3866c7def88a2099e6392314e1fe14434a0e0134f65c2f92c052ddc64\": container with ID starting with 8b253ca3866c7def88a2099e6392314e1fe14434a0e0134f65c2f92c052ddc64 not found: ID does not exist" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.572154 4723 scope.go:117] "RemoveContainer" containerID="ff6a3cc7ad5763a70542ee49497c02cbd7cb18ce02c8cfee6108db3d8d21e440" Mar 09 14:30:21 crc kubenswrapper[4723]: E0309 14:30:21.572607 4723 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ff6a3cc7ad5763a70542ee49497c02cbd7cb18ce02c8cfee6108db3d8d21e440\": container with ID starting with ff6a3cc7ad5763a70542ee49497c02cbd7cb18ce02c8cfee6108db3d8d21e440 not found: ID does not exist" containerID="ff6a3cc7ad5763a70542ee49497c02cbd7cb18ce02c8cfee6108db3d8d21e440" Mar 09 14:30:21 crc kubenswrapper[4723]: I0309 14:30:21.572653 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff6a3cc7ad5763a70542ee49497c02cbd7cb18ce02c8cfee6108db3d8d21e440"} err="failed to get container status \"ff6a3cc7ad5763a70542ee49497c02cbd7cb18ce02c8cfee6108db3d8d21e440\": rpc error: code = NotFound desc = could not find container \"ff6a3cc7ad5763a70542ee49497c02cbd7cb18ce02c8cfee6108db3d8d21e440\": container with ID starting with ff6a3cc7ad5763a70542ee49497c02cbd7cb18ce02c8cfee6108db3d8d21e440 not found: ID does not exist" Mar 09 14:30:22 crc kubenswrapper[4723]: I0309 14:30:22.899650 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17b3c66-44be-43e5-9965-1f54658cf79b" path="/var/lib/kubelet/pods/e17b3c66-44be-43e5-9965-1f54658cf79b/volumes" Mar 09 14:30:27 crc kubenswrapper[4723]: I0309 14:30:27.881380 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:30:27 crc kubenswrapper[4723]: E0309 14:30:27.882180 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:30:30 crc kubenswrapper[4723]: I0309 14:30:30.369748 4723 scope.go:117] "RemoveContainer" containerID="540c229dfc51d5227ad146e94b46462c0af7277540b21fc180e4c43f62c21f77" Mar 09 14:30:30 crc kubenswrapper[4723]: I0309 14:30:30.429350 4723 scope.go:117] "RemoveContainer" containerID="24537ded5078dc0bf8c927aeb0212378e10cd41dcb3eb2d9b31b1c93eb8fce85" Mar 09 14:30:41 crc kubenswrapper[4723]: I0309 14:30:41.881521 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:30:41 crc kubenswrapper[4723]: E0309 14:30:41.882458 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:30:50 crc kubenswrapper[4723]: I0309 14:30:50.798829 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mkth5"] Mar 09 14:30:50 crc kubenswrapper[4723]: E0309 14:30:50.799752 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17b3c66-44be-43e5-9965-1f54658cf79b" containerName="extract-content" Mar 09 14:30:50 crc kubenswrapper[4723]: I0309 14:30:50.799766 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17b3c66-44be-43e5-9965-1f54658cf79b" containerName="extract-content" Mar 09 14:30:50 crc kubenswrapper[4723]: E0309 14:30:50.799792 4723 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e17b3c66-44be-43e5-9965-1f54658cf79b" containerName="registry-server" Mar 09 14:30:50 crc kubenswrapper[4723]: I0309 14:30:50.799799 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17b3c66-44be-43e5-9965-1f54658cf79b" containerName="registry-server" Mar 09 14:30:50 crc kubenswrapper[4723]: E0309 14:30:50.799813 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f" containerName="oc" Mar 09 14:30:50 crc kubenswrapper[4723]: I0309 14:30:50.799821 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f" containerName="oc" Mar 09 14:30:50 crc kubenswrapper[4723]: E0309 14:30:50.799840 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17b3c66-44be-43e5-9965-1f54658cf79b" containerName="extract-utilities" Mar 09 14:30:50 crc kubenswrapper[4723]: I0309 14:30:50.799846 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17b3c66-44be-43e5-9965-1f54658cf79b" containerName="extract-utilities" Mar 09 14:30:50 crc kubenswrapper[4723]: E0309 14:30:50.799853 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7" containerName="collect-profiles" Mar 09 14:30:50 crc kubenswrapper[4723]: I0309 14:30:50.799874 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7" containerName="collect-profiles" Mar 09 14:30:50 crc kubenswrapper[4723]: I0309 14:30:50.800076 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="e17b3c66-44be-43e5-9965-1f54658cf79b" containerName="registry-server" Mar 09 14:30:50 crc kubenswrapper[4723]: I0309 14:30:50.800101 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d9a7751-6aa3-4b5f-a74e-f3cf0e469f3f" containerName="oc" Mar 09 14:30:50 crc kubenswrapper[4723]: I0309 14:30:50.800113 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab27c0b-3e4f-4cd9-84ae-959a085cb0a7" containerName="collect-profiles" Mar 09 14:30:50 crc kubenswrapper[4723]: I0309 14:30:50.801679 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:30:50 crc kubenswrapper[4723]: I0309 14:30:50.825635 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mkth5"] Mar 09 14:30:50 crc kubenswrapper[4723]: I0309 14:30:50.919053 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee517e1d-7784-48a3-9e46-40ce568f7bf1-catalog-content\") pod \"certified-operators-mkth5\" (UID: \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\") " pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:30:50 crc kubenswrapper[4723]: I0309 14:30:50.919169 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx5gf\" (UniqueName: \"kubernetes.io/projected/ee517e1d-7784-48a3-9e46-40ce568f7bf1-kube-api-access-xx5gf\") pod \"certified-operators-mkth5\" (UID: \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\") " pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:30:50 crc kubenswrapper[4723]: I0309 14:30:50.919260 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee517e1d-7784-48a3-9e46-40ce568f7bf1-utilities\") pod \"certified-operators-mkth5\" (UID: \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\") " pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:30:51 crc kubenswrapper[4723]: I0309 14:30:51.021097 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee517e1d-7784-48a3-9e46-40ce568f7bf1-catalog-content\") pod \"certified-operators-mkth5\" (UID: \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\") " pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:30:51 crc kubenswrapper[4723]: I0309 14:30:51.021200 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx5gf\" (UniqueName: \"kubernetes.io/projected/ee517e1d-7784-48a3-9e46-40ce568f7bf1-kube-api-access-xx5gf\") pod \"certified-operators-mkth5\" (UID: \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\") " pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:30:51 crc kubenswrapper[4723]: I0309 14:30:51.021299 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee517e1d-7784-48a3-9e46-40ce568f7bf1-utilities\") pod \"certified-operators-mkth5\" (UID: \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\") " pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:30:51 crc kubenswrapper[4723]: I0309 14:30:51.022213 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee517e1d-7784-48a3-9e46-40ce568f7bf1-catalog-content\") pod \"certified-operators-mkth5\" (UID: \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\") " pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:30:51 crc kubenswrapper[4723]: I0309 14:30:51.022232 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee517e1d-7784-48a3-9e46-40ce568f7bf1-utilities\") pod \"certified-operators-mkth5\" (UID: \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\") " pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:30:51 crc kubenswrapper[4723]: I0309 14:30:51.054900 4723 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xx5gf\" (UniqueName: \"kubernetes.io/projected/ee517e1d-7784-48a3-9e46-40ce568f7bf1-kube-api-access-xx5gf\") pod \"certified-operators-mkth5\" (UID: \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\") " pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:30:51 crc kubenswrapper[4723]: I0309 14:30:51.139068 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:30:51 crc kubenswrapper[4723]: I0309 14:30:51.648636 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mkth5"] Mar 09 14:30:51 crc kubenswrapper[4723]: I0309 14:30:51.813774 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkth5" event={"ID":"ee517e1d-7784-48a3-9e46-40ce568f7bf1","Type":"ContainerStarted","Data":"d9203578ecddf8ae86cc33dcbee29d79217d72c032840a347e1d0fba110fafa3"} Mar 09 14:30:52 crc kubenswrapper[4723]: I0309 14:30:52.835608 4723 generic.go:334] "Generic (PLEG): container finished" podID="ee517e1d-7784-48a3-9e46-40ce568f7bf1" containerID="bae6f425e28fe2567ecb92e5fb6795d189304d8d107bec0d4864ef5807ab5ae0" exitCode=0 Mar 09 14:30:52 crc kubenswrapper[4723]: I0309 14:30:52.835698 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkth5" event={"ID":"ee517e1d-7784-48a3-9e46-40ce568f7bf1","Type":"ContainerDied","Data":"bae6f425e28fe2567ecb92e5fb6795d189304d8d107bec0d4864ef5807ab5ae0"} Mar 09 14:30:53 crc kubenswrapper[4723]: I0309 14:30:53.882006 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:30:53 crc kubenswrapper[4723]: E0309 14:30:53.882503 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:30:54 crc kubenswrapper[4723]: I0309 14:30:54.860316 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkth5" event={"ID":"ee517e1d-7784-48a3-9e46-40ce568f7bf1","Type":"ContainerStarted","Data":"bac7dad6ad8d516d50a6f1228fc8677a1c153be2a43b94eb036c9305c05ee1d1"} Mar 09 14:30:56 crc kubenswrapper[4723]: I0309 14:30:56.881802 4723 generic.go:334] "Generic (PLEG): container finished" podID="ee517e1d-7784-48a3-9e46-40ce568f7bf1" containerID="bac7dad6ad8d516d50a6f1228fc8677a1c153be2a43b94eb036c9305c05ee1d1" exitCode=0 Mar 09 14:30:56 crc kubenswrapper[4723]: I0309 14:30:56.909121 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkth5" event={"ID":"ee517e1d-7784-48a3-9e46-40ce568f7bf1","Type":"ContainerDied","Data":"bac7dad6ad8d516d50a6f1228fc8677a1c153be2a43b94eb036c9305c05ee1d1"} Mar 09 14:30:58 crc kubenswrapper[4723]: I0309 14:30:58.911620 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkth5" event={"ID":"ee517e1d-7784-48a3-9e46-40ce568f7bf1","Type":"ContainerStarted","Data":"6c102853755567674826e11b5f9b6671f798fcec5ef8c918198f05d9d47bcc8f"} Mar 09 14:30:58 crc kubenswrapper[4723]: I0309 14:30:58.941639 4723 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mkth5" podStartSLOduration=3.425690077 podStartE2EDuration="8.94161376s" podCreationTimestamp="2026-03-09 14:30:50 +0000 UTC" firstStartedPulling="2026-03-09 14:30:52.837660907 +0000 UTC m=+5526.852128447" lastFinishedPulling="2026-03-09 14:30:58.35358459 +0000 UTC m=+5532.368052130" observedRunningTime="2026-03-09 14:30:58.932795385 +0000 UTC m=+5532.947262925" watchObservedRunningTime="2026-03-09 14:30:58.94161376 +0000 UTC m=+5532.956081300" Mar 09 14:31:01 crc kubenswrapper[4723]: I0309 14:31:01.140381 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:31:01 crc kubenswrapper[4723]: I0309 14:31:01.140974 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:31:02 crc kubenswrapper[4723]: I0309 14:31:02.188093 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mkth5" podUID="ee517e1d-7784-48a3-9e46-40ce568f7bf1" containerName="registry-server" probeResult="failure" output=< Mar 09 14:31:02 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:31:02 crc kubenswrapper[4723]: > Mar 09 14:31:07 crc kubenswrapper[4723]: I0309 14:31:07.881548 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:31:07 crc kubenswrapper[4723]: E0309 14:31:07.882388 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:31:12 crc kubenswrapper[4723]: I0309 14:31:12.191177 4723 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mkth5" podUID="ee517e1d-7784-48a3-9e46-40ce568f7bf1" containerName="registry-server" probeResult="failure" output=< Mar 09 14:31:12 crc kubenswrapper[4723]: timeout: failed to connect service ":50051" within 1s Mar 09 14:31:12 crc kubenswrapper[4723]: > Mar 09 14:31:18 crc kubenswrapper[4723]: I0309 14:31:18.881360 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:31:18 crc kubenswrapper[4723]: E0309 14:31:18.882542 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:31:21 crc kubenswrapper[4723]: I0309 14:31:21.200576 4723 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:31:21 crc kubenswrapper[4723]: I0309 14:31:21.262535 4723 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:31:22 crc kubenswrapper[4723]: I0309 
14:31:22.003974 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mkth5"] Mar 09 14:31:23 crc kubenswrapper[4723]: I0309 14:31:23.178256 4723 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mkth5" podUID="ee517e1d-7784-48a3-9e46-40ce568f7bf1" containerName="registry-server" containerID="cri-o://6c102853755567674826e11b5f9b6671f798fcec5ef8c918198f05d9d47bcc8f" gracePeriod=2 Mar 09 14:31:23 crc kubenswrapper[4723]: I0309 14:31:23.817964 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:31:23 crc kubenswrapper[4723]: I0309 14:31:23.955436 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee517e1d-7784-48a3-9e46-40ce568f7bf1-catalog-content\") pod \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\" (UID: \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\") " Mar 09 14:31:23 crc kubenswrapper[4723]: I0309 14:31:23.955523 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee517e1d-7784-48a3-9e46-40ce568f7bf1-utilities\") pod \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\" (UID: \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\") " Mar 09 14:31:23 crc kubenswrapper[4723]: I0309 14:31:23.955580 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx5gf\" (UniqueName: \"kubernetes.io/projected/ee517e1d-7784-48a3-9e46-40ce568f7bf1-kube-api-access-xx5gf\") pod \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\" (UID: \"ee517e1d-7784-48a3-9e46-40ce568f7bf1\") " Mar 09 14:31:23 crc kubenswrapper[4723]: I0309 14:31:23.958634 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee517e1d-7784-48a3-9e46-40ce568f7bf1-utilities" (OuterVolumeSpecName: "utilities") pod "ee517e1d-7784-48a3-9e46-40ce568f7bf1" (UID: "ee517e1d-7784-48a3-9e46-40ce568f7bf1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:31:23 crc kubenswrapper[4723]: I0309 14:31:23.974063 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee517e1d-7784-48a3-9e46-40ce568f7bf1-kube-api-access-xx5gf" (OuterVolumeSpecName: "kube-api-access-xx5gf") pod "ee517e1d-7784-48a3-9e46-40ce568f7bf1" (UID: "ee517e1d-7784-48a3-9e46-40ce568f7bf1"). InnerVolumeSpecName "kube-api-access-xx5gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.019883 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee517e1d-7784-48a3-9e46-40ce568f7bf1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee517e1d-7784-48a3-9e46-40ce568f7bf1" (UID: "ee517e1d-7784-48a3-9e46-40ce568f7bf1"). InnerVolumeSpecName "catalog-content". 
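
The startup-probe failures above ("timeout: failed to connect service \":50051\" within 1s") are expected while a catalog pod warms up: registry-server serves its gRPC API on port 50051, the probe is a gRPC health check (grpc_health_probe style) with a 1s timeout, and it keeps failing until the catalog index is loaded, at which point the probe flips to started/ready (14:31:21 above). A bare TCP dial is enough to reproduce the timeout behaviour, assuming localhost:50051 as the target; the real probe performs the gRPC health RPC rather than a plain dial:

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "localhost:50051", 1*time.Second)
	if err != nil {
		fmt.Println("probe failure:", err) // mirrors the 1s timeout in the log
		return
	}
	conn.Close()
	fmt.Println("probe success: port is accepting connections")
}
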
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.059292 4723 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee517e1d-7784-48a3-9e46-40ce568f7bf1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.059340 4723 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee517e1d-7784-48a3-9e46-40ce568f7bf1-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.059354 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx5gf\" (UniqueName: \"kubernetes.io/projected/ee517e1d-7784-48a3-9e46-40ce568f7bf1-kube-api-access-xx5gf\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.192919 4723 generic.go:334] "Generic (PLEG): container finished" podID="ee517e1d-7784-48a3-9e46-40ce568f7bf1" containerID="6c102853755567674826e11b5f9b6671f798fcec5ef8c918198f05d9d47bcc8f" exitCode=0 Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.192988 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkth5" event={"ID":"ee517e1d-7784-48a3-9e46-40ce568f7bf1","Type":"ContainerDied","Data":"6c102853755567674826e11b5f9b6671f798fcec5ef8c918198f05d9d47bcc8f"} Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.193062 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mkth5" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.193086 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkth5" event={"ID":"ee517e1d-7784-48a3-9e46-40ce568f7bf1","Type":"ContainerDied","Data":"d9203578ecddf8ae86cc33dcbee29d79217d72c032840a347e1d0fba110fafa3"} Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.193117 4723 scope.go:117] "RemoveContainer" containerID="6c102853755567674826e11b5f9b6671f798fcec5ef8c918198f05d9d47bcc8f" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.230156 4723 scope.go:117] "RemoveContainer" containerID="bac7dad6ad8d516d50a6f1228fc8677a1c153be2a43b94eb036c9305c05ee1d1" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.238777 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mkth5"] Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.252366 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mkth5"] Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.267138 4723 scope.go:117] "RemoveContainer" containerID="bae6f425e28fe2567ecb92e5fb6795d189304d8d107bec0d4864ef5807ab5ae0" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.330547 4723 scope.go:117] "RemoveContainer" containerID="6c102853755567674826e11b5f9b6671f798fcec5ef8c918198f05d9d47bcc8f" Mar 09 14:31:24 crc kubenswrapper[4723]: E0309 14:31:24.331664 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c102853755567674826e11b5f9b6671f798fcec5ef8c918198f05d9d47bcc8f\": container with ID starting with 6c102853755567674826e11b5f9b6671f798fcec5ef8c918198f05d9d47bcc8f not found: ID does not exist" containerID="6c102853755567674826e11b5f9b6671f798fcec5ef8c918198f05d9d47bcc8f" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.331716 
4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c102853755567674826e11b5f9b6671f798fcec5ef8c918198f05d9d47bcc8f"} err="failed to get container status \"6c102853755567674826e11b5f9b6671f798fcec5ef8c918198f05d9d47bcc8f\": rpc error: code = NotFound desc = could not find container \"6c102853755567674826e11b5f9b6671f798fcec5ef8c918198f05d9d47bcc8f\": container with ID starting with 6c102853755567674826e11b5f9b6671f798fcec5ef8c918198f05d9d47bcc8f not found: ID does not exist" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.331741 4723 scope.go:117] "RemoveContainer" containerID="bac7dad6ad8d516d50a6f1228fc8677a1c153be2a43b94eb036c9305c05ee1d1" Mar 09 14:31:24 crc kubenswrapper[4723]: E0309 14:31:24.332129 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac7dad6ad8d516d50a6f1228fc8677a1c153be2a43b94eb036c9305c05ee1d1\": container with ID starting with bac7dad6ad8d516d50a6f1228fc8677a1c153be2a43b94eb036c9305c05ee1d1 not found: ID does not exist" containerID="bac7dad6ad8d516d50a6f1228fc8677a1c153be2a43b94eb036c9305c05ee1d1" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.332152 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac7dad6ad8d516d50a6f1228fc8677a1c153be2a43b94eb036c9305c05ee1d1"} err="failed to get container status \"bac7dad6ad8d516d50a6f1228fc8677a1c153be2a43b94eb036c9305c05ee1d1\": rpc error: code = NotFound desc = could not find container \"bac7dad6ad8d516d50a6f1228fc8677a1c153be2a43b94eb036c9305c05ee1d1\": container with ID starting with bac7dad6ad8d516d50a6f1228fc8677a1c153be2a43b94eb036c9305c05ee1d1 not found: ID does not exist" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.332166 4723 scope.go:117] "RemoveContainer" containerID="bae6f425e28fe2567ecb92e5fb6795d189304d8d107bec0d4864ef5807ab5ae0" Mar 09 14:31:24 crc kubenswrapper[4723]: E0309 14:31:24.332378 4723 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bae6f425e28fe2567ecb92e5fb6795d189304d8d107bec0d4864ef5807ab5ae0\": container with ID starting with bae6f425e28fe2567ecb92e5fb6795d189304d8d107bec0d4864ef5807ab5ae0 not found: ID does not exist" containerID="bae6f425e28fe2567ecb92e5fb6795d189304d8d107bec0d4864ef5807ab5ae0" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.332421 4723 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae6f425e28fe2567ecb92e5fb6795d189304d8d107bec0d4864ef5807ab5ae0"} err="failed to get container status \"bae6f425e28fe2567ecb92e5fb6795d189304d8d107bec0d4864ef5807ab5ae0\": rpc error: code = NotFound desc = could not find container \"bae6f425e28fe2567ecb92e5fb6795d189304d8d107bec0d4864ef5807ab5ae0\": container with ID starting with bae6f425e28fe2567ecb92e5fb6795d189304d8d107bec0d4864ef5807ab5ae0 not found: ID does not exist" Mar 09 14:31:24 crc kubenswrapper[4723]: I0309 14:31:24.893256 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee517e1d-7784-48a3-9e46-40ce568f7bf1" path="/var/lib/kubelet/pods/ee517e1d-7784-48a3-9e46-40ce568f7bf1/volumes" Mar 09 14:31:31 crc kubenswrapper[4723]: I0309 14:31:31.882105 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:31:31 crc kubenswrapper[4723]: E0309 14:31:31.883182 4723 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:31:42 crc kubenswrapper[4723]: I0309 14:31:42.881395 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:31:42 crc kubenswrapper[4723]: E0309 14:31:42.882129 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:31:56 crc kubenswrapper[4723]: I0309 14:31:56.890906 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:31:56 crc kubenswrapper[4723]: E0309 14:31:56.891946 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.152357 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551112-7lwbq"] Mar 09 14:32:00 crc kubenswrapper[4723]: E0309 14:32:00.153444 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee517e1d-7784-48a3-9e46-40ce568f7bf1" containerName="extract-content" Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.153463 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee517e1d-7784-48a3-9e46-40ce568f7bf1" containerName="extract-content" Mar 09 14:32:00 crc kubenswrapper[4723]: E0309 14:32:00.153477 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee517e1d-7784-48a3-9e46-40ce568f7bf1" containerName="registry-server" Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.153485 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee517e1d-7784-48a3-9e46-40ce568f7bf1" containerName="registry-server" Mar 09 14:32:00 crc kubenswrapper[4723]: E0309 14:32:00.153501 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee517e1d-7784-48a3-9e46-40ce568f7bf1" containerName="extract-utilities" Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.153508 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee517e1d-7784-48a3-9e46-40ce568f7bf1" containerName="extract-utilities" Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.153804 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee517e1d-7784-48a3-9e46-40ce568f7bf1" containerName="registry-server" Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.154849 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-7lwbq" Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.159142 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.159412 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x" Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.166291 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.172334 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551112-7lwbq"] Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.313433 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf5sn\" (UniqueName: \"kubernetes.io/projected/91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208-kube-api-access-wf5sn\") pod \"auto-csr-approver-29551112-7lwbq\" (UID: \"91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208\") " pod="openshift-infra/auto-csr-approver-29551112-7lwbq" Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.415621 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf5sn\" (UniqueName: \"kubernetes.io/projected/91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208-kube-api-access-wf5sn\") pod \"auto-csr-approver-29551112-7lwbq\" (UID: \"91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208\") " pod="openshift-infra/auto-csr-approver-29551112-7lwbq" Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.433317 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf5sn\" (UniqueName: \"kubernetes.io/projected/91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208-kube-api-access-wf5sn\") pod \"auto-csr-approver-29551112-7lwbq\" (UID: \"91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208\") " pod="openshift-infra/auto-csr-approver-29551112-7lwbq" Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.491478 4723 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-7lwbq" Mar 09 14:32:00 crc kubenswrapper[4723]: I0309 14:32:00.990140 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551112-7lwbq"] Mar 09 14:32:01 crc kubenswrapper[4723]: I0309 14:32:01.591073 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551112-7lwbq" event={"ID":"91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208","Type":"ContainerStarted","Data":"bbb52cc7962b9d518485baccd1f091f254820f33c93abacc4d6d7e8746611196"} Mar 09 14:32:04 crc kubenswrapper[4723]: I0309 14:32:04.639907 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551112-7lwbq" event={"ID":"91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208","Type":"ContainerStarted","Data":"fdb6366e8304b443eb7dbe47d4fe8fb93422f8d6e628e217e0489dd44e0d791f"} Mar 09 14:32:04 crc kubenswrapper[4723]: I0309 14:32:04.697272 4723 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551112-7lwbq" podStartSLOduration=2.283155105 podStartE2EDuration="4.697245447s" podCreationTimestamp="2026-03-09 14:32:00 +0000 UTC" firstStartedPulling="2026-03-09 14:32:00.999062776 +0000 UTC m=+5595.013530316" lastFinishedPulling="2026-03-09 14:32:03.413153108 +0000 UTC m=+5597.427620658" observedRunningTime="2026-03-09 14:32:04.658116475 +0000 UTC m=+5598.672584025" watchObservedRunningTime="2026-03-09 14:32:04.697245447 +0000 UTC m=+5598.711712987" Mar 09 14:32:06 crc kubenswrapper[4723]: I0309 14:32:06.670772 4723 generic.go:334] "Generic (PLEG): container finished" podID="91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208" containerID="fdb6366e8304b443eb7dbe47d4fe8fb93422f8d6e628e217e0489dd44e0d791f" exitCode=0 Mar 09 14:32:06 crc kubenswrapper[4723]: I0309 14:32:06.670840 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551112-7lwbq" event={"ID":"91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208","Type":"ContainerDied","Data":"fdb6366e8304b443eb7dbe47d4fe8fb93422f8d6e628e217e0489dd44e0d791f"} Mar 09 14:32:07 crc kubenswrapper[4723]: I0309 14:32:07.881378 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:32:07 crc kubenswrapper[4723]: E0309 14:32:07.882385 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:32:08 crc kubenswrapper[4723]: I0309 14:32:08.113284 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-7lwbq" Mar 09 14:32:08 crc kubenswrapper[4723]: I0309 14:32:08.236812 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf5sn\" (UniqueName: \"kubernetes.io/projected/91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208-kube-api-access-wf5sn\") pod \"91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208\" (UID: \"91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208\") " Mar 09 14:32:08 crc kubenswrapper[4723]: I0309 14:32:08.243443 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208-kube-api-access-wf5sn" (OuterVolumeSpecName: "kube-api-access-wf5sn") pod "91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208" (UID: "91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208"). InnerVolumeSpecName "kube-api-access-wf5sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:32:08 crc kubenswrapper[4723]: I0309 14:32:08.342250 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf5sn\" (UniqueName: \"kubernetes.io/projected/91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208-kube-api-access-wf5sn\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:08 crc kubenswrapper[4723]: I0309 14:32:08.694585 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551112-7lwbq" event={"ID":"91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208","Type":"ContainerDied","Data":"bbb52cc7962b9d518485baccd1f091f254820f33c93abacc4d6d7e8746611196"} Mar 09 14:32:08 crc kubenswrapper[4723]: I0309 14:32:08.694631 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbb52cc7962b9d518485baccd1f091f254820f33c93abacc4d6d7e8746611196" Mar 09 14:32:08 crc kubenswrapper[4723]: I0309 14:32:08.694634 4723 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-7lwbq" Mar 09 14:32:08 crc kubenswrapper[4723]: I0309 14:32:08.757743 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-bj2lm"] Mar 09 14:32:08 crc kubenswrapper[4723]: I0309 14:32:08.769492 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-bj2lm"] Mar 09 14:32:08 crc kubenswrapper[4723]: I0309 14:32:08.919500 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c00bf0d-d587-4583-b603-49e8fcccc58e" path="/var/lib/kubelet/pods/8c00bf0d-d587-4583-b603-49e8fcccc58e/volumes" Mar 09 14:32:18 crc kubenswrapper[4723]: I0309 14:32:18.885421 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:32:18 crc kubenswrapper[4723]: E0309 14:32:18.887524 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:32:29 crc kubenswrapper[4723]: I0309 14:32:29.881885 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:32:29 crc kubenswrapper[4723]: E0309 14:32:29.882672 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:32:30 crc kubenswrapper[4723]: I0309 14:32:30.641743 4723 scope.go:117] "RemoveContainer" containerID="3ccfa0433835cc27e5f2ab2213b54943afe940b4eb8d74813aef950db3a4ac8c" Mar 09 14:32:42 crc kubenswrapper[4723]: I0309 14:32:42.880921 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:32:42 crc kubenswrapper[4723]: E0309 14:32:42.881881 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 14:32:54 crc kubenswrapper[4723]: I0309 14:32:54.881506 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b" Mar 09 14:32:54 crc kubenswrapper[4723]: E0309 14:32:54.882493 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e" Mar 09 
Mar 09 14:33:07 crc kubenswrapper[4723]: I0309 14:33:07.881590 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b"
Mar 09 14:33:07 crc kubenswrapper[4723]: E0309 14:33:07.882276 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 14:33:20 crc kubenswrapper[4723]: I0309 14:33:20.881401 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b"
Mar 09 14:33:20 crc kubenswrapper[4723]: E0309 14:33:20.882227 4723 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cfjq2_openshift-machine-config-operator(983d5ed4-cfc7-402a-b226-29dc071c6e4e)\"" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" podUID="983d5ed4-cfc7-402a-b226-29dc071c6e4e"
Mar 09 14:33:34 crc kubenswrapper[4723]: I0309 14:33:34.886479 4723 scope.go:117] "RemoveContainer" containerID="2763ec92ac5283f817f046f9c8b6c15b52cdcda46dee1aa29bf3bbe464a3180b"
Mar 09 14:33:35 crc kubenswrapper[4723]: I0309 14:33:35.884208 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cfjq2" event={"ID":"983d5ed4-cfc7-402a-b226-29dc071c6e4e","Type":"ContainerStarted","Data":"720505182ba68642731cda74dbedd65253deb1585b9bb313c651321f83c1bdce"}
Mar 09 14:34:00 crc kubenswrapper[4723]: I0309 14:34:00.140776 4723 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551114-n8x2g"]
Mar 09 14:34:00 crc kubenswrapper[4723]: E0309 14:34:00.142031 4723 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208" containerName="oc"
Mar 09 14:34:00 crc kubenswrapper[4723]: I0309 14:34:00.142051 4723 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208" containerName="oc"
Mar 09 14:34:00 crc kubenswrapper[4723]: I0309 14:34:00.142347 4723 memory_manager.go:354] "RemoveStaleState removing state" podUID="91fc8b2b-dd76-40a3-b3cc-2c7b23fb7208" containerName="oc"
Mar 09 14:34:00 crc kubenswrapper[4723]: I0309 14:34:00.143405 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-n8x2g"
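At 14:33:34 the back-off finally expires and the next sync restarts the container (the ContainerStarted event at 14:33:35). The RemoveStaleState entries that follow are the CPU and memory managers dropping per-container accounting for the previous run's "oc" container before the new pod is admitted. That SyncLoop ADD lands exactly two minutes after auto-csr-approver-29551112 because the CronJob controller derives each Job name from its scheduled time in minutes since the Unix epoch; decoding the two suffixes reproduces the timestamps in this log:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	// CronJob-created Jobs are named <cronjob>-<scheduled minutes
    	// since the Unix epoch>; the suffixes below come from this log.
    	for _, suffix := range []int64{29551112, 29551114} {
    		t := time.Unix(suffix*60, 0).UTC()
    		fmt.Printf("auto-csr-approver-%d -> %s\n", suffix, t.Format(time.RFC3339))
    	}
    	// auto-csr-approver-29551112 -> 2026-03-09T14:32:00Z
    	// auto-csr-approver-29551114 -> 2026-03-09T14:34:00Z
    }

The trailing -n8x2g in the pod name is the usual random pod-template suffix the Job controller appends on top of that.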
Mar 09 14:34:00 crc kubenswrapper[4723]: I0309 14:34:00.145577 4723 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jrp4x"
Mar 09 14:34:00 crc kubenswrapper[4723]: I0309 14:34:00.145790 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:34:00 crc kubenswrapper[4723]: I0309 14:34:00.146251 4723 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:34:00 crc kubenswrapper[4723]: I0309 14:34:00.151297 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551114-n8x2g"]
Mar 09 14:34:00 crc kubenswrapper[4723]: I0309 14:34:00.316406 4723 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7c85\" (UniqueName: \"kubernetes.io/projected/aab8aee1-f666-44e2-ac7b-cdebdacc3061-kube-api-access-n7c85\") pod \"auto-csr-approver-29551114-n8x2g\" (UID: \"aab8aee1-f666-44e2-ac7b-cdebdacc3061\") " pod="openshift-infra/auto-csr-approver-29551114-n8x2g"
Mar 09 14:34:00 crc kubenswrapper[4723]: I0309 14:34:00.419098 4723 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7c85\" (UniqueName: \"kubernetes.io/projected/aab8aee1-f666-44e2-ac7b-cdebdacc3061-kube-api-access-n7c85\") pod \"auto-csr-approver-29551114-n8x2g\" (UID: \"aab8aee1-f666-44e2-ac7b-cdebdacc3061\") " pod="openshift-infra/auto-csr-approver-29551114-n8x2g"
Mar 09 14:34:00 crc kubenswrapper[4723]: I0309 14:34:00.448927 4723 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7c85\" (UniqueName: \"kubernetes.io/projected/aab8aee1-f666-44e2-ac7b-cdebdacc3061-kube-api-access-n7c85\") pod \"auto-csr-approver-29551114-n8x2g\" (UID: \"aab8aee1-f666-44e2-ac7b-cdebdacc3061\") " pod="openshift-infra/auto-csr-approver-29551114-n8x2g"
Mar 09 14:34:00 crc kubenswrapper[4723]: I0309 14:34:00.466182 4723 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-n8x2g"
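VerifyControllerAttachedVolume, MountVolume started, and MountVolume.SetUp succeeded are the mount half of the same reconciler seen tearing volumes down at 14:32. For a kube-api-access-* projected volume, the payload (token, CA bundle, namespace) is surfaced inside the pod at the standard service-account path; a minimal sketch of a workload reading it, assuming it runs in a pod with the default projection (the path is the Kubernetes convention, not something recorded in this log):

    package main

    import (
    	"fmt"
    	"os"
    	"path/filepath"
    )

    func main() {
    	// Conventional in-pod mount point of a kube-api-access-*
    	// projected volume.
    	const dir = "/var/run/secrets/kubernetes.io/serviceaccount"
    	for _, name := range []string{"token", "ca.crt", "namespace"} {
    		b, err := os.ReadFile(filepath.Join(dir, name))
    		if err != nil {
    			fmt.Fprintf(os.Stderr, "%s: %v\n", name, err)
    			continue
    		}
    		fmt.Printf("%s: %d bytes\n", name, len(b))
    	}
    }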
Mar 09 14:34:00 crc kubenswrapper[4723]: I0309 14:34:00.976547 4723 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551114-n8x2g"]
Mar 09 14:34:00 crc kubenswrapper[4723]: I0309 14:34:00.981188 4723 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 14:34:01 crc kubenswrapper[4723]: I0309 14:34:01.157457 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551114-n8x2g" event={"ID":"aab8aee1-f666-44e2-ac7b-cdebdacc3061","Type":"ContainerStarted","Data":"0616a3025382e683d68707a8077cca96755c79f2cfb6733e83a8694d28a03c16"}
Mar 09 14:34:03 crc kubenswrapper[4723]: I0309 14:34:03.200213 4723 generic.go:334] "Generic (PLEG): container finished" podID="aab8aee1-f666-44e2-ac7b-cdebdacc3061" containerID="105d6419d6919fba876675cba7aa5875620930dea9fb339547435387bca6dae5" exitCode=0
Mar 09 14:34:03 crc kubenswrapper[4723]: I0309 14:34:03.200342 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551114-n8x2g" event={"ID":"aab8aee1-f666-44e2-ac7b-cdebdacc3061","Type":"ContainerDied","Data":"105d6419d6919fba876675cba7aa5875620930dea9fb339547435387bca6dae5"}
Mar 09 14:34:04 crc kubenswrapper[4723]: I0309 14:34:04.842944 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-n8x2g"
Mar 09 14:34:04 crc kubenswrapper[4723]: I0309 14:34:04.934855 4723 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7c85\" (UniqueName: \"kubernetes.io/projected/aab8aee1-f666-44e2-ac7b-cdebdacc3061-kube-api-access-n7c85\") pod \"aab8aee1-f666-44e2-ac7b-cdebdacc3061\" (UID: \"aab8aee1-f666-44e2-ac7b-cdebdacc3061\") "
Mar 09 14:34:04 crc kubenswrapper[4723]: I0309 14:34:04.942566 4723 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab8aee1-f666-44e2-ac7b-cdebdacc3061-kube-api-access-n7c85" (OuterVolumeSpecName: "kube-api-access-n7c85") pod "aab8aee1-f666-44e2-ac7b-cdebdacc3061" (UID: "aab8aee1-f666-44e2-ac7b-cdebdacc3061"). InnerVolumeSpecName "kube-api-access-n7c85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:34:05 crc kubenswrapper[4723]: I0309 14:34:05.038002 4723 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7c85\" (UniqueName: \"kubernetes.io/projected/aab8aee1-f666-44e2-ac7b-cdebdacc3061-kube-api-access-n7c85\") on node \"crc\" DevicePath \"\""
Mar 09 14:34:05 crc kubenswrapper[4723]: I0309 14:34:05.222141 4723 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551114-n8x2g" event={"ID":"aab8aee1-f666-44e2-ac7b-cdebdacc3061","Type":"ContainerDied","Data":"0616a3025382e683d68707a8077cca96755c79f2cfb6733e83a8694d28a03c16"}
Mar 09 14:34:05 crc kubenswrapper[4723]: I0309 14:34:05.222438 4723 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0616a3025382e683d68707a8077cca96755c79f2cfb6733e83a8694d28a03c16"
Mar 09 14:34:05 crc kubenswrapper[4723]: I0309 14:34:05.222192 4723 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-n8x2g"
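The 14:34:00 through 14:34:05 entries are one complete Job pod lifecycle: sandbox 0616a30... starts, the "oc" container 105d64... finishes with exitCode=0, PLEG reports ContainerDied for both IDs, and the token volume is unmounted. A sketch for pulling a single pod's trail out of a journal like this one, assuming the kubelet runs as the systemd unit named "kubelet" (the pod name comes from the log; adjust as needed):

    package main

    import (
    	"bufio"
    	"fmt"
    	"os"
    	"os/exec"
    	"strings"
    )

    func main() {
    	// Grep the kubelet journal for every entry tagged with one pod.
    	const pod = `pod="openshift-infra/auto-csr-approver-29551114-n8x2g"`
    	cmd := exec.Command("journalctl", "-u", "kubelet", "--no-pager")
    	out, err := cmd.StdoutPipe()
    	if err != nil {
    		fmt.Fprintln(os.Stderr, err)
    		os.Exit(1)
    	}
    	if err := cmd.Start(); err != nil {
    		fmt.Fprintln(os.Stderr, err)
    		os.Exit(1)
    	}
    	sc := bufio.NewScanner(out)
    	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
    	for sc.Scan() {
    		if strings.Contains(sc.Text(), pod) {
    			fmt.Println(sc.Text())
    		}
    	}
    	_ = cmd.Wait()
    }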
Mar 09 14:34:05 crc kubenswrapper[4723]: I0309 14:34:05.916321 4723 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-r8k94"]
Mar 09 14:34:05 crc kubenswrapper[4723]: I0309 14:34:05.929554 4723 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-r8k94"]
Mar 09 14:34:06 crc kubenswrapper[4723]: I0309 14:34:06.896328 4723 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9736ceca-3335-4d7f-bbd8-279d52703c44" path="/var/lib/kubelet/pods/9736ceca-3335-4d7f-bbd8-279d52703c44/volumes"
Mar 09 14:34:30 crc kubenswrapper[4723]: I0309 14:34:30.745504 4723 scope.go:117] "RemoveContainer" containerID="afed7e6a473b8a359291b629cab83283f34f4a05f5093a3b8dd030edb43458bb"
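The DELETE/REMOVE of auto-csr-approver-29551108 right after ...29551114 completes, like the deletion of ...29551106 at 14:32, is consistent with the CronJob controller pruning finished Jobs to its successful-history limit; the Kubernetes default of 3 fits the pattern here, since each new success evicts the run from three schedules earlier. The final RemoveContainer entry is kubelet garbage collection reclaiming an exited container. A minimal sketch of the keep-newest-N rule, with keep=3 inferred from the eviction pattern rather than read from the CronJob spec, and the ...29551110 run assumed to have succeeded in between:

    package main

    import "fmt"

    // pruneSuccessful returns the job suffixes to delete when only the
    // newest `keep` successful jobs are retained; suffixes must be
    // sorted ascending.
    func pruneSuccessful(suffixes []int64, keep int) []int64 {
    	if len(suffixes) <= keep {
    		return nil
    	}
    	return suffixes[:len(suffixes)-keep]
    }

    func main() {
    	// Successful runs once 29551114 finishes (29551110 assumed).
    	done := []int64{29551108, 29551110, 29551112, 29551114}
    	fmt.Println("delete:", pruneSuccessful(done, 3)) // delete: [29551108]
    }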